Top 12 technologies that defined Wolk's 2025 (including a piece of wood)

Every year we ask our team what they're genuinely excited about, and this is what came back for 2025. These technologies made us faster, happier, and physically stronger. Odds are, at least one of them belongs in your toolkit for 2026 too.

1. Best AI pair programmer: Claude Code

Anthropic has been cooking this year. Since they announced Claude Code last spring, we've been integrating it more and more into our workflows. A quick prototype for a customer demo? Done in 15 minutes. Refactoring 10 SQL models of 100+ lines each into a DRY analytics layer for the data warehouse? Done in an hour. The hard numbers on our coding productivity gains are still out, but the consensus is that Claude Code makes us at least twice as fast on small to medium coding tasks.

Claude Code became even harder to resist when Anthropic dropped Sonnet 4.5 and Opus 4.5. With Opus especially, we could one-shot a minimal prototype of a Python FastAPI backend or a React front-end about 80 percent of the time, without any edits.

Sure, Claude Code and Opus don't replace us as developers, but they speed up our development workflows so much that we're more productive than ever. We can take greenfield projects to production within weeks, and we're far more effective when building on existing complex codebases. We don't know what recipe Anthropic has been cooking, but it sure tastes awesome.

Claude Code in action

2. Best standard for AI integrations: Model Context Protocol

Remember when every AI integration meant building a custom connector? MCP killed that.

Anthropic launched the Model Context Protocol in late 2024, and 2025 was the year it became unavoidable. It's essentially USB-C for AI—a universal standard for connecting models to external tools and data. Build once, connect everywhere.

The adoption has been wild: OpenAI, Google, and Microsoft all jumped on board. Thousands of community-built MCP servers now exist. And just this month, Anthropic donated the whole thing to the Linux Foundation, with backing from basically everyone who matters in AI. When competitors agree on a standard, you know it's important.

For anyone building AI agents that need to talk to the real world, MCP is no longer optional. At Wolk, we have been using the protocol since day 1.
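Under the hood, MCP speaks JSON-RPC 2.0: a client discovers a server's tools with `tools/list` and invokes one with `tools/call`. A minimal stdlib-only sketch of those message shapes (the method names come from the MCP spec; the `query_warehouse` tool and its arguments are made up for illustration):

```python
import json

def jsonrpc_request(request_id, method, params=None):
    """Build a JSON-RPC 2.0 request, the wire format MCP clients use."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Discover the tools an MCP server exposes.
list_tools = jsonrpc_request(1, "tools/list")

# Invoke one of them ("query_warehouse" is a hypothetical tool).
call_tool = jsonrpc_request(2, "tools/call", {
    "name": "query_warehouse",
    "arguments": {"sql": "SELECT count(*) FROM orders"},
})

print(json.dumps(call_tool, indent=2))
```

Because every server speaks this same shape, a client written once can talk to any of the thousands of community servers.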

MCP contributions by Wolk

3. Best framework for AI agents: PydanticAI

When building AI agents, PydanticAI has been essential for us. It gives you type safety, structure, and validation, which matter especially during development. With Logfire we can inspect the logs and outputs of the LLMs and see every call and reasoning chain. It's visual and easy to understand, and PydanticAI also lets you evaluate the prompts and tool calls of your agents, then visualize the results in the Logfire interface.

Additionally, we recently started using PydanticAI Gateway, which is extremely useful when working with multiple AI models. Instead of managing API keys for every model in your project, Gateway gives you a single key that handles everything. That helps a lot when switching from one model to another, or when using several models for different types of agents. Esther wrote about how she used it in her projects earlier this year.
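The core idea behind structured agent output: declare a schema up front and validate the model's reply against it, instead of trusting free-form text. A stdlib-only sketch of that pattern (this illustrates the concept, not PydanticAI's actual API; the `Invoice` fields are invented for the example):

```python
import json
from dataclasses import dataclass, fields

@dataclass
class Invoice:
    customer: str
    amount_eur: float

def validate_llm_output(raw: str, schema):
    """Parse an LLM's JSON reply and check it against a dataclass schema,
    the way an agent framework validates structured output."""
    data = json.loads(raw)
    expected = {f.name for f in fields(schema)}
    if set(data) != expected:
        raise ValueError(f"fields {set(data)} do not match schema {expected}")
    return schema(**data)

# A well-formed reply passes validation...
ok = validate_llm_output('{"customer": "ACME", "amount_eur": 99.5}', Invoice)

# ...a malformed one is caught before it reaches your business logic.
try:
    validate_llm_output('{"customer": "ACME"}', Invoice)
except ValueError as err:
    print("rejected:", err)
```

Frameworks like PydanticAI go further (retries, coercion, nested models), but this is the safety net that makes agent output dependable.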

An agent architecture with PydanticAI, by Esther

4. Best self-hosted AI: Mistral 3

Self-hosting AI models used to be a "sure, if you have a spare GPU cluster lying around" kind of thing. Not anymore.

Mistral AI released Mistral 3: a family of multimodal models, fully Apache 2.0 licensed. That means free to use, modify, and deploy commercially—no strings attached. The 3B model runs entirely in your browser via WebGPU. No API calls. No cloud. No data leaving your machine.

For organisations that care about data sovereignty but don't want to spend a lot of money on GPU infrastructure, this is huge. Pair it with tools like Ollama or vLLM and you can self-host on your laptop or European cloud infra without breaking a sweat.

Local inference just got very accessible. And very cool. Stijn tried it out.
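With Ollama in front of a local model, calling it is one HTTP request to its default endpoint, and nothing leaves your machine. A stdlib-only sketch, assuming `ollama serve` is running and a Mistral model has been pulled (the `"mistral"` tag is a placeholder for whichever model you pulled):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model, prompt):
    """Payload for Ollama's /api/generate endpoint; stream=False returns one JSON reply."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt):
    """Send a prompt to a locally hosted model and return its text response."""
    body = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# With the server running, you'd call e.g.:
#   print(generate("mistral", "Explain data sovereignty in one sentence."))
```

Swap the URL for a European cloud box running the same stack and the code doesn't change.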

5. Best data stack for small data: DuckDB

Your data stack is probably over-engineered. Kafka, Spark, Databricks, Snowflake, you name it. And how much data do you actually have? 50 gigabytes?

An analysis of Amazon Redshift workloads found that 99.5% of queries could run on a laptop. DuckDB makes this practical: no server, no configuration, just pip install duckdb and query your Parquet files directly from S3.

Nathan wrote a whole piece on the small data movement. Read it before your next architecture decision.

6. Best hardware for large-scale AI: TPUs

Efficient deep learning workloads demand specialized hardware, a domain historically led by NVIDIA's GPUs (Graphics Processing Units). Google's TPUs (Tensor Processing Units) are rapidly gaining traction, however: the latest Ironwood architecture, introduced in November and powering Gemini 3, poses a serious challenge to that dominance.

The key difference lies in design intent. GPUs are general-purpose: superior flexibility, ecosystem maturity (CUDA plus broad cloud and framework support), and versatility for research, dynamic models, and low-latency inference. TPUs, on the other hand, are specialized ASICs (Application-Specific Integrated Circuits) built around a systolic array that maximizes throughput and power efficiency for the matrix and tensor multiplications at the core of modern deep learning. That makes them exceptionally cost-effective and scalable for training the largest models and for high-volume, production-scale inference.

With more companies following Google's lead and adopting TPUs, Caner expects real competition, and perhaps a shift in dominance, in deep learning hardware in the coming years.

7. Best Python package manager: uv

If you've ever lost an afternoon to Python dependency conflicts, uv is your new best friend. This Rust-powered package manager from Astral is blazingly fast—we're talking 10-100x faster than pip—and finally brings sanity to Python's notoriously messy package management.

Nathan wrote about this one in detail earlier this year, but it bears repeating: uv has fundamentally changed how we set up Python projects.
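For a feel of the workflow, a typical project setup looks something like this (`demo` and `requests` are just illustrative names; the commands are standard uv CLI):

```shell
# Start a new project: creates pyproject.toml and a managed virtualenv
uv init demo && cd demo

# Add a dependency (resolved and locked in milliseconds, not minutes)
uv add requests

# Run inside the project's environment, no manual activation needed
uv run python -c "import requests; print(requests.__version__)"
```

One tool replaces pip, virtualenv, and half of your Makefile.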

8. Best JavaScript runtime: Bun

Bun is the JavaScript runtime we didn't know we needed. It's Node.js and npm, but with batteries included. No more configuring build tools, just write TypeScript and hit go. A good tool is one that gets out of your way and lets you focus on what's important. Bun does just that. It just works, and it works fast!

9. Best escape from Big Tech: Linux desktop

Was 2025 finally the year of the Linux Desktop? It came dangerously close.

Windows 10 support ended in October. Windows 11 now shows ads in the Start Menu, requires a Microsoft account, and shoves AI chatbots everywhere. Apple? Also pushing services and ads in your System Settings. Big Tech wants your attention and wallet—inside your own operating system.

Enter Valve. The Steam Deck's success proved Linux gaming is real, with Proton making most games just work. And Omarchy (by DHH of Rails fame) made Linux plug-and-play for productivity. The technical excuses are gone.

Maybe 2026 is finally the year. At Wolk, we're already there: Cas is using Arch, btw.

10. Best meeting recording app: Granola

Recording apps are popping up everywhere. We tried out a few. Notion has a fine one, but Granola is currently our favourite. It proved great at wrapping up team meetings into clear to-dos that can sometimes be executed immediately by an AI.

11. Best no-code database for teams: Airtable

With Airtable, even our non-technical team members can easily customize the features and automations our team needs. Their motto: everything is a table.

Finding the right tool to wrangle our capacity and plan projects beyond next Tuesday? Let's just say it's been a journey. After many trials, Anna eventually landed on Airtable.

12. Best low-tech for the office: Beastmaker 1000 series

Yes, a wooden hangboard made our top 12 tech list. This thing hangs in our office, and we use it daily. Stuck on a bug? Hang from the Beastmaker. Waiting for a deployment? Hang from the Beastmaker. Need to think through an architecture decision? You guessed it.

It turns out that grip strength is an important biomarker of health, and is highly correlated with longevity. Whether there is a causal relationship remains to be seen. In the meantime, we don't take any chances.

Jelle is showing off again.


Stay up to date!

Subscribe to our newsletter, de Wolkskrant, to get the latest tools, trends and tips from the industry.
