AI & Strategy · March 16, 2026 · 7 min read

Everyone Can Build Now. But Not Everyone Has Something to Say.

Why the real value of AI tools lies not in building, but in the data you hold


Over the past few months I have built more tools than in the previous twenty years combined. Without a team. Without a development budget. Just me, a laptop, and Claude Code and Cursor.

Vibe coding. It works. I build scanners, dashboards, monitoring systems and assessment tools, and this from someone who had never written a single line of code that went into production. It feels like a superpower.

But there is a catch. And that catch is barely mentioned by anyone.

The promise sounds simple

The pitch for vibe coding is seductive. You have an idea. You describe what you want. The AI writes the code. You deploy to Vercel or another hosting platform. Done. Tool live. Next idea.

For a certain type of tool that story holds up. Want a neat calculation tool? A quiz? A converter? You can be done within an hour. The AI generates clean React components, writes presentable CSS, and delivers something that a year ago would have required a team of three developers.

But the moment your tool needs to do something useful with current information, the story changes.

The moment it gets expensive

I noticed it when I built a stock screener. The idea was simple: display shares that meet certain valuation criteria. The problem: that price data has to come from somewhere.

So you go to a financial data API. And then you see the prices. One hundred euros a month. Two hundred. Five hundred. For the really good data with historical fundamentals you are looking at a thousand euros a month. And that is only the data. You still have to process, store and serve it.

The same applies to weather data, real-time traffic information, medical databases, legal sources. Everything that must be current and reliable costs money. Often a great deal of money.

And that is the point where most vibe coders stop. They have a brilliant idea. They can build it. But they cannot feed it.

The second wall: AI also costs money

Then there is a second cost that is often overlooked. Many tools use an LLM under the bonnet. A chatbot, an analysis function, a summarisation tool. All very well. All running on an API from OpenAI, Anthropic or Google.

And those APIs are not free.

Every time a user asks a question, you pay. Every analysis costs tokens. Every summary costs money. With ten users a day you hardly notice. With a thousand users a day you do.

I experienced it myself. I built a tool that assesses damage claim forms via the Claude API. It works beautifully. But every assessment costs tokens. And when that runs to thousands of assessments per month, it adds up.
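That arithmetic is worth doing before you ship anything. A minimal sketch of the back-of-the-envelope calculation, where the per-token prices and token counts are illustrative assumptions, not current vendor rates:

```python
# Rough monthly cost estimate for an LLM-backed feature.
# All prices below are illustrative assumptions, not vendor rates.

def monthly_llm_cost(
    requests_per_day: int,
    input_tokens: int,       # avg tokens sent per request (prompt + document)
    output_tokens: int,      # avg tokens generated per request
    price_in_per_m: float,   # assumed price per 1M input tokens, in euros
    price_out_per_m: float,  # assumed price per 1M output tokens, in euros
    days: int = 30,
) -> float:
    per_request = (input_tokens * price_in_per_m
                   + output_tokens * price_out_per_m) / 1_000_000
    return per_request * requests_per_day * days

# Example: a claim-assessment call sending ~4,000 tokens of form text
# and getting ~500 tokens of analysis back, at assumed rates of
# 3 EUR / 15 EUR per million input/output tokens, 100 calls a day.
cost = monthly_llm_cost(100, 4_000, 500, 3.0, 15.0)
print(f"{cost:.2f} EUR/month")  # → 58.50 EUR/month
```

Modest numbers already land in real money, and the cost scales linearly with usage: ten times the users means ten times the bill.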

AI makes building cheap. But running it is another matter.

The third challenge: LLMs do not know everything

There is something else. The large language models are clever, but they are not current. They have a knowledge cutoff. And you notice that cutoff at precisely the moment you are building a tool that needs to provide reliable, up-to-date information.

I work in the insurance world. Try asking an LLM what the current policy terms are for a specific insurer. Or what the premium is for a van insurance policy with a particular claims history. The model does its best. It gives you an answer that looks credible. But it is not reliable.

And that is dangerous. The output looks convincing. You have to already be an expert to see that the answer is wrong.

The real moat: your own data

And this is where it gets interesting. Because there is a category of builders for whom this story looks very different.

These are the people and companies that have their own data.

If you have twenty years of customer data. If you have thousands of policy terms in a database. If you have built up historical claims data, conversion rates, market intelligence or specialist knowledge. Then the entire equation changes.

Because then you do not need to subscribe to an expensive API. You are the source.

And here is the beautiful part: with a vector database you can make that proprietary data searchable and usable for AI. You build your own knowledge base that an LLM can consult. Not the general knowledge of the internet. But your specific, current, reliable data.

That takes time to build. True. You need to clean your data, structure it, create embeddings, set up a retrieval system. That is not something you do in an afternoon.
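The retrieval half of that pipeline is less mysterious than it sounds. A toy sketch: the "embedding" here is a crude hashed bag-of-words stand-in for a real embedding model, and the store is an in-memory list standing in for a real vector database; in production you would swap both out:

```python
import hashlib
import math
from collections import Counter

DIMS = 256  # embedding dimensionality for this toy sketch

def embed(text: str) -> list[float]:
    """Toy 'embedding': a hashed bag-of-words vector. A real system
    would call an embedding model here; this is only a stand-in."""
    vec = [0.0] * DIMS
    for word, count in Counter(text.lower().split()).items():
        word = word.strip(".,:;!?")
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % DIMS
        vec[bucket] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

class TinyVectorStore:
    """In-memory stand-in for a real vector database (pgvector, Qdrant, ...)."""
    def __init__(self) -> None:
        self._items: list[tuple[str, list[float]]] = []

    def add(self, doc: str) -> None:
        self._items.append((doc, embed(doc)))

    def search(self, query: str, k: int = 2) -> list[str]:
        q = embed(query)
        ranked = sorted(self._items, key=lambda it: cosine(q, it[1]),
                        reverse=True)
        return [doc for doc, _ in ranked[:k]]

store = TinyVectorStore()
store.add("Van insurance: third-party cover excludes goods in transit.")
store.add("Home policy: storm damage is covered above a certain wind speed.")
store.add("Claims history discount resets after two claims in one year.")

# In a RAG setup, the top matches are pasted into the LLM prompt as context.
print(store.search("what does van insurance cover", k=1))
```

The structure is the whole idea: embed your documents once, embed each question at query time, return the nearest neighbours, and hand those to the LLM as context. Everything else is engineering around that loop.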

But once you have it, you have something that no one else can replicate by doing a spot of vibe coding.

The new dividing line

I see a new dividing line emerging in the world of AI tools.

On one side you have tools that anyone can build. Beautiful interfaces, smart UX, powered by public APIs and generic LLM knowledge. Those tools become a commodity. There will soon be thousands of them. They resemble each other. They cost the same. They compete on design and marketing.

On the other side you have tools built on unique data. On domain knowledge that cannot be googled. On years of accumulated expertise that is now, for the first time, being made scalable by AI.

That second category is where the value lies.

The paradox of vibe coding

And so a paradox emerges.

Vibe coding democratises building. Everyone can now make a tool. The technical barrier is gone. That is fantastic, and I am genuinely enthusiastic about it.

But precisely because everyone can build, value shifts. From technical skill to substantive knowledge. From being able to code to having something worth coding about.

The question is no longer: can you build it?

The question is: do you have data that is worth building for?

What does this mean for you?

If you run a business, look at your data differently. That customer database nobody bothers with. Those Excel sheets with ten years of market intelligence. That internal knowledge base gathering dust.

That is not deferred maintenance. That is your future advantage.

Start structuring it. Look at vector databases. Experiment with retrieval augmented generation. You do not need to be an AI expert. With today's tools you can build this step by step.
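One concrete first step on that path is chunking: splitting each document into overlapping pieces small enough to embed and retrieve. A minimal sketch, where the chunk size and overlap are assumptions you would tune for your own documents and embedding model:

```python
def chunk_text(text: str, size: int = 200, overlap: int = 40) -> list[str]:
    """Split text into overlapping word-based chunks.
    size and overlap are in words; both are tuning assumptions."""
    words = text.split()
    if len(words) <= size:
        return [" ".join(words)]
    chunks, step = [], size - overlap
    for start in range(0, len(words) - overlap, step):
        chunks.append(" ".join(words[start:start + size]))
    return chunks

# Example: a 500-word policy document becomes three overlapping chunks,
# so a clause near a chunk boundary still appears whole in one of them.
policy_text = " ".join(f"clause{i}" for i in range(500))
chunks = chunk_text(policy_text, size=200, overlap=40)
print(len(chunks), "chunks")  # → 3 chunks
```

The overlap is the point of the exercise: without it, a sentence cut in half at a boundary is invisible to retrieval. From here the pipeline continues as before: embed each chunk, store the vectors, retrieve at query time.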

And if you are a professional who has spent twenty years in a particular field? Then you have something no LLM has. Current, specific, domain-bound knowledge. Making that knowledge scalable with AI is perhaps the best investment you can make this year.

The tools to build are there. The AI to make it intelligent is there. The only thing that cannot be copied is you. And your data.

That is your moat.