19 March 2026 · Nick Finch

AI-First, and Last

Two real agent systems. Both built by AI, not just powered by it. Why the organisations that use AI to build will outpace those that only build AI.

AI · Agentic AI · AI Coding Agents · Claude Code · Data · MCP · OpenEdge

I gave a talk yesterday to an audience of application developers. People who build and maintain business-critical systems, most with decades of domain expertise embedded in their databases and business logic. The subject was getting started with AI agents, and we walked through two systems we have built at inmydata over the past few months. A green coffee buying expert system for TikTak Segafredo, and an OpenEdge DBA expert system for White Star Software.

They came expecting to hear about AI agents. They got that. But the thing that landed hardest was not what we built. It was how we built it.

The data is the magic

Both systems are impressive in what they do. The TikTak system gives coffee buyers access to around 23 data subjects spanning ERP procurement data, commodity futures, currency markets, CFTC speculator positioning, NASA weather data from 23 coffee-producing countries, ENSO climate forecasts, and Brazilian government harvest estimates. It synthesises all of that into actionable buying recommendations, either on demand through a chat interface or proactively through automated daily monitoring.

The WSS system captures the expertise of some of the most knowledgeable OpenEdge DBAs in the Progress ecosystem. We have built the hybrid RAG pipeline and a voice-enabled chat agent so the WSS team can test it, challenge it, and build confidence in the quality of its answers. The rollout is deliberately phased. Internal testing first. Then customer-facing, once the experts are satisfied it is reliable. And ultimately, when a sufficient level of trust has been established, allowing the agent to take actions on a customer’s behalf. Each step earns the next.

In both cases, the AI model is almost a commodity. You could swap Claude for GPT or whatever comes next and the systems would still work. What you cannot easily replicate is the data. Twenty years of procurement history in an ERP. Tacit knowledge from expert interviews. Weather patterns across coffee-growing regions. Speculator positioning data that signals market reversals. The model interprets. The data is the magic.

That message resonated. But it was the next part that people wanted to talk about afterwards.

The systems were built by AI too

Not just powered by AI. Built by it.

Commodities trading data from a public source that feeds the TikTak system is collated daily using the Claude API. Claude reads the data sources and builds structured data tables from them. The weather data pipeline pulls from NASA’s POWER API using code that was written by a coding agent. The Brazilian harvest estimates are extracted from a government spreadsheet by Claude. The ENSO climate forecast data is pulled from a PDF by Claude. Even the core procurement business logic, the “Volume to Fix” calculation that explodes finished product demand through the Bill of Materials and nets off stock and in-transit volumes, was written by AI.
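To make the shape of that calculation concrete, here is a hedged sketch of a BOM explosion with netting. Every product, material, and quantity here is illustrative, not TikTak's actual logic:

```python
# Illustrative "Volume to Fix" style calculation: explode finished-product
# demand through a bill of materials, then net off stock on hand and
# in-transit volumes. All names and numbers are hypothetical.

def volume_to_fix(demand, bom, stock, in_transit):
    """Return net purchase requirement per raw material (kg)."""
    gross = {}
    for product, qty in demand.items():
        for material, per_unit in bom.get(product, {}).items():
            gross[material] = gross.get(material, 0.0) + qty * per_unit
    # Net off what is already held or on the water; never go negative.
    return {
        m: max(0.0, g - stock.get(m, 0.0) - in_transit.get(m, 0.0))
        for m, g in gross.items()
    }

demand = {"espresso_blend": 10_000}  # units of finished product
bom = {"espresso_blend": {"brazil_arabica": 0.7, "vietnam_robusta": 0.3}}  # kg per unit
stock = {"brazil_arabica": 2_000}
in_transit = {"vietnam_robusta": 500}

print(volume_to_fix(demand, bom, stock, in_transit))
# {'brazil_arabica': 5000.0, 'vietnam_robusta': 2500.0}
```

The real logic handles multi-level BOMs and time-phased demand, but the explode-then-net structure is the core of it.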

The expert interview platform that captures tacit knowledge from domain experts? Built with coding agents. The hybrid RAG pipeline for the WSS system, combining vector search, lexical search, and SQL retrieval with deterministic chunking and RRF fusion? Built with coding agents. The voice agent interface? Built with coding agents.

When we need to ingest data from a messy government PDF, we do not write a custom parser. We point Claude at it. When we need business logic to calculate net purchasing requirements, AI writes the code. When we need a retrieval pipeline that combines three different search strategies, we plan it with an LLM and build it with a coding agent.
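The "point Claude at it" pattern looks roughly like this. A hedged sketch, assuming the anthropic Python SDK and an API key; the model name, prompt wording, and row schema are all illustrative, not our production code:

```python
# Sketch of LLM-based structured extraction: hand the model messy source
# text, ask for rows in a fixed JSON schema, validate what comes back.
# Prompt, model name, and field names are hypothetical.
import json

PROMPT = (
    "Extract every harvest estimate from the text below as a JSON array of "
    'objects with keys "region", "crop_year", and "estimate_tonnes". '
    "Return only JSON.\n\n{source_text}"
)

def extract_rows(source_text, model="claude-sonnet-4-5"):
    import anthropic  # requires the anthropic package and ANTHROPIC_API_KEY
    client = anthropic.Anthropic()
    reply = client.messages.create(
        model=model,
        max_tokens=2048,
        messages=[{"role": "user", "content": PROMPT.format(source_text=source_text)}],
    )
    return parse_rows(reply.content[0].text)

def parse_rows(reply_text):
    """Parse the model's reply and keep only rows with the expected keys."""
    rows = json.loads(reply_text)
    return [r for r in rows if {"region", "crop_year", "estimate_tonnes"} <= r.keys()]
```

The validation step matters: the model does the messy reading, but deterministic code decides what is allowed into the data tables.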

This is the AI-first mindset, and it is the thing I most wanted that audience to take away.

AI-first means using AI to learn, plan, and build

There is a common misconception that being AI-first means building an AI product. It does not. It means using AI at every stage of the work.

When we started the TikTak project, the first thing we did was sit down with Claude Opus and work through what a green coffee buying decision actually requires. What data sources exist. How commodity futures markets work. What speculator positioning data tells you about market sentiment. How ENSO cycles affect coffee yields in Brazil and Vietnam. We were not experts in coffee procurement. The LLM helped us become conversant quickly enough to have productive conversations with the actual experts, and to design a data architecture that made sense for the domain.

That learning phase is undervalued. People are very aware AI helps you write code faster. It does. But it also helps you understand domains faster, explore architectural options, pressure-test design decisions before you commit to them. The planning conversation we have with models before writing a single line of code is as valuable as the code itself. Every architectural choice surfaced and discussed. Every integration point identified. Every deployment consideration addressed upfront.

When we moved to the WSS project, the same pattern applied. We needed to understand how hybrid retrieval pipelines work, how to combine vector similarity with lexical BM25 scoring, how Reciprocal Rank Fusion merges results from different retrieval strategies. We learned with an LLM, planned with an LLM, and built with a coding agent. The result is a production system with deterministic chunking, LLM-refined document processing, Voyage AI embeddings, and a retrieval architecture that significantly outperforms naive RAG.
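Reciprocal Rank Fusion itself is small enough to show in full. A minimal sketch: each retriever contributes 1 / (k + rank) per document, and k=60 is the constant from the original RRF paper, not necessarily what the WSS pipeline uses:

```python
# Minimal Reciprocal Rank Fusion: merge ranked result lists from several
# retrievers into one fused ranking. Doc IDs here are illustrative.

def rrf_fuse(ranked_lists, k=60):
    """Fuse ranked lists of doc IDs; a doc found by several retrievers rises."""
    scores = {}
    for results in ranked_lists:
        for rank, doc_id in enumerate(results, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

vector_hits  = ["doc3", "doc1", "doc7"]   # vector similarity order
lexical_hits = ["doc1", "doc3", "doc9"]   # BM25 order
sql_hits     = ["doc1", "doc5"]           # structured retrieval order

print(rrf_fuse([vector_hits, lexical_hits, sql_hits]))  # doc1 wins: all three found it
```

The appeal is that it needs no score normalisation across retrievers, only ranks, which is exactly what makes it practical when one strategy returns cosine similarities and another returns BM25 scores.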

Could we have built these systems without AI tools? Technically, yes. In two years instead of four months. The speed advantage is not primarily about typing faster. It is about compressing the entire cycle. Learning, planning, building, testing, documenting. All of it happens faster because you have an intelligent collaborator at every stage.

What this means for your data

That audience was sitting on exactly the kind of deep, domain-specific business data that agents need. OpenEdge databases with decades of encoded business logic. Tax rules, approval chains, compliance requirements, production planning, inventory management. That is not a liability in the agentic era. It is critical infrastructure.

The opportunity is not to panic about AI replacing what you have built. It is to make what you have built accessible to agents. Expose the data through APIs. Build MCP servers. Position those systems as the authoritative source that agents call into. The agent handles the natural language, the synthesis, the presentation. The OpenEdge database provides the ground truth.
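Whether you expose it over a plain API or an MCP server, the shape of the work is the same: a named tool with a typed input schema that returns ground truth from the authoritative database. A minimal sketch using Python's standard library; the table, fields, and tool name are hypothetical, and a real MCP server would register the same schema with the MCP SDK:

```python
# Sketch of exposing existing business data as an agent-callable tool.
# The table, columns, and tool name are illustrative stand-ins for
# whatever your OpenEdge (or other) database already holds.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE credit_limits (customer_id TEXT PRIMARY KEY, limit_gbp REAL)")
db.execute("INSERT INTO credit_limits VALUES ('C001', 50000.0)")

# The schema an agent sees: name, description, typed inputs.
TOOL_SCHEMA = {
    "name": "get_credit_limit",
    "description": "Return the approved credit limit for a customer.",
    "input_schema": {
        "type": "object",
        "properties": {"customer_id": {"type": "string"}},
        "required": ["customer_id"],
    },
}

def get_credit_limit(customer_id: str):
    """Read-only lookup: the database stays the source of truth."""
    row = db.execute(
        "SELECT limit_gbp FROM credit_limits WHERE customer_id = ?", (customer_id,)
    ).fetchone()
    return {"customer_id": customer_id, "limit_gbp": row[0]} if row else None

print(get_credit_limit("C001"))
# {'customer_id': 'C001', 'limit_gbp': 50000.0}
```

Start read-only. The agent synthesises and presents; your database answers. Write access comes later, if at all, once trust is earned.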

But here is the part that matters most. You do not need to wait until you are ready to build the agent before you start preparing.

Three things you can do now, regardless of where you are in your AI journey.

Start by capturing what your experts know. Every organisation has institutional knowledge locked in people’s heads. That is both a risk and an opportunity. The risk is that when those people leave, the knowledge goes with them. The opportunity is that if you capture it now, you can scale it with agents later. We built a voice-based expert interview system specifically for this purpose. You do not need something that sophisticated. Even structured documentation is a start.

Audit your data. Where does it live? How would an agent access it? Is it clean, structured, documented? The organisations that move fastest with AI will be the ones that have their data house in order. This is work you can start immediately and it pays dividends regardless of which AI tools you eventually use.

Be AI-first in how you work today. Use AI to learn about the domains you are working in. Use it to plan your architectures. Use it to write your code, your documentation, your tests. The first project you build with AI teaches you more than six months of reading about AI ever will. And when you are ready to build something for your customers, you will already have the muscle memory.

The cost of waiting

I opened the talk with the two trillion dollars that evaporated from enterprise software market caps in thirty days at the start of 2026. Atlassian down 35%. Salesforce down 28%. The market pricing in what agentic AI does to per-seat subscription models.

That happened fast. The pace of change in AI is unlike anything our industry has seen. Six months from now, the tools will be better, the models will be cheaper, and the organisations that started today will be six months ahead.

You do not need to transform everything. You need to find one opportunity, scope it tightly, and start. The data you already have is enough to begin.

Want to discuss this?

We're always happy to talk about AI, data, and what it takes to ship real systems.

Get in touch