Confluent crowns the new AI kingmaker, real-time data in context

Data streaming platform company Confluent staged its annual practitioner conference for users and partners in New Orleans this week. It was perhaps logical to assume that Confluent Current 2025 would focus on more than just the application of data streaming and real-time information throughput and analysis; the central mission this year was always going to be how all strains of predictive, generative and agentic AI now need real-time data feeds in order to act upon data in context, as it happens… in the moment.

Confluent co-founder and CEO Jay Kreps hosted a welcoming keynote entitled ‘Building Intelligent Systems on Real-time Data’. On a mission to validate his suggestion that “data streaming is becoming ubiquitous” across all business verticals, Kreps wants users to think about adopting a shift-left approach to data management. In practical terms, that means left-shifted (i.e. early) actions to process and govern data at the source, then reuse it everywhere.
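To make the idea concrete, here is a minimal sketch of shift-left processing using the open source confluent-kafka Python client. It is illustrative only (the broker address, topic name and event fields are hypothetical), but it shows the principle Kreps describes: validate and enrich once, at the source, so every downstream consumer reuses clean data.

```python
import json
import time

from confluent_kafka import Producer  # pip install confluent-kafka

producer = Producer({"bootstrap.servers": "localhost:9092"})  # hypothetical broker

def validate_and_enrich(event: dict) -> dict:
    """Shift-left: schema checks and enrichment happen at the source,
    not in a downstream pipeline that every team rebuilds separately."""
    if "order_id" not in event or event.get("amount", 0) <= 0:
        raise ValueError(f"Rejected at source: {event}")
    event["processed_at"] = time.time()  # enrich once, reuse everywhere
    return event

raw_event = {"order_id": "A-1001", "amount": 42.5}  # hypothetical payload
clean_event = validate_and_enrich(raw_event)

# Publish the already-governed record; downstream consumers inherit its quality.
producer.produce("orders.clean", key=clean_event["order_id"],
                 value=json.dumps(clean_event))
producer.flush()
```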

With artificial intelligence now so pervasive and applicable to every industry vertical, it was perhaps no surprise to hear Kreps focus on where real-time data will drive automation intelligence next. Talking about streaming’s progression, Kreps says we need to “connect AI into the software estates” that we run and operate inside enterprise organisations today.

“If perhaps 10 years ago we were focused on getting data into a data lakehouse and worrying about reporting and analytics, today being truly data-driven as a business means being able to take action on behalf of customers and users. Ultimately, AI systems are a really different world… while it’s easy to ask ChatGPT (or other AI engines) some simple questions, it’s so much harder to build truly functional products and services out of these services. With fantastically complicated business logic to navigate, we need to remember that AI underpins probabilistic systems that might offer perhaps 90% accuracy,” said Kreps.

A new kingmaker for business

He draws this percentage-based distinction because, previously, enterprise software could be run at a 100% ‘pass rate’ in terms of execution. In other (simpler) words, traditional enterprise software runs deterministic logic that ensures 1 + 1 equals 2, every time. In those traditional software systems, business logic is king… in AI systems, the context of the data being used is king. Today, we can build systems based on non-deterministic logic that produce probabilistic AI results: 1 + 1 will usually equal 2, but perhaps 10% of the time it might come out as 1.9 or 2.2.
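A trivial sketch makes the distinction plain. The deterministic function below returns the same answer every time; the mock ‘model’ (purely illustrative, standing in for no real AI service) is right only around 90% of the time.

```python
import random

def deterministic_add(a: float, b: float) -> float:
    # Traditional business logic: 1 + 1 is 2, every single time.
    return a + b

def probabilistic_add(a: float, b: float) -> float:
    # Stand-in for a probabilistic system: roughly 90% accurate,
    # occasionally drifting to 1.9 or 2.2 instead of 2.
    exact = a + b
    return exact if random.random() < 0.9 else exact + random.choice([-0.1, 0.2])

print(deterministic_add(1, 1))                       # always 2
print([probabilistic_add(1, 1) for _ in range(10)])  # usually, not always, all 2s
```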

When AI is built, the model used comes from a relatively small number of companies (Anthropic, Google, OpenAI etc.), but the context data we use to feed those engines is spread across a wide number of places around a business and throughout its connection points (to machines, to APIs, to other applications and to other users). Working out where the value point with enterprise AI sits today can therefore be tough.

“A lot of the data that we need for modern business functions is hidden behind bad APIs and models cannot intuitively know how to harness the right information to get around problems. To build a new ‘derived’ dataset (via context engineering) means being able to combine data capture, data processing and queries down the complete data pipeline that a business operates with,” said Kreps.

So where does batch processing sit in this wild new frontier?

Actually, batch data is something that Confluent says it is capable of working with in a very agile manner. Batch data capture and batch data processing (into a data lakehouse or warehouse) might bring data through into a system in maybe four hours, but it often happens overnight. That data still has context value for AI, but it is historical rather than in-the-moment.

How to cross the road with AI

“If you were going to cross a busy, dangerous street and all the information you had was a photo of where the cars were yesterday, you’d naturally feel pretty nervous about stepping off the sidewalk,” said Kreps, providing an example of why we need streamed real-time information, in context, if we are going to make decisions in-the-moment, in critical environments.


So then, how do we unify batch and streaming? Confluent has been doing this with Tableflow, a technology that represents streams as open table formats in object storage. This means an organisation can process historical data and then move onwards to also work with streaming data. Apache Flink then makes streaming “a generalisation of batch” to complete the way the data supply chain works. There are still challenges out there (data model mismatch can throw systems off balance, for instance), but Confluent’s effort to connect and enable a new world of agentic AI services with real-time data is certainly clear.
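The ‘streaming as a generalisation of batch’ idea is easiest to see in Flink’s table API, where one query can run over a bounded (historical) source or an unbounded (live) one. The sketch below uses the open source PyFlink package with Flink’s built-in demo data generator; it is a loose illustration of the principle, not a Tableflow code sample.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment  # pip install apache-flink

# The same environment can instead be created in batch mode for bounded
# sources; the SQL beneath it would not change.
env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Hypothetical table backed by Flink's demo 'datagen' connector; point the
# connector at Kafka (streaming) or files (batch) and the query stays identical.
env.execute_sql("""
    CREATE TABLE orders (
        order_id STRING,
        amount   DOUBLE
    ) WITH (
        'connector' = 'datagen',
        'rows-per-second' = '5'
    )
""")

# One query, two execution modes: the generalisation in miniature.
env.execute_sql(
    "SELECT COUNT(*) AS orders_seen, SUM(amount) AS revenue FROM orders"
).print()
```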

Since its debut, Tableflow has transformed how organisations make streaming data analytics-ready, eliminating the brittle ETL jobs and manual lakehouse integrations that slow teams down. With Delta Lake and Unity Catalog integrations now generally available and Tableflow support for OneLake, Confluent is expanding its multicloud footprint. These updates deliver a unified solution that connects real-time and analytical data with enterprise governance, making it easier to build the real-time AI and analytics that propel businesses ahead of their competition.

“Customers want to do more with their real-time data, but the friction between streaming and analytics has always slowed them down,” said Shaun Clowes, Chief Product Officer at Confluent. “With Tableflow, we’re closing that gap and making it easy to connect Kafka directly to governed lakehouses. That means high-quality data ready for analytics and AI the moment it’s created.”

Streaming agents: enterprise-grade governance

Moving to detail the company’s work in this space, Kreps explained its new Streaming Agents advancements, which make it easier to build and scale event-driven AI agents. With the new Agent Definition, teams can create production-ready agents in a few lines of code. Built-in observability and debugging give teams the confidence to move quickly from projects to real-world use cases with replayability, testability and safe recovery. Confluent’s Real-Time Context Engine provides fresh context with enterprise-grade governance so organisations can bring trustworthy AI agents to market faster.

All of which matters because building agentic AI is hard. AI agents power the scalable generative AI adoption that drives business transformation, but many organisations face challenges with governance and data complexity. Connecting data is hard, failures are tough to troubleshoot and brittle monoliths are unscalable. For enterprises, the stakes are even higher. They need systems that can respond in real time, yet most AI today can’t act on critical events without human intervention.

Streaming Agents brings the stream processing strengths of Apache Flink – scale, low latency and fault tolerance – together with agent capabilities like large language models (LLMs), tools, memory and orchestration. Because Streaming Agents lives directly in event streams, it monitors the state of a business with the latest real-time data. This produces enterprise AI agents that can observe, decide and act in real time, without stitching together disparate systems. Streaming Agents brings data processing and AI together in one place so teams can finally launch AI agents that are always-on and ready to act instantly.
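Confluent has its own Streaming Agents APIs for this, but the underlying event-driven agent pattern is easy to sketch in plain Python. Below, a consumer loop watches a stream and hands each event to an LLM-backed decision step; the topic name, broker address and the call_llm helper are hypothetical stand-ins rather than Streaming Agents code.

```python
import json

from confluent_kafka import Consumer  # pip install confluent-kafka

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "payments-agent",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["payments.failed"])     # hypothetical topic

def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM call (Anthropic, OpenAI, etc.)."""
    return "retry_with_backup_processor"    # canned decision for the sketch

try:
    while True:
        msg = consumer.poll(1.0)            # the agent is always-on
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Observe, decide, act: the freshest context goes straight into the prompt.
        action = call_llm(f"A payment failed: {event}. Choose a recovery action.")
        print(f"Agent action for {event.get('payment_id')}: {action}")
finally:
    consumer.close()
```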

People (and AI) need prompting

“Today, most enterprise AI systems can’t respond automatically to important events happening in a business without someone prompting them first,” said Sean Falconer, head of AI at Confluent. “This leads to lost revenue, unhappy customers, or added risk when a payment fails or a network malfunctions. Streaming Agents brings real-time data and agent reasoning together so teams can quickly launch AI agents that observe and act in real time with the freshest, most accurate data.”

Coming full circle then, what about good old-fashioned machine learning? Kreps says it’s true that older ML models don’t have the breadth of modern agentic services. But, he notes, they’re really fast and cheap to run. So could there be a place for this technology today? Yes, he suggests: in anomaly detection, fraud detection and behaviour analysis, this stuff still works well. Because of this, Confluent has built anomaly detection and forecasting into its wider platform offering under the Confluent Intelligence product banner.
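Kreps’ point about speed and cost is easy to demonstrate. A rolling z-score detector (a deliberately simple stand-in, not the model Confluent ships in Confluent Intelligence) flags outliers in a stream using nothing but the Python standard library, with no GPU in sight.

```python
from collections import deque
from statistics import mean, stdev

class ZScoreDetector:
    """Classic, cheap anomaly detection: flag values far from the recent mean."""

    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def is_anomaly(self, x: float) -> bool:
        anomalous = False
        if len(self.values) >= 10:  # wait for a minimal baseline first
            mu, sigma = mean(self.values), stdev(self.values)
            anomalous = sigma > 0 and abs(x - mu) / sigma > self.threshold
        self.values.append(x)
        return anomalous

detector = ZScoreDetector()
stream = [10.2, 9.8, 10.1, 10.0] * 5 + [42.0]         # hypothetical metric stream
print([x for x in stream if detector.is_anomaly(x)])  # -> [42.0]
```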

Fireside chat: Anthropic applied AI lead

This event’s keynote moved to a fireside chat between Sean Falconer, senior director of product management at Confluent, and Rachel Lo, head of applied AI at Anthropic. Talking about where we are right now with AI, Lo said that we have moved through and beyond the chatbot era, which ran to the start of this year, when it was still all about quite linear use i.e. straight questions and straight answers.

“We’re now looking at where we can embed AI into workflows so that teams across an organisation can be more expansive as AI itself works across different lines of business in functional, productised ways,” said Lo. “Recognising that multi-layer agentic architectures are now our objective is not just about context engineering, it’s about organisations thinking about where data resources are really critical to their business and how agentic services can be built. No matter how intelligent a model an IT team might build, it is the data foundation that really forms the critical piece of this conversation.”

Lo says that her firm’s most sophisticated customers recognise the patterns and nuances that exist in data outside of structured data resources. Because so much unstructured data (very often real-time) has so much value to businesses when used in context, the smartest firms are realising how important it is to embrace a multi-layer agentic architecture as AI shifts to more real-world operational use cases.

Confluent: Real-Time Context Engine 

Looking at other parts of the ‘news roster’ at this year’s Confluent Current, we can see that the company announced its Real-Time Context Engine. This is a fully managed service using Model Context Protocol (MCP) and designed to deliver real-time, structured data and accurate, relevant context to any AI agent, copilot or LLM-powered application. Open standards like MCP have made it easier to connect enterprise data to AI agents and applications, but they only solve part of the problem. The data they expose often remains raw, fragmented and inconsistent, while data lake batch processing pipelines add layers of complexity and delay.
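As a flavour of what the MCP side of this looks like (the Real-Time Context Engine’s actual interface is Confluent’s own), the official MCP Python SDK lets a developer expose a context lookup as a tool that any MCP-capable agent can call. The lookup_customer_context function and its data are invented for this sketch.

```python
from mcp.server.fastmcp import FastMCP  # pip install mcp

mcp = FastMCP("realtime-context-demo")

# In a real deployment this would query a continuously updated view fed by
# stream processing; here it is a hard-coded stand-in.
FAKE_CONTEXT = {"cust-42": {"status": "gold", "open_tickets": 2}}

@mcp.tool()
def lookup_customer_context(customer_id: str) -> dict:
    """Return the freshest known context for a customer (hypothetical demo data)."""
    return FAKE_CONTEXT.get(customer_id, {"status": "unknown", "open_tickets": 0})

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for an MCP client to connect to
```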

The result, suggests Confluent, is AI that reacts to the past with no understanding of the present. What’s needed is an architecture that unifies both, continuously processing and serving accurate, trustworthy context so AI can make reliable decisions.

“AI is only as good as its context,” said Confluent’s Falconer. “Enterprises have the data, but it’s often stale, fragmented, or locked in formats that AI can’t use effectively. Real-Time Context Engine solves this by unifying data processing, reprocessing and serving, turning continuous data streams into live context for smarter, faster and more reliable AI decisions.”

Amorphous abilities for future AI 

Speakers throughout this conference agreed on one thing: the AI conversation is changing. We’re moving on from talking about prototypes, we’re moving on from talking about linear, deterministic use cases of AI… and we’re progressing to talk about where multi-layer agentic architectures can work with real-time data in context to support the use of unstructured data, often for far more non-deterministic queries and use cases.

Because organisations today have amorphous problems with many variables and many data streams, they need real-time data in context to underpin agentic workflows that deliver productive AI services that actually change businesses… and perhaps even change real people’s lives too.
