Data Streaming’s Importance in AI Applications


Confluent’s Current conference highlighted the critical role of data streaming in addressing the real-time data needs of AI.

Confluent’s annual Current conference, held last week in Austin, Texas, once again offered two days of sessions with real time as the main topic, and data streaming technologies such as Apache Kafka and Apache Flink were the theme throughout the event. The question looming over everything this year was how to handle the real-time data needs of AI.

Confluent CEO and Co-founder Jay Kreps kicked off the conference, putting the issue front and center. In a keynote address titled “Data Streaming in the Age of AI,” he and his guests talked about the next game-changing evolution of data infrastructure that effectively leverages AI and ubiquitous automation.

The session explored how companies are becoming software- and AI-driven and why a robust data streaming platform is essential for enabling AI applications and autonomous agents within organizations.

One key point he and others highlighted in the keynote is that, with the rise of cloud computing and the increasing adoption of software systems across the business, companies are transitioning from what was described as bureaucratic processes to being software-driven.

They further noted that as AI becomes increasingly powerful and pervasive, companies need reliable, real-time access to high-quality data to power AI-enabled applications and autonomous agents that can take action on behalf of the business.

To do that, companies need a data streaming platform based on technologies like Apache Kafka and Flink. Such a platform is critical to providing the real-time data access and processing capabilities required for today’s enterprise AI applications.

During the session, Kreps brought a number of industry-leading organizations to the stage to talk about these issues and how they are handling them. Speakers from Mercedes-Benz, Accenture, and Viacom talked about how they have built enterprise-wide data streaming platforms to support diverse AI use cases and drive business impact.

See also: Real-Time Data Streaming Delivers, and the Data Finally Shows It

A focus on the real-time data needs of AI

Kreps spent some time talking about the relationship between AI, data streaming, and real time.

“ChatGPT or similar systems are trained on static data from the Internet,” he said. “That makes them like a well-informed friend who reads a lot of Wikipedia.” He noted that such a friend is pretty useful, but “you can’t really drop them into the middle of the business and have them take on some important job and suddenly start carrying it out.” They do not have that capability. They don’t know anything about the business or its needs.

What’s needed for enterprise AI is a way for a company to use data about its customers, sales, procedures, and more to shape the operation of the business. For that to happen, the data has to be relevant to the specific application use case, and it has to be up to date and instantly available.

To illustrate the importance of empowering AI with real-time data, he gave an example that is common in businesses today. Many businesses use AI chatbots to assist customers. Relying on static information from the Internet can help with basic FAQ-type queries, but customers often need and want more. For example, bank customers asking a chatbot for their balance do not want to know what the average customer balance is or what their balance was months ago. They want to know their checking account balance right now.
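To make that pattern concrete, here is a minimal sketch, using the open-source confluent-kafka Python client, of how a chatbot backend might keep a continuously updated view of account balances from a Kafka topic and answer with the current value rather than stale or averaged data. The broker address, topic, and field names here are illustrative assumptions, not anything Confluent described.

```python
import json
from confluent_kafka import Consumer

# Sketch: maintain an in-memory view of the latest balance per account from a
# Kafka topic, and ground the chatbot's answer in that current value rather
# than in static training data.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumption: a locally reachable broker
    "group.id": "chatbot-balance-view",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["account-balances"])      # hypothetical (ideally compacted) topic

latest_balance = {}  # account_id -> most recent balance seen on the stream


def answer_balance_query(account_id: str) -> str:
    """Reply using the freshest balance the stream has delivered so far."""
    balance = latest_balance.get(account_id)
    if balance is None:
        return "I don't have a current balance for that account yet."
    return f"Your checking account balance right now is ${balance:,.2f}."


try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        latest_balance[event["account_id"]] = event["balance"]
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```

In a real deployment the lookup would more likely be a compacted topic materialized into a state store or a Flink table rather than an in-memory dictionary, but the principle Kreps described is the same: the chatbot answers from the most recent event, not from whatever the model memorized during training.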

He cautioned that opening up real-time data to AI is not a simple proposition. There needs to be access control, especially in regulated industries, and there needs to be an infrastructure in place that can work with real-time data at scale across the organization. That infrastructure must ensure that the data used by AI applications is continuously up to date.

Technology to support modern data streaming needs

To address these issues, Confluent introduced new capabilities in Confluent Cloud to make stream processing and data streaming applications more accessible and secure. Confluent’s new support for the Flink Table API brings Apache Flink to Java and Python developers; private networking for Flink provides enterprise-level protection for use cases with sensitive data; the Confluent Extension for Visual Studio Code accelerates the development of real-time use cases; and Client-Side Field Level Encryption encrypts sensitive fields for stronger security and privacy.
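As a rough illustration of what Table API support opens up for Python developers, here is a short sketch using the open-source PyFlink Table API: a continuous aggregation over a stream of transactions. This is a local example with a generated source table and hypothetical table and column names; Confluent Cloud’s managed Flink handles connectivity and table definitions through its own configuration.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment
from pyflink.table.expressions import col

# Local streaming TableEnvironment; a managed service would supply its own
# environment and connection settings.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Hypothetical source table of customer transactions. The 'datagen' connector
# produces synthetic rows; in production this would typically be a Kafka-backed table.
t_env.execute_sql("""
    CREATE TABLE transactions (
        account_id STRING,
        amount     DOUBLE,
        ts         TIMESTAMP(3)
    ) WITH ('connector' = 'datagen', 'rows-per-second' = '5')
""")

# Continuously updated running total per account, expressed with the Table API
# rather than raw SQL: the kind of logic Java and Python developers can now write.
running_totals = (
    t_env.from_path("transactions")
         .group_by(col("account_id"))
         .select(col("account_id"), col("amount").sum.alias("total_spent"))
)

running_totals.execute().print()
```

Because the query runs against an unbounded stream, the result is a changelog that keeps updating as new transactions arrive, which is the kind of always-current view the keynote argued AI applications need.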

Additionally, the company announced the Confluent OEM Program, a new program for managed service providers (MSPs), cloud service providers (CSPs), and independent software vendors (ISVs) that makes it easy to launch and enhance customer offerings with a complete data streaming platform for Apache Kafka and Apache Flink. The program alleviates the burden of self-managing open-source technologies while going beyond just Kafka and Flink, letting MSPs and CSPs deliver a complete data streaming platform through Confluent and unlock more customer projects across AI, real-time analytics, application modernization, and more.


About Les Yeamans

Les Yeamans is founder and Executive Editor of RTInsights and CDInsights. He is a business entrepreneur with over 25 years of experience developing and managing successful companies in the enterprise software, financial services, strategic consulting and Internet publishing markets. Before founding RTInsights, Les founded and led ebizQ.net, an Internet portal company specializing in the application of critical enterprise technologies including BPM, event-driven architectures, and event processing. When ebizQ.net was acquired by TechTarget, Les became Associate Publisher, managing a group of websites. Previously, Les had founded a new enterprise software business called ezBridge which provided fault-tolerant, guaranteed delivery transaction messaging on 10 different hardware platforms. This product was licensed to IBM as the initial code base for IBM MQSeries (renamed WebSphere MQ and later renamed IBM MQ) which was co-developed and co-marketed with IBM. Les was also co-founder of the Message Oriented Middleware Association (MOMA). Les has worked extensively as an analyst and consultant for end users and vendors in this growing market. Prior to ezBridge, Les raised venture capital for development and marketing of PowerBase, the industry-leading database software package for the IBM PC. He started his career consulting at Accenture, providing end-user IT solutions. Les has an MBA from the University of Michigan and a Bachelor's degree from the State University of New York at Binghamton. He is based in New Rochelle, NY.
