Sponsored by Volt Active Data

What Does Real-time Mean in Today’s World?

Today, decisions need to be made in milliseconds and need to be made as soon as an event has occurred. Increasingly, there is a need for real-time analytics of data at its source, as it is being generated.

Businesses today must be able to act on data as it is generated. They need real-time insights to make instantaneous decisions; otherwise, they may lose an opportunity or suffer the consequences of delayed action.

RTInsights recently sat down with Dheeraj Remella, Chief Product Officer at Volt Active Data, to talk about what real time actually means today, why real-time operations are hard to achieve, which technologies can help, and more.

Here is a lightly edited summary of our conversation.

RTInsights: Can you define what companies mean when they say they want to make real-time decisions?

Remella: This is an interesting question because there are multiple angles through which one can define real-time. It all comes down to the use cases. Are the decisions made as soon as possible from when a significant event has occurred (ingest time) or when a person asks the data layer a question (query time), and the answer helps the human decide? Are the decisions made by humans or processes or machines? The answers to these questions will change what latency we are talking about, given the situation. For example, a data lake collects data over minutes, hours, weeks, months, or even years. Now, a business analyst wants to make a real-time decision based on a query. This could mean that the analyst’s query needs to be answered within a few seconds.

On the other hand, more and more organizations are looking for real-time decisions to help them automate their processes to:

  • Provide a better or personalized customer experience.
  • Ensure fraud is prevented before it happens rather than detect it after it has already happened.
  • Eliminate human error in responding to events.

In these cases, the decisions need to be made in milliseconds and need to be made as soon as an event has occurred. Increasingly, these decisions are not based just on a static set of rules, but enterprises are looking to incorporate their machine learning insights into making them.

RTInsights: Why is real-time so hard to achieve?

Remella: Enterprises struggle to achieve real time primarily because they try to retrofit technologies that were chosen to address previous-generation applications and requirements. These technologies, while individually performant, each address only part of the real-time decision-making picture. So, they need to be composed together. However, the challenge with this approach is that the data architecture becomes very chatty.

There is a lot of back and forth between these layers, which adds latency and congests the network, constraining overall performance. Beyond this latency degradation, which runs counter to the real-time objective of the effort in the first place, there are unnecessary complications around business continuity planning and management, plus added resiliency cost, because each of these layers needs to be failure-proofed.

RTInsights: What technologies are needed to support real-time?

Remella: To build on the previous answer, we must look at the needs as capabilities instead of technologies. To determine the capabilities, let us look at what needs to happen for what we can consider real-time decision-making. The data needs to be ingested, stored, and aggregated for KPIs. Then, any deviation in any of the KPIs for a given process needs to be detected.

If this deviation is above the levels that would be considered normal, decisions need to be made on what the deviation means and what preventative or monetizing actions need to be taken. This decision then gets signaled to a downstream system acting on that signal. To perform these activities, typically, the following four capabilities need to come together:

  • Streaming/Client API-based ingestion
  • Fast storage layer, preferably an in-memory database
  • Stream processing/business rules
  • Streaming aggregation for before and after comparison of KPIs
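
To make these capabilities concrete, here is a minimal, hypothetical Java sketch of how ingestion, a fast in-memory store, a business rule, and a before-and-after KPI comparison might come together in a single event path. It is not tied to Volt or any particular product; the class name, the running average, and the deviation threshold are illustrative assumptions.

    // Hypothetical sketch of the four capabilities wired together:
    // ingest an event, consult stored context, compare a KPI against
    // its running aggregate, and signal a downstream system on deviation.
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class RealTimeDecisionSketch {

        // Fast storage layer stand-in: per-key running KPI averages.
        private final Map<String, Double> kpiAverages = new ConcurrentHashMap<>();

        // Streaming ingestion entry point: called once per incoming event.
        public void onEvent(String key, double kpiValue) {
            // Streaming aggregation: the "before" and "after" view of the KPI.
            double before = kpiAverages.getOrDefault(key, kpiValue);
            double after = 0.9 * before + 0.1 * kpiValue; // simple running average
            kpiAverages.put(key, after);

            // Stream processing / business rule: detect a deviation beyond
            // what is considered normal (here, 30% of the running average).
            if (Math.abs(kpiValue - before) > 0.3 * Math.max(1e-9, before)) {
                signalDownstream(key, kpiValue, before);
            }
        }

        // The decision is signaled to a downstream system that acts on it.
        private void signalDownstream(String key, double observed, double expected) {
            System.out.printf("Deviation on %s: observed=%.2f, expected=%.2f%n",
                    key, observed, expected);
        }
    }

In a production architecture this logic would live inside the data platform itself rather than in application code, so that the lookup, the rule, and the aggregation happen in one place instead of across several chatty layers.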

RTInsights: How does Volt Active Data help?

Remella: Volt Active Data combines all four of the capabilities mentioned above into a single unified real-time data platform built from the ground up for maximum performance in terms of speed, scale, and latency without compromising the consistency and accuracy of the data. The platform includes:

  • Real-time Streaming Ingestion: Essential for quickly moving data to the appropriate processing layer.
  • In-memory Data Stores: Enable fast access to contextual data correlating to the incoming event information. Volt’s database is fully configurable for maximum production resiliency, including durability, high availability, and cross-datacenter replication.
  • Stream Processing/Business Rules Optimized with AI and Machine Learning: Our Java Stored Procedures not only run codified Java logic but can also be used to evaluate ML models.
  • Streaming Aggregation (Materialized Views): This is necessary for getting the before and after versions of KPIs that can then, in an event-driven manner, be used for driving decisions.
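
As a hypothetical illustration of the stored procedure capability, a Volt Java stored procedure might look roughly like the sketch below. The table, the view, and the simple approval rule are invented for this example; a real deployment would work against its own schema and could evaluate an ML model score in place of the hard-coded rule.

    import org.voltdb.SQLStmt;
    import org.voltdb.VoltProcedure;
    import org.voltdb.VoltTable;

    // Hypothetical procedure: score an incoming transaction against the
    // account's recent spend (assumed to be maintained by a materialized view).
    public class ScoreTransaction extends VoltProcedure {

        public final SQLStmt getRecentSpend = new SQLStmt(
            "SELECT total_amount FROM account_spend_view WHERE account_id = ?;");

        public final SQLStmt recordDecision = new SQLStmt(
            "INSERT INTO decisions (account_id, txn_amount, approved) VALUES (?, ?, ?);");

        public long run(long accountId, double txnAmount) {
            // Look up contextual data correlating to the incoming event.
            voltQueueSQL(getRecentSpend, accountId);
            VoltTable[] results = voltExecuteSQL();

            double recentSpend = 0;
            if (results[0].advanceRow()) {
                recentSpend = results[0].getDouble(0);
            }

            // Business rule; an ML model could be evaluated here instead.
            long approved = (txnAmount < 10 * Math.max(recentSpend, 1.0)) ? 1 : 0;

            // Record the decision so a downstream system can act on it.
            voltQueueSQL(recordDecision, accountId, txnAmount, approved);
            voltExecuteSQL(true);
            return approved;
        }
    }

Because the contextual data and the decision logic execute together inside the platform, the lookup, the rule, and the write happen in a single round trip rather than across separate systems.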

RTInsights: Can you talk about use cases?

Remella: We have use cases in all three types of applicability, as discussed previously. For example, in hyper-personalized offer management, our customer increased their offer acceptance rate by 253%, while a credit card fraud management customer decreased fraudulent transaction completion by 83%.

The key to these success metrics was that they reduced their business process latency by over 75%, from over 200 milliseconds to between 10 and 30 milliseconds, by selecting Volt as their unified real-time data platform. In addition, they optimized their data center resources: the amount of hardware needed to deliver that performance, even while lowering latency, decreased by 90%. Volt is an engine that combines data from multiple sources to apply complex business logic and ML model inferencing to drive decisions and actions with the lowest business process latency. You can put a variety of skins on this foundational value wireframe. It varies from enterprise to enterprise and use case to use case.

Additional Resources

Real-time Data Processing at Scale for Mission-critical Applications (Blog Post)

Intelligent Manufacturing with Real-Time Decisions (Analyst Report)

How Volt Meets Mission-Critical Application Requirements: The Volt Active Data Architecture

About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications, including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He is also the author of three business technology books.
