Getting Real About Real-Time

Dale Kim | Jul 14, 2021

Many organizations today are looking to gain real-time responsiveness from their IT infrastructure, if not to gain a competitive advantage, then to avoid getting left behind. It’s not always obvious when your business is doing “well enough.” You might have consistently positive engagements with your customers, but are you doing as much business with them as you can? Are they purchasing your upsell and cross-sell products? Are they accepting your promotional offers? If not, perhaps other businesses are presenting a better customer experience.

Moving toward real-time responsiveness is a good strategy for capturing these missed business opportunities. The new Hazelcast Platform is the ideal technology for implementing that strategy. More on the platform soon…

Certainly, the term “real-time” has become very broad in scope, especially in the context of business data, so it’s worth a brief clarification. While many of us have thought of real-time as “acting instantaneously” or even “honoring a guarantee of action within a short, specified period of time,” real-time can sometimes be minutes or hours. It is often less about the amount of time, and more about accelerating your IT and business operations to be much faster than before. For example, if you run batch processes each night to analyze supply chain efficiency, it would likely be more advantageous to do the analysis continuously throughout the day. You can achieve much greater efficiency in your operations by adjusting to bottlenecks as you uncover them, versus at the end of the day or the next morning.

Continuous analysis leads to faster response times because the premise is to take action as soon as an event occurs. Each of those events can lead to an actionable insight, but of course, you need to interpret what those events mean, so you need to build the right context. For example, you can analyze customer interactions and tie them together with historical data to understand the situation, and then offer the right product at the right time. BNP Paribas used this strategy to increase offer conversion by 4x, simply by recognizing when a customer is ready for specific products based on their real-time actions. In their case, a customer might not have sufficient funds to withdraw from an ATM, so if the bank can immediately determine that the customer is credit-worthy, an offer for a loan at that moment will be relevant and more appealing. Contrast this with sending a loan offer two days later, when the customer might have already made other arrangements for funding, so the offer is no longer relevant.
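The enrichment pattern described above can be sketched in a few lines of plain Python. This is a conceptual illustration only, not Hazelcast's API or BNP Paribas's actual system; the event shape, customer IDs, and credit thresholds are all hypothetical:

```python
# Hypothetical historical context, e.g. loaded from a system of record.
CREDIT_PROFILES = {
    "cust-42": {"credit_score": 720, "existing_loans": 0},
    "cust-77": {"credit_score": 580, "existing_loans": 3},
}

def on_event(event):
    """React to a single real-time event, enriched with historical context."""
    if event["type"] != "atm_insufficient_funds":
        return None
    profile = CREDIT_PROFILES.get(event["customer_id"])
    # The event alone is ambiguous; the historical profile supplies the
    # context needed to decide whether a loan offer is relevant right now.
    if profile and profile["credit_score"] >= 650 and profile["existing_loans"] == 0:
        return "instant_loan_offer"
    return None
```

The key point is that the decision is made per event, at the moment the event arrives, rather than in a batch run hours later.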

In the past, real-time responsiveness often required a huge resource investment to keep up with ongoing changes in the business, and most approaches to joining real-time events with historical data at rest for context are expensive and too slow to meet SLAs. Newer technologies that are optimized for real-time capabilities make the journey much easier.

That’s why I’m excited about the launch of the Hazelcast Platform, which lets our customers build and deploy a wide range of real-time, intelligent applications in a much easier way. Its combination of in-memory storage and stream processing capabilities in a distributed, elastic architecture lets you add real-time advantages to your IT architecture, and ultimately to your business operations. Its lightweight packaging and familiar APIs are just a few of the features that make it easy to integrate with your existing systems. But Hazelcast is not merely about making things run faster; it’s about creating a competitive advantage and responding to all available business opportunities via real-time and high performance. As we like to say, we help you to “capture every value moment.”

It’s the perception of IT challenges that prevents many businesses from following through on their aspirations. But that perception is often closely associated with past experiences in which the addition of new capabilities required overhauling your systems. With Hazelcast, the goal is to supplement your existing IT architecture: to leverage what you already have that works well, and to add new capabilities that make it better and real-time enabled. We challenge the IT community to “think beyond the possible” and see what modern platforms like Hazelcast can help you accomplish.

What can you do with the Hazelcast Platform? One set of use cases entails streaming analytics in an event-driven architecture. You can act on data immediately to eliminate the latency that often leads to missed opportunities. You can send your customers the right product offer at the right time based on the context of their most recent interactions with your company, adjust the speed of an industrial machine’s motor to prevent overheating, or receive instant notifications when supplies need to be replenished. With Hazelcast, you get the performance headroom necessary to run many analytics jobs simultaneously without a huge cluster of servers, whether on-premises or in the cloud.
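The industrial-motor example above boils down to a streaming-analytics job: maintain a window over recent sensor events and act the moment an aggregate crosses a threshold. Here is a minimal plain-Python sketch of that idea (not the Hazelcast pipeline API; the class name, window size, and temperature limit are illustrative assumptions):

```python
from collections import deque

class OverheatMonitor:
    def __init__(self, window_size=5, limit_celsius=90.0):
        self.window = deque(maxlen=window_size)  # sliding window of last N readings
        self.limit = limit_celsius

    def on_reading(self, temp_celsius):
        """Process one sensor event; return an action as soon as it is warranted."""
        self.window.append(temp_celsius)
        avg = sum(self.window) / len(self.window)
        # Acting per event removes the latency of a nightly batch job.
        return "reduce_motor_speed" if avg > self.limit else None
```

A real deployment would run many such aggregations in parallel across a cluster, but the per-event decision logic is the essence of eliminating batch latency.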

In some situations, real-time doesn’t have to be instantaneous; it just needs to be efficient. Another set of use cases entails on-demand, near-real-time analytics. If you have a large set of data spread across multiple data silos but you only need to drill down into a specific subset of that data, then you can deploy Hazelcast to accept a user-initiated query, ingest and filter the relevant data from across the many data sources, index and aggregate that data, store it in-memory, then let end users run granular queries on that data. This architecture is useful for environments where it doesn’t necessarily make sense to deploy a full-blown, real-time business intelligence system due to costs and complexity. The Hazelcast advantage here is the speed of reading in data from the data sources. If the ingest/aggregation/indexing process takes Hazelcast a few seconds to complete, comparable technologies might take several minutes to be ready for analysis. And in more complex environments that might take Hazelcast a minute to complete, other technologies will take hours.
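The ingest-filter-aggregate-query flow described above can be sketched conceptually as follows. This is plain Python, not Hazelcast's SQL or pipeline API, and the record fields (`region`, `product`, `qty`) are made-up for illustration:

```python
def build_view(sources, region):
    """Ingest from several sources, filter to the requested subset,
    and aggregate into an in-memory view keyed by product."""
    view = {}
    for source in sources:              # each source is an iterable of records
        for record in source:
            if record["region"] != region:   # filter at ingest time
                continue
            view[record["product"]] = view.get(record["product"], 0) + record["qty"]
    return view

def query(view, product):
    """Granular end-user query served entirely from the in-memory view."""
    return view.get(product, 0)
```

Because the view is built on demand for one subset of the data, you avoid the cost of keeping a full BI system continuously loaded with everything.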

A final set of use cases I’d like to share here entails a data layer deployment known as a digital integration hub (DIH). A DIH fits into many types of application deployments and is especially useful when trying to improve throughput and lower latency to heavily used applications, especially customer-facing mobile apps. Instead of having these apps interface directly to the backend systems that weren’t expected to massively scale when they were built, you can use Hazelcast to store frequently accessed data that is delivered to the apps. Any data updates from the apps would be propagated to the backend systems to ensure those updates were permanently recorded. With the compute engine in Hazelcast, DIH data can be transformed, aggregated, and refreshed to optimize its operation. And you can also incorporate real-time data, especially from streaming data sources, to give customers access to more types of information.
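The DIH pattern above is essentially a read-through/write-through layer in front of the backend systems. A minimal sketch of the idea, with hypothetical names (this is not Hazelcast's MapStore API):

```python
class DigitalIntegrationHub:
    def __init__(self, backend):
        self.cache = {}          # fast in-memory layer the apps talk to
        self.backend = backend   # slower backend system of record

    def get(self, key):
        # Read-through: serve from memory, loading from the backend on a miss,
        # so heavily used apps don't hammer the backend directly.
        if key not in self.cache:
            self.cache[key] = self.backend.get(key)
        return self.cache[key]

    def put(self, key, value):
        # Write-through: update memory and propagate to the backend so the
        # update is permanently recorded.
        self.cache[key] = value
        self.backend[key] = value
```

The design choice here is that apps only ever touch the in-memory layer; the backend sees a fraction of the read traffic while still receiving every write.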

There are more use cases related to multi-version machine learning inference, large-scale simulations, and microservices state storage and messaging, so take a look at our new offering and see if you can think beyond the possible and take on the IT challenges that will ultimately let your business maintain an edge.

About the Author

Dale Kim

Sr. Director, Technical Solutions

Dale Kim is the senior director of technical solutions at Hazelcast, and is responsible for product and go-to-market strategy for the real-time stream processing platform and the Viridian cloud-managed services. His background includes technical and management roles at IT companies in areas such as relational databases, search, content management, NoSQL, Hadoop/Spark, and big data analytics. Dale holds an MBA from Santa Clara, and a BA in computer science from Berkeley.