Use Cases

Real-Time Streaming Analytics

When recency and speed drive the value of your data, in-memory stream processing solutions from Hazelcast can elevate your business to new levels of performance.

Accelerate Applications With In-Memory Stream Processing from Hazelcast

Data is coming at you fast from every direction. To meet customer expectations, prevent fraud, and ensure smooth operations, batch processing simply won’t cut it. Traditional batch processing requires data sets to be completely available and stored in a database or file before processing can begin. With in-memory stream processing platforms, you can respond to data on the fly, before it is stored, enabling ultra-fast applications that process new data at the speed at which it is generated.

Use Cases for Stream Processing

Your business is a series of continually occurring events. You launch products, run campaigns, send emails, roll out new apps, interact with customers via your website, mobile applications, and payment processing systems, and close deals, for example – and the work goes on and on. Events happen in real time, and your environment is always changing. To compete, you need to be able to quickly adjust to those changes.

Your applications require the real-time capabilities and insights that only stream processing enables. You can’t rely on knowing what happened with the business yesterday or last month. You need to know, and respond to, what is happening now. Event-driven businesses depend on modern in-memory streaming applications for:

  • Real-time streaming analytics
  • Alerting
  • Fraud detection
  • Payment processing
  • IoT data capture
  • Microservices
  • Log monitoring and analysis
  • Real-time advertising

Stream processing must be both fast and scalable to handle billions of records every second. Event streams are potentially unbounded sequences of records that represent events or changes in real time.

Typical stream processing tasks include:

  • Cleaning data for downstream processing
  • Algorithmic analysis of streaming data
  • Joining multiple streams
  • Enriching streams with other information
  • Publishing notifications to subscribers
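The first two tasks above can be sketched as a small pipeline of chained generator stages. This is an illustrative Python sketch, not the Hazelcast API; the field names `user` and `amount` and the threshold are invented for the example.

```python
def clean(events):
    """Drop malformed records and normalize fields for downstream stages."""
    for e in events:
        if "user" in e and "amount" in e:
            yield {"user": e["user"].strip().lower(), "amount": float(e["amount"])}

def flag_large(events, threshold=100.0):
    """Simple algorithmic analysis: tag events that exceed a threshold."""
    for e in events:
        e["large"] = e["amount"] > threshold
        yield e

raw = [
    {"user": "  Alice ", "amount": "250.0"},
    {"user": "bob", "amount": "12.5"},
    {"malformed": True},          # dropped by the cleaning stage
]
result = list(flag_large(clean(raw)))
# result → two cleaned events, the first flagged as large
```

Because each stage consumes and yields one record at a time, the pipeline processes data as it arrives rather than waiting for a complete batch.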

Where Does Stream Processing Fit Into My Application Architecture?

The Hazelcast stream processing capabilities, built on distributed, in-memory computing technology to leverage the speed of random-access memory compared with disk, sit between event sources, such as applications and sensors, and destinations, such as an alerting system, database, or data warehouse, whether in the cloud or on-premises.

Developers build stream processing capabilities into applications with Hazelcast to capture and process data within microseconds to identify anomalies, respond to events, or publish the events to a data repository for longer-term storage and historical analyses.

With the Hazelcast stream processing capabilities, your applications can handle low-latency, high-throughput transactional processing while supporting streaming analytics, both at scale.

You can analyze streaming events in real-time, augment events with additional data before loading the data into a system of record, or power real-time monitoring and alerts.

Stream processing does not always eliminate the need for batch processing. Traditional batch processing may be necessary to provide a comprehensive view of historical data – think of BI reports, which may access data from a system of record that is much older than the data that lives in your stream processing platform. And batch processing enables organizations to leverage existing investments for use cases where the urgency of reacting to data is less important. In some architectures, the streaming analytics platform and batch processing system may sit side-by-side, or stream processing may occur prior to batch processing.

Hazelcast Stream Processing Features

Stream Processing Engine

Hazelcast provides the tooling necessary to build streaming data applications. It gives you a powerful processing framework to query the data stream and elastic in-memory storage to store the results of the computation.

Large Data Volumes

Hazelcast processing tasks, called jobs, are distributed across the cluster to parallelize the computation. Hazelcast is able to scale out to process large data volumes.
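The idea of distributing a job can be sketched by hash-partitioning events by key so that each worker processes its own shard. This is a conceptual illustration of parallelizing a computation, not how the Hazelcast engine is implemented internally.

```python
def partition(events, n_workers):
    """Route each event to a worker shard by hashing its key, so events
    with the same key always land on the same worker."""
    shards = [[] for _ in range(n_workers)]
    for e in events:
        shards[hash(e["key"]) % n_workers].append(e)
    return shards

events = [{"key": f"user-{i}", "value": i} for i in range(1000)]
shards = partition(events, 4)   # four workers share the load
```

Adding workers (scaling out) spreads the same keyspace over more shards, which is what lets a cluster absorb larger data volumes.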

Distributed In-Memory Storage

Hazelcast stream processing integrates tightly with Hazelcast in-memory storage, which can hold large amounts of data and join it to the data stream in microseconds. Latency can also be reduced by using in-memory storage for stream ingestion or publishing results.
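Joining a stream against in-memory reference data can be sketched as follows. Here a plain Python dict stands in for a distributed in-memory map, and the `customers` table and its fields are invented for the example.

```python
# In-memory reference table (in Hazelcast this would be a distributed map;
# a plain dict stands in here for illustration).
customers = {
    "c1": {"name": "Alice", "tier": "gold"},
    "c2": {"name": "Bob", "tier": "silver"},
}

def enrich(events, reference):
    """Join each streamed event with reference data held in memory."""
    for e in events:
        ref = reference.get(e["customer_id"], {})
        yield {**e, **ref}

stream = [{"customer_id": "c1", "amount": 42.0}]
out = list(enrich(stream, customers))
# out[0] now carries the customer's name and tier alongside the amount
```

Because the lookup is an in-memory read rather than a round trip to disk or a remote database, each event can be enriched in microseconds.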

Windowing

Hazelcast works with streaming data in terms of “windows,” where a window represents a slice of the data stream, usually bounded by a period of time.

Hazelcast supports tumbling, sliding, and session windows.
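The three window types can be illustrated with a minimal, self-contained sketch. This is plain Python showing how timestamps map to windows, not the Hazelcast windowing API.

```python
def tumbling_key(ts, size):
    """A tumbling window: each timestamp belongs to exactly one
    non-overlapping window of `size` time units."""
    return (ts // size) * size

def sliding_keys(ts, size, slide):
    """A sliding window: each timestamp belongs to every overlapping
    window of `size` units whose start is a multiple of `slide`."""
    first = ((ts - size) // slide + 1) * slide
    return list(range(first, ts + 1, slide))

def session_groups(timestamps, timeout):
    """A session window: consecutive events separated by gaps larger
    than `timeout` start a new session."""
    groups = []
    for ts in sorted(timestamps):
        if groups and ts - groups[-1][-1] <= timeout:
            groups[-1].append(ts)
        else:
            groups.append([ts])
    return groups

# A timestamp of 17 falls in the tumbling window starting at 10 (size 10);
# a timestamp of 5 falls in five overlapping sliding windows (size 10, slide 2).
```

Tumbling windows suit periodic reports, sliding windows suit continuously updated rolling metrics, and session windows group bursts of activity separated by idle gaps.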

Event Time and Late Events

Hazelcast can process events according to the time at which they occurred (event time) rather than the time at which they arrive, so results remain correct even when events arrive late or out of order.

Machine Learning Inference

The speed of the Hazelcast in-memory computing platform enables new levels of real-time predictive model serving in support of delivering artificial intelligence solutions, as well as enabling real-time feature engineering and model retraining.

Fault Tolerance

Hazelcast provides simple fault-tolerant streaming computation with snapshots saved in distributed in-memory storage. Jobs restart automatically from the latest snapshot, and processing resumes where it left off.
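The snapshot-and-resume idea can be sketched as follows. This is illustrative Python only; the `SnapshotStore` class stands in for Hazelcast's distributed in-memory snapshot storage, and the failure is simulated.

```python
class SnapshotStore:
    """Stands in for distributed in-memory snapshot storage."""
    def __init__(self):
        self.snapshot = None

    def save(self, state):
        self.snapshot = dict(state)

def process(events, store, fail_at=None):
    """Fold a running sum over the stream, snapshotting after each event.
    A restarted job resumes from the last snapshot instead of from zero."""
    state = dict(store.snapshot) if store.snapshot else {"offset": 0, "count": 0}
    for i in range(state["offset"], len(events)):
        if fail_at is not None and i == fail_at:
            raise RuntimeError("simulated node failure")
        state["count"] += events[i]
        state["offset"] = i + 1
        store.save(state)
    return state

store = SnapshotStore()
events = [1, 2, 3, 4, 5]
try:
    process(events, store, fail_at=3)   # fails after processing 3 events
except RuntimeError:
    final = process(events, store)      # restart resumes at offset 3
```

After the restart the job picks up at the saved offset, so no event is reprocessed and none is skipped.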

Processing Guarantees

Event processing systems must trade off performance against correctness, so a single firm guarantee does not suit every workload. Hazelcast lets you choose a processing guarantee when a job starts: no guarantee, at-least-once, or exactly-once.

Drive performance to new levels

Exceed Expectations

Impress technical and business users (and your customers) with instant response times to complex event processing requirements.

Meet Burst Requirements

Millions of people or devices hitting your system at the same time, and all expecting an instant response? This is exactly what an in-memory data grid (IMDG) is designed to handle, and Hazelcast is the industry leader in in-memory solutions.

Improved Performance

Scale and improve your performance elastically, reducing hardware requirements and driving operational efficiency.

10s of 1000s of transactions per second
High-speed streaming data from multiple sources, devices, and networks

1.2 microseconds maximum application latency
Leverage high-speed stream processing with in-memory performance