With an event-driven architecture, applications can respond to data as it is generated. The event-driven approach has become extremely popular in recent years thanks to both the explosion in data sources that generate events (IoT sensors, for example) and the development and adoption of technologies that process event streams, such as Hazelcast Jet and Apache Kafka®. The event-driven approach enables businesses to think of their operations and the data they generate as a series of ongoing events, rather than as a handful of metrics on a weekly or quarterly report.
As an example, consider the fictional characters Tina and Tara, who have lived at several addresses over the years. In a traditional batch-based design, where updates overwrite data in place, an address lookup would return only the address currently on record.
With an event-driven architecture, we can still ask: “What is Tina and Tara’s address now?”
And we can also ask: “What was Tina and Tara’s address in 2014?”
Or: “What was Tina’s last address before she shared an address with Tara?”
Event-driven applications are common for use cases including IoT, fraud detection, payment processing, website monitoring, and real-time marketing. Event-driven applications often treat data as immutable, or unchangeable, which makes it easy to look up the values of data at previous points in time. Whenever information “changes,” what actually happens is that a new data point is created with a new timestamp; the old data point is preserved.
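The address questions above can be sketched as an append-only event log. This is a minimal illustration, not a production pattern; the dates and addresses are hypothetical, invented for the example:

```python
from datetime import date

# Append-only log of address events: each "change" appends a new record
# with its own timestamp, never an update in place.
# All dates and addresses below are hypothetical.
address_events = [
    ("Tina", date(2012, 3, 1), "12 Oak St"),
    ("Tara", date(2012, 6, 1), "77 Pine Ave"),
    ("Tina", date(2013, 9, 1), "4 Elm Rd"),
    ("Tina", date(2015, 5, 1), "9 Maple Ct"),  # Tina moves in with Tara
    ("Tara", date(2015, 5, 1), "9 Maple Ct"),
]

def address_as_of(person, as_of):
    """Return the most recent address for `person` on or before `as_of`."""
    result = None
    for p, d, addr in sorted(e for e in address_events if e[0] == person):
        if d <= as_of:
            result = addr
    return result

# "What was Tina's address in 2014?"
print(address_as_of("Tina", date(2014, 1, 1)))   # 4 Elm Rd
# "What is Tina's address now?"
print(address_as_of("Tina", date(2016, 1, 1)))   # 9 Maple Ct
```

Because no event is ever deleted or overwritten, the same log answers both the “now” and the “in 2014” questions with the same lookup logic.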
Not all events necessarily trigger an action by the application. Consider the case of IoT sensor data. In an application looking for anomalies in the sensor data, there may be millions of non-anomalous events that occur without ever triggering the application to act.
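A minimal sketch of this pattern, assuming a hypothetical anomaly threshold: every sensor event flows through the handler, but only readings outside the normal range trigger an action.

```python
THRESHOLD = 100.0  # hypothetical anomaly threshold for this sketch

def handle_event(reading, actions):
    """Process one sensor reading; act only if it is anomalous."""
    if reading > THRESHOLD:
        actions.append(f"alert: anomalous reading {reading}")
    # Non-anomalous events are still processed, but trigger nothing.

actions = []
for reading in [42.0, 57.3, 101.5, 63.8]:  # hypothetical event stream
    handle_event(reading, actions)

print(actions)  # only the one anomalous event produced an action
```

In a real deployment the stream would arrive from a platform such as Apache Kafka or Hazelcast Jet rather than a Python list, but the shape is the same: millions of events may pass through the handler while only a handful cause the application to act.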