
The Data Processing Challenge for Continuous Intelligence

February 04, 2019

The fundamental lesson of Charles Darwin’s theory of natural selection is that adaptation is the root of a species’ continued existence. The digital age presents businesses with the same challenge: today’s businesses will survive or gradually die out based on whether they adapt to the new reality.

Adaptation is a large concept. It is not the product of a technology refresh, nor is it achieved by “lift and shift” movements of workloads to the Cloud. To adapt is to evolve based on a fundamentally changed worldview. The adapters will be the new winners.

As I’ve shared in the past, digital transformation in practical terms is about making an organization “data capable” – enabling the organization to engage the groundswell of digital data and continuously refine it into a byproduct that can inform decisions or even take action instantly.

This capability is increasingly becoming known as “continuous intelligence” because this isn’t a one-and-done event. Continuity of survival is directly correlated with the immediate continuity of intelligence in an ever-changing environment.

Gartner has identified six attributes of continuous intelligence:

  • Fast: Everything must be in real-time
  • Smart: Feeding all data into analytics platforms and machine learning
  • Automated: Limiting or eliminating human intervention to ensure insights are untainted
  • Always-on: 24x7x365 (or 366) days a year, no excuses
  • Embedded: Integrated into the business, as well as the applications it supports
  • Outcome focused: Responding in a timely manner and delivering measurable value

Delivering on these attributes requires a fresh look at data processing technologies and their ability to power adaptation.

For decades the central data processing engines have been databases – our Systems of Record. While recent years have seen an expansion in the type of databases in use, they all optimize data preservation over super-fast continuous processing. After all, what good is a database that loses data? We need our Systems of Record to do what they were designed for; we should not expect them to serve the new needs of continuous intelligence.

This reality is driving the rapid adoption of a new layer of data processing in IT architectures. This new layer is complementary to Systems of Record and is uniquely suited for the challenge at hand. Its design point is speed and scale of data processing, and it relies upon the Systems of Record to do the age-old job of data preservation (even if the data types are new-age).

This new data processing layer can be deployed centrally and be pushed to the edge; it can generate insights and transact; it can train machine-learning models and execute algorithms; it can process data at rest (finite data) as well as data in-flight (infinite data). Ultimately, it is the power behind continuous intelligence, making it the enabler of adaptability.

The main product categories that comprise this new layer or “platform” conduct their high-speed data processing jobs in-memory. For data at rest, the tool is an In-Memory Data Grid; for data in-flight, it’s Stream Processing. Both conduct their jobs at near light-speed; if the data then needs to be preserved, the layer pushes it to the Systems of Record asynchronously, where it can be stored at their slower speeds.
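The asynchronous hand-off described above is commonly called a write-behind pattern: reads and writes are served from memory, while persistence to the slower System of Record happens off the hot path. A minimal sketch in plain Python illustrates the idea (the `WriteBehindCache` and `SlowStore` names here are illustrative stand-ins, not a real in-memory data grid API):

```python
import queue
import threading
import time

class SlowStore:
    """Stand-in for a System of Record: durable but slow."""
    def __init__(self):
        self.data = {}

    def save(self, key, value):
        time.sleep(0.01)           # simulate a slow, durable write
        self.data[key] = value

class WriteBehindCache:
    """In-memory map that persists to a backing store asynchronously."""
    def __init__(self, store):
        self._mem = {}
        self._store = store
        self._pending = queue.Queue()
        threading.Thread(target=self._drain, daemon=True).start()

    def put(self, key, value):
        self._mem[key] = value     # fast path: memory only
        self._pending.put((key, value))  # durability happens off the hot path

    def get(self, key):
        return self._mem.get(key)  # reads never touch the slow store

    def _drain(self):
        while True:
            key, value = self._pending.get()
            self._store.save(key, value)
            self._pending.task_done()

    def flush(self):
        self._pending.join()       # block until queued writes are durable

store = SlowStore()
cache = WriteBehindCache(store)
cache.put("order-42", {"total": 99.95})
print(cache.get("order-42"))       # served from memory immediately
cache.flush()
print(store.data["order-42"])      # now durable in the slow store
```

The caller sees microsecond-scale puts and gets against the in-memory map, while the background thread absorbs the latency of the durable store, which is the essential trade the paragraph describes.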

These in-memory data processing technologies were born out of the necessity for speed, scale and insights. They’re lightweight, embeddable and measure their performance in microseconds. When the two are combined they create an extremely powerful platform capable of handling data at rest and in-motion. They are the new data processing foundation for continuous intelligence.
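Stream processing tames “infinite” data by aggregating over bounded windows of time. A minimal tumbling-window average in plain Python shows the shape of the computation (the event tuples and 500 ms window are illustrative assumptions; real stream processors run this incrementally over a live feed rather than over a finished list):

```python
from collections import defaultdict

def tumbling_window_avg(events, window_ms):
    """Group (timestamp_ms, key, value) events into fixed-size time
    windows and compute the average value per key in each window."""
    windows = defaultdict(list)
    for ts, key, value in events:
        windows[(ts // window_ms, key)].append(value)
    return {w: sum(vs) / len(vs) for w, vs in sorted(windows.items())}

events = [
    (100, "sensor-a", 10.0),
    (450, "sensor-a", 20.0),
    (900, "sensor-a", 30.0),   # falls into the second 500 ms window
    (300, "sensor-b", 5.0),
]
print(tumbling_window_avg(events, window_ms=500))
# {(0, 'sensor-a'): 15.0, (0, 'sensor-b'): 5.0, (1, 'sensor-a'): 30.0}
```

Each window closes and emits a result while later events are still arriving, which is how a continuous stream yields a continuous series of finite, actionable answers.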

Enterprises at the forefront of digital adaptation are already running this new data processing layer in production systems. The world’s largest Financial Services and eCommerce companies, as well as Communication Service Providers, have all introduced this new data layer as they adapt to the challenge of continuous intelligence. With Gartner showing these technologies moving through the hype cycle and into maturity, others are following their lead.

The opportunity to adapt is here today. The time to adapt is limited.

About the Author

Kelly Herrell

Chief Executive Officer

Kelly has led the growth of four innovative companies from early stage to market-leading entities, covering a broad span of compute and networking. At Hazelcast, Kelly brings his unique experience of driving high-value innovation, including open-source models, into the infrastructure of the world's largest customers. Prior to joining Hazelcast in July 2018, he was SVP & GM of the Software Business Unit at Brocade Communications, the result of Brocade's acquisition of his former company, Vyatta, which pioneered software-defined networking and delivered the most widely used software networking operating system in the world. Kelly has a Bachelor's degree from Washington State University where he is an advisor to the Honors program and an MBA from Cornell University where he is a guest lecturer.
