How In-Memory Computing Enables a New Generation of Microservices

Webinar / Video / 60 min

The first generation of microservices was envisioned as stateless request-response endpoints, but it is now clear that microservices must often maintain state. For example, microservices that run machine learning models or perform statistical classification must maintain their models and parameter weights. This raises one of the biggest challenges: where is that state stored? Options such as RDBMSs are too slow, do not scale, and have inflexible schema models. Distributed in-memory caching, however, is the only widely adopted enterprise technology that offers high speed, scalability, and dynamic schema evolution.
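To make the idea concrete, here is a minimal sketch of the pattern described above: a classification microservice that keeps its model weights in a shared in-memory cache instead of local process state or a database. The `InMemoryCache` class below is a stand-in for a real distributed cache client, and the `put`/`get` interface, key name, and linear scoring model are assumptions for illustration only.

```python
class InMemoryCache:
    """Stand-in for a distributed in-memory cache client (illustrative only)."""
    def __init__(self):
        self._store = {}

    def put(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)


class ClassifierService:
    """A stateful microservice whose model weights live in the shared cache,
    so any replica can serve requests against the current model."""
    def __init__(self, cache, model_key="classifier:weights"):
        self.cache = cache
        self.model_key = model_key  # hypothetical cache key

    def update_weights(self, weights):
        # Retraining publishes new weights to the shared cache,
        # rather than holding them in one process's memory.
        self.cache.put(self.model_key, weights)

    def score(self, features):
        # Each request reads the current weights from the cache;
        # a simple linear score keeps the example self-contained.
        weights = self.cache.get(self.model_key)
        return sum(w * x for w, x in zip(weights, features))


cache = InMemoryCache()
svc = ClassifierService(cache)
svc.update_weights([0.5, -0.25, 1.0])
print(svc.score([2.0, 4.0, 1.0]))  # 0.5*2 - 0.25*4 + 1.0*1 = 1.0
```

Because the weights are keyed in the cache rather than held in a single process, any number of service replicas can read the same model, and an update from one replica is immediately visible to the others.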

In this webinar, we will discuss:

  • Why today’s business solutions need a next-generation microservices architecture.
  • Why microservices need to leverage in-memory computing technologies.
  • How you can get started with next-generation microservices.
