Caching is an important component of most applications. In-memory caching can eliminate database bottlenecks and provide predictable latency and fast response times as the user base grows. Implementing a Cache-as-a-Service across the organization allows multiple applications to access a managed in-memory cache instead of hitting slow disk-based databases.
Separating the caching layer from the application layer isolates the caching infrastructure from the applications that use it. Any application or team within the organization can add caching without worrying about the underlying cache implementation.
Hazelcast, the leading open source in-memory data grid, lets you implement a Cache-as-a-Service layer by providing a scalable, reliable and fast caching solution. Applications can use Hazelcast as a side-cache alongside their database, or hide the database behind the caching service and let Hazelcast load data from an RDBMS, a NoSQL store, or other storage.
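The side-cache pattern described above can be sketched in a few lines. This is a minimal illustration only: a ConcurrentHashMap stands in for a Hazelcast IMap, and the loader function is a hypothetical stand-in for the slow disk-based database.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Side-cache (cache-aside) sketch: the application checks the cache first
// and falls back to the database only on a miss, caching the result.
// NOTE: ConcurrentHashMap is a stand-in for a Hazelcast IMap, and
// loadFromDb is a hypothetical loader representing the backing database.
class SideCache<K, V> {
    private final Map<K, V> cache = new ConcurrentHashMap<>();
    private final Function<K, V> loadFromDb;

    SideCache(Function<K, V> loadFromDb) {
        this.loadFromDb = loadFromDb;
    }

    V get(K key) {
        // computeIfAbsent calls the database loader only on a cache miss
        return cache.computeIfAbsent(key, loadFromDb);
    }

    void put(K key, V value) {
        // the application updates the cache itself after writing to the DB
        cache.put(key, value);
    }
}
```

In a real deployment the map would live in the Hazelcast cluster, so every application server connected to the caching service sees the same entries.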
Hazelcast Caching Features
The following Hazelcast features are very useful for implementing Cache-as-a-Service.
- Native memory storage avoids garbage-collection pauses and provides predictable low latency, so applications sharing the cache do not affect each other's performance
- Java, .NET and C++ applications can use native client libraries to access the cache; applications in other languages can use Hazelcast's REST and Memcached interfaces
- Hazelcast Management Center lets you monitor cache server nodes, connected clients, and the caches in use. It also notifies you when certain cluster management operations take too long, helping to identify performance and stability issues
- WAN Replication lets you replicate the cache across multiple data centers.
- JCache and Map are two standard APIs provided by Hazelcast. JCache implements the JSR 107 standard, and the Map interface is based on java.util.Map
- MapStore and CacheStore, for the Map and JCache APIs respectively, can be used to make Hazelcast automatically load data from, and store data to, a database
- Hazelcast High-Density Memory Store (HDMS) for Near Cache (behind the JCache API) allows each app server to keep hundreds of gigabytes of cached data in local memory (versus the ~3.5 GB practical limit for on-heap data) and to access that data in microseconds instead of milliseconds.
- A Cache-as-a-Service layer can be patched or upgraded independently of the applications (for minor-version updates). Applications consuming the service don't need to upgrade in lock-step, which makes the service easier to operate and gives each application more autonomy.
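The MapStore/CacheStore idea above is a read-through/write-through contract: on a cache miss the cache itself loads from the database, and writes go through the cache to the database. The sketch below is self-contained for illustration; SimpleMapStore is a hypothetical interface mirroring a subset of Hazelcast's MapStore methods, and ReadWriteCache shows how a cache would invoke it (it is not Hazelcast's actual implementation).

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical subset of the MapStore contract: the cache calls these
// methods to keep the backing database in sync automatically.
interface SimpleMapStore<K, V> {
    V load(K key);             // called on a cache miss (read-through)
    void store(K key, V value); // called on put (write-through)
    void delete(K key);         // called on remove
}

// Sketch of a cache that drives a SimpleMapStore; a ConcurrentHashMap
// stands in for the distributed in-memory store.
class ReadWriteCache<K, V> {
    private final Map<K, V> memory = new ConcurrentHashMap<>();
    private final SimpleMapStore<K, V> store;

    ReadWriteCache(SimpleMapStore<K, V> store) {
        this.store = store;
    }

    V get(K key) {
        V value = memory.get(key);
        if (value == null) {            // miss: read through to the store
            value = store.load(key);
            if (value != null) {
                memory.put(key, value); // cache the loaded value
            }
        }
        return value;
    }

    void put(K key, V value) {
        store.store(key, value);        // write through before caching
        memory.put(key, value);
    }

    void remove(K key) {
        store.delete(key);
        memory.remove(key);
    }
}
```

Because the cache owns the loading and storing logic, applications simply read and write the map; the database behind the caching service stays consistent without any application-side plumbing.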
Products in this Use Case: