The biggest downside of caching is the risk of returning stale data. The original data source might have been updated, but if the cache hasn't been refreshed with that new data, an older value will be returned. The question then becomes: how bad is it if you return an old value?
The TTL setting mentioned in the section above is one way to reduce the risk of stale data. The challenge then becomes how to calculate the right value. A very short TTL reduces the risk of returning stale data, but it can also erode the benefits of the cache if entries expire and are reloaded more often than the underlying data actually changes in the backend store.
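As a concrete illustration, the sketch below applies a per-entry TTL with Hazelcast's IMap API. The map name, key, value, and 30-second TTL are placeholder choices for this example, not a recommendation.

```java
import java.util.concurrent.TimeUnit;

import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.map.IMap;

public class TtlExample {
    public static void main(String[] args) {
        HazelcastInstance hz = Hazelcast.newHazelcastInstance();
        IMap<String, String> cache = hz.getMap("products");

        // This entry expires 30 seconds after being written. A shorter TTL
        // means fresher data but more frequent reloads from the backend store.
        cache.put("product-42", "Blue widget", 30, TimeUnit.SECONDS);

        hz.shutdown();
    }
}
```

A map-wide default can also be set through the map configuration (time-to-live-seconds), so individual puts don't have to specify a TTL at all.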
For systems where stale data can be detrimental, use a refresh-ahead access pattern so that any data updated in the backend store is immediately pushed into the cache, keeping cached data fresh.
As mentioned in the previous section, Hazelcast Platform has features that make a refresh-ahead cache pattern easy to support, using change data capture (CDC), distributed processing, and/or stream processing to deliver updated data to the cache in real time.
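For example, a Hazelcast stream processing job can consume a CDC feed from the backend database and apply every change directly to a map-based cache. The sketch below assumes Hazelcast's MySQL CDC connector; the connection details, the inventory.products table, the id key column, and the products map name are all illustrative, and the connector's builder methods can differ between Hazelcast versions, so treat this as an outline rather than copy-and-paste configuration.

```java
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.jet.cdc.CdcSinks;
import com.hazelcast.jet.cdc.ChangeRecord;
import com.hazelcast.jet.cdc.mysql.MySqlCdcSources;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.StreamSource;

public class RefreshAheadJob {
    public static void main(String[] args) {
        // CDC source that tails the backend database's change log
        // (placeholder connection details and table name).
        StreamSource<ChangeRecord> source = MySqlCdcSources.mysql("products-cdc")
                .setDatabaseAddress("db.example.com")
                .setDatabasePort(3306)
                .setDatabaseUser("cdc_user")
                .setDatabasePassword("cdc_password")
                .setTableWhitelist("inventory.products")
                .build();

        // Every insert, update, and delete in the backend store is applied to
        // the "products" map as it arrives, so cached entries are refreshed
        // without waiting for a TTL to expire.
        Pipeline pipeline = Pipeline.create();
        pipeline.readFrom(source)
                .withNativeTimestamps(0)
                .writeTo(CdcSinks.map("products",
                        record -> record.key().toMap().get("id"),
                        record -> record.value().toJson()));

        HazelcastInstance hz = Hazelcast.bootstrappedInstance();
        hz.getJet().newJob(pipeline);
    }
}
```

Because changes are applied as they stream in, cache entries track the backend store within the latency of the pipeline rather than drifting until the next expiration or reload.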