How Does Caching Work in ASP.NET Core?

Generating content served from an ASP.NET application can be slow and expensive, particularly when there’s heavy use of database calls. Relational databases are historically challenging to scale.

Implementing caching means that the application doesn’t have to fetch and process the same data again and again. It’s one of the most effective ways to improve web performance.

Caching is most beneficial when data changes infrequently, yet the content relying on that data is used frequently. If our application uses a cache to store that content, then the next time it’s requested, our application can simply retrieve it from memory. Let’s explore caching in ASP.NET Core and help you decide which type of cache is best for your needs.

Where Can We Cache?

ASP.NET Core offers two caching techniques. In-memory caching uses the server’s memory to store cached data locally, while distributed caching shares the cache across multiple servers. We’ll explore them both below.

In-Memory Cache

The simplest cache implementation in ASP.NET Core is represented by IMemoryCache. It runs in-process, so it’s fast.

A disadvantage of this single process model is that the cache data isn’t shared horizontally across scaled-out instances of the application. Because of this limitation, it’s best suited for single-server deployments.

Setting up IMemoryCache

The IMemoryCache service is automatically registered when you use any of the boilerplate extension methods such as AddMvc and AddControllersWithViews. For other scenarios, it’s available in the Microsoft.Extensions.Caching.Memory NuGet package, and you can register it as a service:

services.AddMemoryCache();

Controller constructors then request instances of IMemoryCache via dependency injection.
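
For example (a minimal sketch; the controller name is illustrative), a controller can take the cache as a constructor parameter:

public class ProductsController : Controller
{
    private readonly IMemoryCache _cache;

    // The IMemoryCache instance registered above is supplied by dependency injection.
    public ProductsController(IMemoryCache cache)
    {
        _cache = cache;
    }
}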

IMemoryCache Time-to-Live

Setting expiration times for cache entries is necessary to control cache growth. The following properties are available on MemoryCacheEntryOptions:

  • SlidingExpiration: How long an entry may go unaccessed before it’s removed; if the entry is accessed during this period, the period resets.
  • AbsoluteExpiration: A fixed expiration date and time for the cache entry.
  • AbsoluteExpirationRelativeToNow: The expiration time, relative to the current time.

Tip: Combine a sliding expiration with an absolute expiration. If you use only sliding expiration, and the item is accessed more frequently than the expiration, it may never expire, and the data could become stale. For example, we can set a 5-second sliding expiration with an absolute expiration to force data to refresh every 30 seconds:

_ = await cache.GetOrCreateAsync<string>("key1", entry =>
{
    entry.SetSlidingExpiration(TimeSpan.FromSeconds(5));
    entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(30);
    return Task.FromResult("Example Value");
});

Controlling IMemoryCache Size

The ASP.NET Core runtime doesn’t limit cache growth, so you need to control growth from within the app itself. The MemoryCacheOptions class provides a property, SizeLimit, which sets a maximum size for the cache.

The cache has no way to measure the size of the entries we might put into it, so SizeLimit isn’t expressed in bytes. If SizeLimit is set, every new entry must specify its size, and the developer must decide the unit of measurement.
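
As a minimal sketch (the limit and entry sizes below are arbitrary app-defined units, not bytes), the cache and its entries might be configured like this:

// Cap the cache at 1,024 units; what a unit means is up to the app.
services.AddMemoryCache(options => options.SizeLimit = 1024);

// Each entry must then declare its size in the same units.
cache.Set("key1", "Example Value", new MemoryCacheEntryOptions()
    .SetSize(1)
    .SetSlidingExpiration(TimeSpan.FromSeconds(5)));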

Distributed Cache

A distributed cache runs independently of any single ASP.NET Core app process, so all running instances of your web app can share cached data. This ensures that the data survives individual web app restarts and stays consistent across all instances. By contrast, where an app is horizontally scaled and each instance has its own in-memory cache, you have to do additional work to ensure client sessions stick to the same instance.

A distributed cache needs to be backed by an external data store. ASP.NET Core comes with an in-memory IDistributedCache provider, but beware – it’s mostly useful for local testing because it’s not actually distributed.

ASP.NET Core also comes with IDistributedCache providers for SQL Server and Redis. Third-party providers exist for Postgres, MySQL, and more. To add an SQL Server cache, for example, you would add the following to your app’s Startup.cs file:

services.AddDistributedSqlServerCache(options =>
{
    options.ConnectionString = @"Distributed cache connection string";
    options.SchemaName = "dbo";
    options.TableName = "ExampleCacheTable";
});

It’s also relatively easy to write your own IDistributedCache implementations.

Much of the interface is similar to IMemoryCache, including the same entry expiration configuration options (under the DistributedCacheEntryOptions class).
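
For instance (a minimal sketch, assuming cache is an injected IDistributedCache instance), storing and reading a string value with the same sliding and absolute expirations as before looks like this:

var options = new DistributedCacheEntryOptions
{
    SlidingExpiration = TimeSpan.FromSeconds(5),
    AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(30)
};

// SetStringAsync and GetStringAsync are convenience extensions; the underlying
// interface works with byte arrays.
await cache.SetStringAsync("key1", "Example Value", options);
var value = await cache.GetStringAsync("key1"); // null if the entry is missing or expired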

Session Cache

ASP.NET Core’s Session cache is similar to the caching options we’ve already discussed. But there’s one significant difference: the data it stores is private to the user making the current web request, not shared among all users.

In a multi-server deployment (without sticky sessions forcing a client to use the same server), each request a client makes might hit a different instance of the web app. For a seamless experience, a user’s session state needs to be available on any of the instances.

The Microsoft.AspNetCore.Session package, which manages session state, is included by default in ASP.NET Core applications. To enable this middleware, modify Startup.cs as shown below.

ASP.NET Core session storage must be backed by an IDistributedCache provider, and any implementation will do. In the example below, we use the in-memory distributed cache. But remember, this won’t give us the truly distributed session store we’re after.

public void ConfigureServices(IServiceCollection services)
{
    // ...
    services.AddDistributedMemoryCache();
    services.AddSession(options =>
    {
        options.Cookie.HttpOnly = true;
        options.Cookie.IsEssential = true;
        options.IdleTimeout = TimeSpan.FromSeconds(30); // Short for testing
    });
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // ...
    app.UseSession();
}

If you’re setting this up for the first time, be sure to check Microsoft’s documentation on middleware ordering.

The session data is accessible from controllers and Razor Pages through the HttpContext.Session object. We can add values to it directly. For example, we might add a display name:

HttpContext.Session.SetString("_DisplayName", displayName);

All session data must be serialized before it’s stored. String and integer helpers are available out of the box, but complex types must be serialized by the application itself, typically as JSON.
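
As a minimal sketch (the UserProfile type and the _Profile key are illustrative; we use System.Text.Json here), a complex value could be stored and read back like this:

public class UserProfile
{
    public string DisplayName { get; set; }
    public string Theme { get; set; }
}

// Serialize the object to JSON before storing it in the session.
var profile = new UserProfile { DisplayName = displayName, Theme = "dark" };
HttpContext.Session.SetString("_Profile", JsonSerializer.Serialize(profile));

// Deserialize it when reading it back; GetString returns null if the key isn't set.
var json = HttpContext.Session.GetString("_Profile");
var restored = json == null ? null : JsonSerializer.Deserialize<UserProfile>(json);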

Which Cache is Best?

It’s usually best to go with a caching solution that can grow with your application.

Your best bet is to default to using IDistributedCache. You can even start by just running your cache on a single instance with the in-memory provider. By working with abstractions, you’ll be able to seamlessly swap in alternative providers whenever the need arises.
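
Because the rest of your code depends only on the IDistributedCache abstraction, swapping providers is typically just a change to the service registration. For example (a sketch; the connection string is illustrative, and the Redis provider comes from the Microsoft.Extensions.Caching.StackExchangeRedis package):

// Single-instance development: in-memory provider.
services.AddDistributedMemoryCache();

// Scaled out later: swap in Redis without touching the consuming code.
services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";
    options.InstanceName = "ExampleApp:";
});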

As the demands on your application grow and you need to scale out horizontally, you can switch to using a database like SQL Server as an intermediate step. Later on, when usage increases further, you can easily change your provider again, turning to an external cache such as Redis, which is single-threaded, or Hazelcast, which is fully multi-threaded to deliver maximum performance.

You’re not limited to using an existing implementation of IDistributedCache either. You can wrap a tool like Hazelcast in your own implementation, giving your ASP.NET Core application drop-in access to an extremely fast in-memory database for caching. It’s a straightforward interface with async and sync methods for getting, refreshing, removing, and setting cache entries with sliding or absolute expirations. All you need to do is create a class that implements all the methods in the IDistributedCache interface.
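
As a sketch of what that might look like (the IMyCacheClient interface below is a hypothetical stand-in for whatever client library you wrap, such as the Hazelcast .NET client):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

// Hypothetical backing-store client; replace with the real client of your choice.
public interface IMyCacheClient
{
    Task<byte[]> GetBytesAsync(string key);
    Task SetBytesAsync(string key, byte[] value, TimeSpan? timeToLive);
    Task TouchAsync(string key);   // resets a sliding expiration
    Task RemoveAsync(string key);
}

public class MyDistributedCache : IDistributedCache
{
    private readonly IMyCacheClient _client;

    public MyDistributedCache(IMyCacheClient client) => _client = client;

    public byte[] Get(string key) => GetAsync(key).GetAwaiter().GetResult();

    public Task<byte[]> GetAsync(string key, CancellationToken token = default) =>
        _client.GetBytesAsync(key);

    public void Set(string key, byte[] value, DistributedCacheEntryOptions options) =>
        SetAsync(key, value, options).GetAwaiter().GetResult();

    public Task SetAsync(string key, byte[] value, DistributedCacheEntryOptions options,
        CancellationToken token = default)
    {
        // Translate the ASP.NET Core expiration options into the store's TTL concept.
        TimeSpan? ttl = options.AbsoluteExpirationRelativeToNow ?? options.SlidingExpiration;
        return _client.SetBytesAsync(key, value, ttl);
    }

    public void Refresh(string key) => RefreshAsync(key).GetAwaiter().GetResult();

    public Task RefreshAsync(string key, CancellationToken token = default) =>
        _client.TouchAsync(key);

    public void Remove(string key) => RemoveAsync(key).GetAwaiter().GetResult();

    public Task RemoveAsync(string key, CancellationToken token = default) =>
        _client.RemoveAsync(key);
}

You would then register your class in place of the built-in providers, for example with services.AddSingleton<IDistributedCache, MyDistributedCache>().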

This opens the door to additional possibilities. When you’re ready to move beyond using Hazelcast purely as a cache, you can use the Hazelcast .NET client to access more advanced functionality, including a variety of high-performance distributed data structures.

Any cache implementation you use, whether backed by SQL Server, Redis, Hazelcast, or another backend, will support many more operations than get, refresh, remove, and set. If your use case requires advanced features or maximum performance, you might skip using IDistributedCache completely and inject your custom cache implementation into your MVC controllers and Razor Pages. You’ll lose the ability to seamlessly swap cache implementations, but when you’re operating at scale, performance takes precedence.

When you’re ready to set up your cache and want to give Hazelcast a try, download and get started for free.