
Improving Performance with In-Memory Caching of a .NET Core Web API

Welcome to today’s post.

In today’s post I will be showing you how to improve the performance of a .NET Core Web API application using one of the several methods that are at our disposal.

The method I will show involves in-memory caching. In-memory caching may not be suitable in a single-server environment where the API services compete for limited memory resources, so it should be used where there is a high percentage of available memory.

Explaining Cached Content and Cached Memory

The cached content is stored in and retrieved from the web server’s memory, specifically the memory of the application’s own process, so the cache is shared across all requests handled by that web application or web API service instance. It is not, however, shared across other applications or servers; that is where distributed caching, covered in a future post, comes in.

The benefits of caching content from an API are as follows:

  • Increase in scalability of the service.
  • Increase in performance.

Cached content is suitable for data that changes infrequently, such as lookup reference data. Data which is transactional in nature is not suited to caching, as the cached copy quickly becomes stale relative to the backend data.

The greatest performance gains in an application are realized where the cached response data does not change often in the backend.

Using In-Memory Caching in a .NET Core Application

To use in-memory caching we will need to install the following NuGet package in Visual Studio:

Microsoft.Extensions.Caching.Memory

The version of the NuGet package to install will vary depending on the version of the .NET framework you are targeting.
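If you prefer the command line to the Visual Studio package manager, the same package can be added with the dotnet CLI:

dotnet add package Microsoft.Extensions.Caching.Memory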

After installation of the package, the only configuration needed is to register the in-memory cache service with the dependency injection container (in some ASP.NET Core project templates this is already done for you by other framework services). Once registered, the IMemoryCache interface and its methods are available to any web application or web API service class into which it is injected.
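Below is a minimal registration sketch, assuming the usual Startup.ConfigureServices() method of an ASP.NET Core project:

public void ConfigureServices(IServiceCollection services)
{
    // Registers IMemoryCache with the dependency injection container.
    services.AddMemoryCache();

    services.AddControllers();
}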

To use the in-memory cache we can inject it through a constructor as follows:

private readonly IMemoryCache _memoryCache;

public BookController(IMemoryCache memoryCache, … )
{
    _memoryCache = memoryCache;
    …
}

In the next section, I will show how to use the in-memory cache API to retrieve and set entries within a cache.

Using the In-Memory Cache API

To make use of in-memory caching, we work through the in-memory cache API. Two useful methods retrieve and set an entry within a cache: TryGetValue<T>() and Set<T>(). The retrieval method has the following signature:

bool TryGetValue<T>(object key, out T value)

When an entry of type T stored under the given key can be retrieved from the cache, the above method returns true and the entry is assigned to the out parameter value.

There are situations where a value cannot be retrieved, namely when:

  • The key name does not exist in the cache’s key collection.
  • The entry’s absolute expiry date and time has passed.
  • The entry’s sliding expiry date and time has passed.

If the entry cannot be retrieved, then we set the cache key to the latest data from the backend data source. Below is how we attempt to retrieve the cached entry and, when we cannot, proceed to set it:

List<BookViewModel> listEntries;

try
{
    // Attempt to retrieve the cached book list.
    if (!_memoryCache.TryGetValue<List<BookViewModel>>(
        "_BookListEntries", out listEntries))
    {
        // Cache miss: load the latest data from the backend.
        var books = await _bookService.GetBooks();

        // Expire the entry three minutes from now.
        var cacheEntryOptions = new MemoryCacheEntryOptions()
            .SetAbsoluteExpiration(DateTime.Now.AddMinutes(3));

        _memoryCache.Set<List<BookViewModel>>(
            "_BookListEntries",
            books,
            cacheEntryOptions
        );

        return books;
    }
    return listEntries;
}
catch (Exception ex)
{
    throw new GeneralException(ex.Message);
}
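As an aside, the same get-or-set pattern can be written more compactly with the GetOrCreateAsync() extension method. Below is a minimal sketch of the equivalent call, assuming the same _bookService and cache key as above:

var books = await _memoryCache.GetOrCreateAsync(
    "_BookListEntries",
    async cacheEntry =>
    {
        // Set the entry's expiry before it is populated.
        cacheEntry.AbsoluteExpiration = DateTime.Now.AddMinutes(3);

        // The factory delegate only runs on a cache miss,
        // so the backend is queried only when needed.
        return await _bookService.GetBooks();
    });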

In the next section, I will show how to control the expiry of cache entries.

Configuring Cache Expiry Options

We can also configure cached entry options. There are two common expiration options we can configure, and these are:

  • Absolute expiration.
  • Sliding expiration.

Now, absolute expiration is when the expiry of the cache entry is set to a fixed date and time; the entry is evicted at that time regardless of how often it is accessed in the meantime. The absolute expiration is set to three minutes in the excerpt below:

var cacheEntryOptions = new MemoryCacheEntryOptions()
    .SetAbsoluteExpiration(DateTime.Now.AddMinutes(3));

The sliding expiration option resets the expiry date time whenever the cache is accessed. The start of the new expiry date time is set to the date time when the cache was accessed, and the end of the expiry date time is set to the start plus the duration of the sliding expiry.

Below is an example where we reset the sliding expiry to three minutes:

var cacheEntryOptions = new MemoryCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(3));
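Note that a sliding expiration on its own can keep a frequently accessed entry alive indefinitely. Where that is a concern, a common approach is to combine the two options so the entry slides with use but is still evicted after a fixed maximum lifetime. A minimal sketch, with an assumed twenty-minute cap:

var cacheEntryOptions = new MemoryCacheEntryOptions()
    // Evict after 3 minutes without access...
    .SetSlidingExpiration(TimeSpan.FromMinutes(3))
    // ...but never keep the entry longer than 20 minutes in total.
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(20));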

When we run our application and the cache is set, inspecting the _memoryCache variable in the debugger shows that the cache contains one entry: the list of books retrieved from our backend data, stored under our key. The value of the key consists of the list of book records, and the entry’s absolute expiration, along with internal fields such as _lastExpirationScan, can also be seen. If we wait until the expiry date time has passed and re-inspect the memory cache, we notice that the count of cache entries is zero.

Once this occurs, the next call to TryGetValue() returns false and we are required to reset the cache key from the backend data.

An additional configuration we can make is to limit the total size of the cache and the size of individual entries within it. Limiting the total size is done with the SizeLimit property of MemoryCacheOptions when the cache service is registered. This is useful when we want to prevent the cache from getting too large and consuming more memory than we need. The size of each entry is declared with the SetSize() method on MemoryCacheEntryOptions; the cache cannot measure entry sizes itself, so the size units are a convention we choose.
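Below is a minimal sketch of both settings; the limit of 100 size units and the size of 1 unit per entry are illustrative assumptions:

// When registering the cache, cap it at 100 size units.
services.AddMemoryCache(options =>
{
    options.SizeLimit = 100;
});

// With a size limit set, each entry must declare its own size.
var cacheEntryOptions = new MemoryCacheEntryOptions()
    .SetSize(1)
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(3));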

We have seen how to manage and control performance using in-memory caching in a .NET Core application. Where infrequently changing lookup data is stored and retrieved, performance gains are achieved, with scalability benefits in environments that have sufficient available memory.

That is all for today’s post.

In a future post I will explain how to implement distributed caching, which can increase the scalability of our application or API even further and allow us to use our caching within the cloud. I hope you found this post useful and informative.
