Key Objectives
- Understand the different caching options available in .NET.
- Learn about cache eviction strategies and when to use them.
- Discover best practices for choosing cache keys.
- Explore how to measure cache efficiency using instrumentation and logging.
- See practical code samples for real-world caching scenarios.
Why Caching Matters
Caching is a fundamental technique for improving application performance and scalability. By storing frequently accessed data in a fast-access layer, you reduce database load, speed up response times, and can even improve reliability during transient back-end failures.
Fetching data from the cache avoids the network round trip to the database and the query execution time, which speeds up the overall response.
Caching Options in .NET
.NET provides several built-in caching mechanisms:
- In-Memory Cache (IMemoryCache): Stores data in the memory of the application process. Fast, but data is lost on restart and not shared across servers.
- Distributed Cache (IDistributedCache): Supports external cache stores like Redis or SQL Server. Data is shared across multiple app instances and survives restarts.
- Response Caching: Caches HTTP responses at the middleware level for web APIs.
- Output Caching (ASP.NET Core 7+): Caches the output of controllers or Razor pages.
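To use these options, the corresponding services have to be registered at startup. Here is a minimal Program.cs sketch, assuming ASP.NET Core 7+ and the Microsoft.Extensions.Caching.StackExchangeRedis package; the Redis connection string is a placeholder:
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddMemoryCache();                      // IMemoryCache
builder.Services.AddStackExchangeRedisCache(options =>  // IDistributedCache backed by Redis
{
    options.Configuration = "localhost:6379";           // placeholder connection string
});
builder.Services.AddResponseCaching();                  // response caching middleware
builder.Services.AddOutputCache();                      // output caching (ASP.NET Core 7+)

var app = builder.Build();

app.UseResponseCaching();
app.UseOutputCache();

app.Run();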
Cache Eviction Strategies
Eviction determines when items are removed from the cache. Common strategies in .NET:
- Absolute Expiration: The item is removed after a fixed time, regardless of how often it is accessed.
_memoryCache.Set(key, value, TimeSpan.FromMinutes(10));
- Sliding Expiration: The item stays in the cache as long as it is accessed within the sliding window.
_memoryCache.Set(key, value, new MemoryCacheEntryOptions
{
    SlidingExpiration = TimeSpan.FromMinutes(5)
});
- Manual Eviction: Remove items explicitly when the underlying data changes.
_memoryCache.Remove(key);
- Size-based Eviction: Set a SizeLimit on the cache and give each entry a Size; once the limit is reached, new entries are rejected until the cache compacts, removing entries by priority and then by least recent use (see the sketch below).
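A minimal sketch of size-based eviction, assuming a standalone MemoryCache instance; the SizeLimit unit is whatever you decide it means (entries, kilobytes, etc.):
// The cache tracks a total size you define; every entry must declare its Size
// when a SizeLimit is set, otherwise Set throws.
var cache = new MemoryCache(new MemoryCacheOptions
{
    SizeLimit = 100 // total units the cache may hold
});

cache.Set("employee_123", "cached value", new MemoryCacheEntryOptions
{
    Size = 1, // this entry counts as 1 unit toward the limit
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
});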
Choosing Cache Keys
Cache keys should be:
- Unique: Represent the data being cached (e.g., "user_123" for the user with ID 123).
- Consistent: Use a predictable pattern for easy invalidation.
- Scoped: Include context if needed (e.g., "product_{id}_details").
Example:
string cacheKey = $"employee_{employeeId}";
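One way to keep keys consistent is to build them in a single helper rather than formatting strings inline; the class and method names below are purely illustrative:
public static class CacheKeys
{
    public static string Employee(int employeeId) => $"employee_{employeeId}";
    public static string ProductDetails(int productId) => $"product_{productId}_details";
}

// Usage: the read path and the invalidation path build the exact same key.
string cacheKey = CacheKeys.Employee(employeeId);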
Measuring Cache Efficiency
To know if your cache is effective, instrument your code to log:
- Cache Hits: Data was found in the cache.
- Cache Misses: Data was not found and had to be loaded.
- Evictions: When items are removed due to expiration or manual removal.
- Retrieval Times: Time taken to fetch from the cache vs. the database.
Sample Instrumentation:
var stopwatch = Stopwatch.StartNew();

if (_memoryCache.TryGetValue(key, out var value))
{
    _logger.LogInformation("Cache hit for {Key} at {Time}", key, DateTime.UtcNow);
}
else
{
    _logger.LogInformation("Cache miss for {Key} at {Time}", key, DateTime.UtcNow);
    value = await LoadFromDatabaseAsync();
    _memoryCache.Set(key, value, TimeSpan.FromMinutes(10));
}

stopwatch.Stop();
_logger.LogInformation("Retrieval for {Key} took {Elapsed} ms", key, stopwatch.ElapsedMilliseconds);
Advanced:
For distributed caches like Redis, you can also rely on the server's own metrics (Redis, for example, reports keyspace hits and misses) or libraries such as App Metrics for more detailed analysis.
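The same hit/miss/timing pattern applies to IDistributedCache. A sketch, assuming an injected IDistributedCache field (_distributedCache), System.Text.Json for serialization, and the Employee type and hypothetical LoadEmployeeFromDb helper from the example below:
var stopwatch = Stopwatch.StartNew();
Employee? employee;

string? json = await _distributedCache.GetStringAsync(key);
if (json is not null)
{
    _logger.LogInformation("Distributed cache hit for {Key}", key);
    employee = JsonSerializer.Deserialize<Employee>(json);
}
else
{
    _logger.LogInformation("Distributed cache miss for {Key}", key);
    employee = await LoadEmployeeFromDb(employeeId);
    await _distributedCache.SetStringAsync(key, JsonSerializer.Serialize(employee),
        new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
        });
}

stopwatch.Stop();
_logger.LogInformation("Retrieval for {Key} took {Elapsed} ms", key, stopwatch.ElapsedMilliseconds);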
Example: In-Memory Cache with Eviction and Instrumentation
public class EmployeeCacheService
{
    private readonly IMemoryCache _memoryCache;
    private readonly ILogger<EmployeeCacheService> _logger;

    public EmployeeCacheService(IMemoryCache memoryCache, ILogger<EmployeeCacheService> logger)
    {
        _memoryCache = memoryCache;
        _logger = logger;
    }

    // Returns the employee from the cache when present; otherwise loads it
    // from the database and caches it for 10 minutes.
    public async Task<Employee> GetEmployeeAsync(int employeeId)
    {
        string key = $"employee_{employeeId}";
        var stopwatch = Stopwatch.StartNew();

        if (_memoryCache.TryGetValue(key, out Employee employee))
        {
            _logger.LogInformation("Cache hit for {Key} at {Time}", key, DateTime.UtcNow);
        }
        else
        {
            _logger.LogInformation("Cache miss for {Key} at {Time}", key, DateTime.UtcNow);
            employee = await LoadEmployeeFromDb(employeeId);
            _memoryCache.Set(key, employee, new MemoryCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
            });
            _logger.LogInformation("Cache set for {Key} at {Time}", key, DateTime.UtcNow);
        }

        stopwatch.Stop();
        _logger.LogInformation("Retrieval for {Key} took {Elapsed} ms", key, stopwatch.ElapsedMilliseconds);
        return employee;
    }

    public void EvictEmployee(int employeeId)
    {
        string key = $"employee_{employeeId}";
        _memoryCache.Remove(key);
        _logger.LogInformation("Cache evicted for {Key} at {Time}", key, DateTime.UtcNow);
    }

    // Placeholder for the real data-access call (EF Core, Dapper, etc.).
    private Task<Employee> LoadEmployeeFromDb(int employeeId)
    {
        // Replace with your repository or DbContext query.
        throw new NotImplementedException();
    }
}
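Registering the service and invalidating on writes could look like the sketch below; the repository and its UpdateAsync method are illustrative placeholders:
// Program.cs
builder.Services.AddMemoryCache();
builder.Services.AddScoped<EmployeeCacheService>();

// In an update path, evict so the next read repopulates the cache with fresh data.
await _employeeRepository.UpdateAsync(employee);
_employeeCacheService.EvictEmployee(employeeId);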
Summary
- .NET offers several caching options: in-memory, distributed (Redis, SQL), and response/output caching.
- Choose the right eviction strategy for your scenario: absolute, sliding, manual, or size-based.
- Use clear, unique cache keys for easy management and invalidation.
- Instrument your cache usage to measure hits, misses, evictions, and performance.
- Caching is a powerful tool—use it wisely to boost your app’s speed and scalability!
What caching strategies have worked best for your .NET projects? Share your experience in the comments!