"Nawa ooo, this thing is taking really long πͺ"; I heard this statement last week from a frustrated user who was using a certain app.
In modern mobile applications, managing API requests efficiently is crucial for providing a smooth user experience. Network requests are often slow and battery-intensive, and users may find themselves in situations with poor connectivity. This is where implementing a proper caching strategy becomes essential.
In this article, we'll explore how to build a custom caching interceptor for the popular Dio HTTP client in Flutter. Our solution will allow for flexible cache durations and seamless fallback to cached data when needed.
## The Problem
Consider these common scenarios in mobile apps:
- A user opens your app in an area with poor connectivity
- Your backend service experiences temporary downtime
- You want to reduce unnecessary API calls for data that changes infrequently
- You need to optimize battery usage by minimizing network operations
Without caching, each of these scenarios leads to a degraded user experience. Users might see endless loading indicators, error messages, or stale data.
## Enter the Caching Interceptor
Dio, one of the most popular HTTP clients for Flutter, provides an elegant way to intercept and modify requests and responses through its interceptor system. We can leverage this to implement a robust caching mechanism.
Here's our custom `CacheInterceptor` implementation:
```dart
import 'dart:convert';

import 'package:dio/dio.dart';

import 'storage_interface.dart';

class CacheInterceptor implements Interceptor {
  final CacheStorage storage;
  final String dataKey = 'data';
  final String timeStampKey = 'timestamp';

  CacheInterceptor(this.storage);

  Future<Response?> _getCachedResponse(
    RequestOptions options, {
    required Duration duration,
  }) async {
    try {
      final cacheKey = '${options.uri}';
      final cacheEntry = await storage.get(cacheKey);
      if (cacheEntry == null) return null;

      final Map<String, dynamic> decodedCache = jsonDecode(cacheEntry);
      final cachedTime = DateTime.tryParse(decodedCache[timeStampKey]);
      if (cachedTime == null) return null;

      final timeInterval = DateTime.now().difference(cachedTime);
      if (timeInterval > duration) {
        await storage.remove(cacheKey);
        return null;
      }

      return Response(
        requestOptions: options,
        data: decodedCache[dataKey],
        statusCode: 200,
      );
    } catch (_) {
      return null;
    }
  }

  Future<void> _saveResponseToCache(Response response) async {
    final cacheKey = '${response.realUri}';
    final cacheEntry = jsonEncode({
      dataKey: response.data,
      timeStampKey: DateTime.now().toIso8601String(),
    });
    await storage.set(cacheKey, cacheEntry);
  }

  @override
  void onRequest(
    RequestOptions options,
    RequestInterceptorHandler handler,
  ) async {
    final durationResolver = options.headers['X-Cache-Duration'];
    if (durationResolver is Duration) {
      // Skip the cache completely if the duration is zero.
      if (durationResolver == Duration.zero) {
        return handler.next(options);
      }
      final cached = await _getCachedResponse(
        options,
        duration: durationResolver,
      );
      if (cached != null) {
        return handler.resolve(cached);
      }
    }
    handler.next(options);
  }

  @override
  void onResponse(
    Response response,
    ResponseInterceptorHandler handler,
  ) async {
    await _saveResponseToCache(response);
    handler.next(response);
  }

  @override
  void onError(DioException err, ErrorInterceptorHandler handler) {
    handler.next(err);
  }
}
```
## How It Works
Let's break down how this interceptor handles caching:
1. Storage Abstraction

First, we inject a `CacheStorage` interface, making our interceptor independent of the actual storage implementation. This means you can use anything from shared preferences to a local database:

```dart
CacheInterceptor(this.storage);
```

The `CacheStorage` interface might look something like:

```dart
abstract class CacheStorage {
  Future<String?> get(String key);
  Future<void> set(String key, String value);
  Future<void> remove(String key);
}
```
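For unit tests or quick prototypes, a minimal in-memory implementation of this interface is enough. The sketch below is our own example (the class name is not part of the article's code); everything it stores is lost on restart:

```dart
import 'storage_interface.dart';

/// A volatile CacheStorage backed by a plain Map.
/// Useful for unit tests and prototyping; not persistent.
class InMemoryCacheStorage implements CacheStorage {
  final Map<String, String> _store = {};

  @override
  Future<String?> get(String key) async => _store[key];

  @override
  Future<void> set(String key, String value) async {
    _store[key] = value;
  }

  @override
  Future<void> remove(String key) async {
    _store.remove(key);
  }
}
```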
2. Cache Key Generation

For each request, we create a unique key based on the URI (which already includes the query parameters):

```dart
final cacheKey = '${options.uri}';
```

In more complex scenarios, you might want to include the request method, relevant headers, or even the request body in the key generation.
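For example, a richer key could fold in the HTTP method and a header that changes the response. This helper is a sketch of one possible scheme, not part of the interceptor above:

```dart
import 'package:dio/dio.dart';

/// Builds a cache key from the HTTP method, the full URI, and an
/// optional header that affects the response (e.g. Accept-Language).
String buildCacheKey(RequestOptions options) {
  final language = options.headers['Accept-Language'] ?? '';
  return '${options.method} ${options.uri} $language';
}
```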
3. Request Interception

The magic happens in the `onRequest` method:

```dart
void onRequest(RequestOptions options, RequestInterceptorHandler handler) async {
  final durationResolver = options.headers['X-Cache-Duration'];
  if (durationResolver is Duration) {
    // Skip the cache completely if the duration is zero.
    if (durationResolver == Duration.zero) {
      return handler.next(options);
    }
    final cached = await _getCachedResponse(
      options,
      duration: durationResolver,
    );
    if (cached != null) {
      return handler.resolve(cached);
    }
  }
  handler.next(options);
}
```
We check whether the request includes a custom `X-Cache-Duration` header specifying how long cached data should be considered valid. If we find valid cached data within that duration, we immediately resolve the request with the cached response, bypassing the network call entirely.
4. Response Caching

When a response is received, we cache it for future use:

```dart
void onResponse(Response response, ResponseInterceptorHandler handler) async {
  await _saveResponseToCache(response);
  handler.next(response);
}
```
The response is stored with the current timestamp, which we'll use later to determine if the cache is still valid.
5. Cache Validation and Expiry

Before serving cached data, we validate it:

```dart
final timeInterval = DateTime.now().difference(cachedTime);
if (timeInterval > duration) {
  await storage.remove(cacheKey);
  return null;
}
```
If the cache has expired, we remove the old entry and proceed with a network request.
## Using the Interceptor
Integrating this interceptor with your Dio instance is straightforward:
```dart
final dio = Dio();
final cacheStorage = SharedPreferencesCacheStorage(); // Your implementation
dio.interceptors.add(CacheInterceptor(cacheStorage));
```
Now, any API call can specify its cache duration through a custom header:
```dart
// Cache for 1 hour
final response = await dio.get(
  'https://api.example.com/data',
  options: Options(
    headers: {'X-Cache-Duration': Duration(hours: 1)},
  ),
);

// Bypass cache - always fetch fresh data
final freshResponse = await dio.get(
  'https://api.example.com/data',
  options: Options(
    headers: {'X-Cache-Duration': Duration.zero},
  ),
);
```
## Implementing the Storage Backend
You'll need to create a concrete implementation of the `CacheStorage` interface. Here's a simple example using shared preferences:
```dart
import 'package:shared_preferences/shared_preferences.dart';

import 'storage_interface.dart';

class SharedPreferencesCacheStorage implements CacheStorage {
  @override
  Future<String?> get(String key) async {
    final prefs = await SharedPreferences.getInstance();
    return prefs.getString(key);
  }

  @override
  Future<void> set(String key, String value) async {
    final prefs = await SharedPreferences.getInstance();
    await prefs.setString(key, value);
  }

  @override
  Future<void> remove(String key) async {
    final prefs = await SharedPreferences.getInstance();
    await prefs.remove(key);
  }
}
```
For larger responses, you might want to use a solution like Hive or SQLite instead.
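As an illustration, a Hive-backed implementation could look like the sketch below. It assumes the `hive` package is already set up and initialized (e.g. via `Hive.init`), and the box name is our own choice:

```dart
import 'package:hive/hive.dart';

import 'storage_interface.dart';

/// CacheStorage backed by a Hive box of String entries.
class HiveCacheStorage implements CacheStorage {
  static const _boxName = 'api_cache'; // assumed box name

  // openBox returns the already-open box on subsequent calls.
  Future<Box<String>> _box() => Hive.openBox<String>(_boxName);

  @override
  Future<String?> get(String key) async => (await _box()).get(key);

  @override
  Future<void> set(String key, String value) async {
    await (await _box()).put(key, value);
  }

  @override
  Future<void> remove(String key) async {
    await (await _box()).delete(key);
  }
}
```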
## Advanced Features
This basic implementation can be extended with several advanced features:
1. Cache Invalidation Patterns

You could add methods to invalidate cache by pattern:

```dart
Future<void> invalidateCache(Pattern pattern) async {
  // Implementation to remove all cache entries matching the pattern
}
```
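One way to make this concrete is to have the storage expose its stored keys (a `keys()` method, which is our addition and not part of the `CacheStorage` interface above) and match each key against the pattern:

```dart
/// Assumes CacheStorage also exposes:
///   Future<List<String>> keys();
Future<void> invalidateCache(CacheStorage storage, Pattern pattern) async {
  for (final key in await storage.keys()) {
    if (pattern.allMatches(key).isNotEmpty) {
      await storage.remove(key);
    }
  }
}
```

Usage might then be `await invalidateCache(storage, RegExp(r'/users/'));` to drop every cached user endpoint at once.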
2. Cache-Control Headers Support

Support standard HTTP cache-control headers:

```dart
final cacheControl = response.headers['cache-control'];
// Parse and apply cache control directives
```
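A minimal sketch of honoring `max-age` could look like this. Real `Cache-Control` parsing involves more directives (`no-store`, `private`, and so on), so treat this as a starting point:

```dart
/// Extracts max-age (in seconds) from a Cache-Control header value,
/// e.g. 'public, max-age=3600' becomes Duration(seconds: 3600).
Duration? maxAgeFrom(String? cacheControl) {
  if (cacheControl == null) return null;
  final match = RegExp(r'max-age=(\d+)').firstMatch(cacheControl);
  if (match == null) return null;
  return Duration(seconds: int.parse(match.group(1)!));
}
```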
3. Offline-First Strategy

Automatically serve cached responses when offline:

```dart
if (await connectivityService.isOffline()) {
  final cached = await _getCachedResponseIgnoringExpiry(options);
  if (cached != null) {
    return handler.resolve(cached);
  }
}
```
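The `_getCachedResponseIgnoringExpiry` helper is not part of the interceptor we wrote above; a sketch of it could reuse the same storage format but skip the timestamp check entirely:

```dart
// Sketch only: lives inside CacheInterceptor, so `storage` and
// `dataKey` are available and dart:convert is already imported.
Future<Response?> _getCachedResponseIgnoringExpiry(
  RequestOptions options,
) async {
  final cacheEntry = await storage.get('${options.uri}');
  if (cacheEntry == null) return null;
  final Map<String, dynamic> decoded = jsonDecode(cacheEntry);
  return Response(
    requestOptions: options,
    data: decoded[dataKey],
    statusCode: 200,
  );
}
```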
4. Cache Compression

For large responses, consider compressing the cached data:

```dart
final compressed = GZipCodec().encode(utf8.encode(cacheEntry));
// Store compressed data
```
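Both directions, as a sketch using `dart:io`'s `GZipCodec`. Base64-encoding the compressed bytes is one way to keep the stored value a `String`, which our `CacheStorage` interface requires:

```dart
import 'dart:convert';
import 'dart:io';

/// Gzip-compress a cache entry and wrap it in base64 for String storage.
String compressEntry(String cacheEntry) =>
    base64Encode(GZipCodec().encode(utf8.encode(cacheEntry)));

/// Reverse of compressEntry: base64-decode, then gunzip back to JSON.
String decompressEntry(String stored) =>
    utf8.decode(GZipCodec().decode(base64Decode(stored)));
```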
## Common Pitfalls
While implementing caching, be aware of these potential issues:
- **Cache bloat:** Without proper management, your cache can grow indefinitely. Consider adding size limits or periodic cleanup.
- **Security concerns:** Avoid caching sensitive data such as authentication tokens or personal information.
- **Stale data:** Balance freshness against performance. Critical features may need fresher data than less important ones.
- **Memory usage:** Large responses cached in memory can cause out-of-memory errors. Use persistent storage instead.
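For the cache-bloat point, a periodic cleanup pass can drop expired entries. This sketch assumes `CacheStorage` also exposes a `Future<List<String>> keys()` method (our addition) and the JSON `{data, timestamp}` entry format used by the interceptor:

```dart
import 'dart:convert';

import 'storage_interface.dart';

/// Removes every cache entry older than maxAge, or that fails to parse.
Future<void> evictOlderThan(CacheStorage storage, Duration maxAge) async {
  final now = DateTime.now();
  for (final key in await storage.keys()) {
    final entry = await storage.get(key);
    if (entry == null) continue;
    final stamp = DateTime.tryParse(jsonDecode(entry)['timestamp'] ?? '');
    if (stamp == null || now.difference(stamp) > maxAge) {
      await storage.remove(key);
    }
  }
}
```

Calling this on app startup, or on a timer, keeps the cache bounded without any bookkeeping during normal requests.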
## Conclusion
A well-implemented caching strategy can dramatically improve your app's performance and user experience. This custom Dio interceptor provides a flexible foundation that you can adapt to your specific needs.
By intelligently caching API responses, your app can:
- Load faster
- Work better in poor network conditions
- Reduce server load
- Save battery life
- Provide a more responsive UI
Remember that caching is a trade-off between freshness and performance. The right balance depends on your specific application requirements and data characteristics.
What caching strategies have you implemented in your Flutter apps? Share your experiences in the comments below!
In this article, we've explored how to build a custom caching interceptor for the popular Dio HTTP client in Flutter. Our solution allows for flexible cache durations and seamless fallback to cached data when needed.
The code we discussed has been packaged and published for easy use in your projects. You can find it here:
## Package Information
- Package Name: simple_dio_cache_interceptor
- GitHub Repository: https://github.com/gabbygreat/simple_dio_cache_interceptor
## Installation

Add the package to your `pubspec.yaml`:

```yaml
dependencies:
  simple_dio_cache_interceptor: ^latest_version
```
Check out the package for detailed usage instructions and examples. It provides a simple yet powerful solution for all your API caching needs in Flutter applications.
Happy coding!