Caching Patterns and Strategies
In this lesson, we'll explore different caching patterns and strategies used in modern applications. Choosing the right caching pattern can significantly impact your application's performance, consistency, and complexity.
Cache-Aside (Lazy Loading)
The most common caching pattern, in which the application itself is responsible for reading from and writing to the cache:
async function getUser(userId) {
  // Try to get from cache first
  const cacheKey = `user:${userId}`;
  let user = await redis.get(cacheKey);

  if (user) {
    // Cache hit
    console.log('Cache hit');
    return JSON.parse(user);
  }

  // Cache miss - fetch from database
  console.log('Cache miss');
  user = await db.users.findById(userId);

  if (user) {
    // Store in cache with a 1 hour expiration
    await redis.setex(cacheKey, 3600, JSON.stringify(user));
  }

  return user;
}
Advantages:
- Only requested data is cached (efficient memory usage)
- Cache failures don't prevent the application from working
- Simple to implement
Disadvantages:
- Cache miss penalty (extra latency on first request)
- Potential for stale data if database is updated directly
- Requires manual cache invalidation
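One common way to handle that invalidation is delete-on-write: when the database is updated, the cached entry is deleted so the next read repopulates it through the normal cache-aside path. A minimal sketch using in-memory Maps as stand-ins for Redis and the database (the names here are illustrative, not from the code above):

```javascript
// In-memory stand-ins for the cache and database (illustration only)
const cache = new Map();
const db = { users: new Map([[1, { id: 1, name: 'Ada' }]]) };

async function updateUser(userId, updates) {
  // Update the source of truth first
  const user = { ...db.users.get(userId), ...updates };
  db.users.set(userId, user);

  // Delete rather than overwrite the cached entry; the next
  // cache-aside read will repopulate it from the database
  cache.delete(`user:${userId}`);
  return user;
}
```

Deleting instead of overwriting avoids caching a value that a concurrent writer might immediately make stale.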
Write-Through Caching
Data is written to the cache and database simultaneously:
async function updateUser(userId, updates) {
  // Update database first
  const user = await db.users.update(userId, updates);

  // Then update cache
  const cacheKey = `user:${userId}`;
  await redis.setex(cacheKey, 3600, JSON.stringify(user));

  return user;
}
async function createUser(userData) {
  // Create in database
  const user = await db.users.create(userData);

  // Immediately cache it
  const cacheKey = `user:${user.id}`;
  await redis.setex(cacheKey, 3600, JSON.stringify(user));

  return user;
}
Advantages:
- Cache is always consistent with the database
- No cache miss penalty for recently written data
- Read-heavy applications benefit greatly
Disadvantages:
- Write latency (data written to two places)
- Unnecessary caching of data that might never be read
- More complex error handling
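That error handling usually comes down to deciding which write is authoritative. One common choice is to treat the database write as mandatory and the cache write as best effort: a cache failure is logged and swallowed rather than failing the request. A hypothetical helper sketching that policy (not part of the code above):

```javascript
// Write-through where the cache write is best effort: if the cache
// is down, the request still succeeds using the database result
async function writeThrough(dbWrite, cacheWrite) {
  const result = await dbWrite();   // authoritative; errors propagate
  try {
    await cacheWrite(result);       // best effort
  } catch (err) {
    // Cache may now hold a stale entry until its TTL expires
    console.error('Cache write failed:', err.message);
  }
  return result;
}
```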
Write-Behind (Write-Back) Caching
Data is written to the cache immediately and persisted to the database asynchronously:
async function updateUser(userId, updates) {
  const cacheKey = `user:${userId}`;

  // Get current user data
  let user = await redis.get(cacheKey);
  user = user ? JSON.parse(user) : await db.users.findById(userId);

  // Update user object
  user = { ...user, ...updates };

  // Write to cache immediately
  await redis.setex(cacheKey, 3600, JSON.stringify(user));

  // Queue database write for later (async)
  await writeQueue.add({
    operation: 'update',
    table: 'users',
    id: userId,
    data: updates
  });

  return user;
}

// Background worker processes the write queue
writeQueue.process(async (job) => {
  const { operation, table, id, data } = job.data;
  await db[table][operation](id, data);
});
Advantages:
- Very fast writes (cache-speed)
- Reduces database load significantly
- Can batch multiple writes together
Disadvantages:
- Risk of data loss if cache fails before database write
- Complex to implement correctly
- Eventual consistency issues
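The batching advantage can be made concrete: a worker draining the queue can coalesce pending updates so multiple writes to the same row collapse into a single database write. A sketch, where the job shape mirrors the queue example above but the coalescing logic is an assumption of mine:

```javascript
// Merge queued update jobs per id so each row is written once,
// with later updates overriding earlier ones for the same field
function coalesceJobs(jobs) {
  const merged = new Map();
  for (const job of jobs) {
    const prev = merged.get(job.id) || {};
    merged.set(job.id, { ...prev, ...job.data });
  }
  return [...merged.entries()].map(([id, data]) => ({ id, data }));
}
```

A worker would then call something like db.users.update(id, data) once per merged entry instead of once per queued job.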
Read-Through Caching
Cache acts as a proxy that automatically loads data from the database when needed:
class CacheManager {
  constructor(redis, db) {
    this.redis = redis;
    this.db = db;
  }

  async get(key, loader, ttl = 3600) {
    // Try cache first
    let data = await this.redis.get(key);
    if (data) {
      return JSON.parse(data);
    }

    // Cache miss - use the loader function
    data = await loader();
    if (data) {
      // Store in cache
      await this.redis.setex(key, ttl, JSON.stringify(data));
    }

    return data;
  }
}
// Usage
const cache = new CacheManager(redis, db);
const user = await cache.get(
  `user:${userId}`,
  () => db.users.findById(userId),
  3600
);
Advantages:
- Abstracts cache logic from application code
- Consistent API for cached and non-cached data
- Easier to maintain and test
Disadvantages:
- First access to a key still pays the cache miss penalty
- Adds a layer of abstraction to debug through
Refresh-Ahead Caching
Proactively refresh cache before it expires for frequently accessed data:
async function getWithRefreshAhead(key, loader, ttl = 3600) {
  const data = await redis.get(key);

  if (data) {
    // Check TTL - if less than 25% remains, refresh
    const remainingTTL = await redis.ttl(key);
    const refreshThreshold = ttl * 0.25;

    if (remainingTTL < refreshThreshold) {
      // Refresh asynchronously (don't wait)
      loader().then(newData => {
        redis.setex(key, ttl, JSON.stringify(newData));
      }).catch(err => {
        console.error('Refresh failed:', err);
      });
    }

    return JSON.parse(data);
  }

  // Cache miss - load synchronously
  const newData = await loader();
  await redis.setex(key, ttl, JSON.stringify(newData));
  return newData;
}
Advantages:
- Reduces cache miss penalty for hot data
- Users always get fast responses
- Keeps frequently accessed data fresh
Disadvantages:
- Increased complexity
- May refresh data unnecessarily
- Requires accurate access pattern prediction
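Access-pattern prediction can start as simply as counting reads per key within a time window and only refreshing keys that cross a threshold, letting cold keys expire naturally. A toy sketch; the class name and threshold are assumptions, not part of any Redis API:

```javascript
// Track read counts per key; only keys read at least `threshold`
// times in the current window are considered hot enough to refresh
class HotKeyTracker {
  constructor(threshold = 5) {
    this.threshold = threshold;
    this.counts = new Map();
  }
  recordRead(key) {
    this.counts.set(key, (this.counts.get(key) || 0) + 1);
  }
  isHot(key) {
    return (this.counts.get(key) || 0) >= this.threshold;
  }
  resetWindow() {
    this.counts.clear();
  }
}
```

The refresh-ahead reader would call recordRead on every hit and only schedule the asynchronous refresh when isHot returns true.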
Choosing the Right Pattern
- Cache-Aside: General-purpose, read-heavy workloads, when data changes infrequently
- Write-Through: When data consistency is critical, read-after-write scenarios
- Write-Behind: High-write throughput requirements, can tolerate eventual consistency
- Read-Through: When you want to abstract caching logic, consistent API
- Refresh-Ahead: Predictable hot data access patterns, zero-latency requirements
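The guidelines above can be folded into a toy decision helper; the flag names and their ordering are one reasonable reading of the list, not a rule:

```javascript
// Rough mapping from workload traits to the patterns described above
function pickPattern({ writeHeavy = false, needsConsistency = false, hotDataPredictable = false } = {}) {
  if (writeHeavy) return 'write-behind';       // tolerate eventual consistency
  if (needsConsistency) return 'write-through'; // read-after-write correctness
  if (hotDataPredictable) return 'refresh-ahead';
  return 'cache-aside';                         // general-purpose default
}
```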
Hybrid Pattern Example
Combining multiple patterns for optimal performance:
class HybridCache {
  constructor(redis, db) {
    this.redis = redis;
    this.db = db;
  }

  // Cache-aside for reads
  async read(key, loader, ttl = 3600) {
    let data = await this.redis.get(key);
    if (data) return JSON.parse(data);

    data = await loader();
    if (data) {
      await this.redis.setex(key, ttl, JSON.stringify(data));
    }
    return data;
  }

  // Write-through for updates
  async write(key, saver, ttl = 3600) {
    const data = await saver();
    await this.redis.setex(key, ttl, JSON.stringify(data));
    return data;
  }

  // Manual invalidation
  async invalidate(key) {
    await this.redis.del(key);
  }

  // Pattern invalidation (wildcards)
  // Note: KEYS walks the entire keyspace and blocks Redis while it runs;
  // prefer iterating with SCAN in production deployments
  async invalidatePattern(pattern) {
    const keys = await this.redis.keys(pattern);
    if (keys.length > 0) {
      await this.redis.del(...keys);
    }
  }
}