Node.js & Express

Caching Strategies with Redis

50 min · Lesson 22 of 40


Redis (Remote Dictionary Server) is an in-memory data structure store that can be used as a database, cache, message broker, and queue. It's extremely fast, supports rich data structures, and is perfect for implementing caching strategies that dramatically improve application performance.

Why Use Redis for Caching?

Performance Impact: Redis operations typically complete in less than a millisecond. Compared to database queries (10-100ms) or API calls (100-1000ms), Redis can reduce response times by 10-100x, making it essential for high-performance applications.
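As a back-of-the-envelope check on those numbers, the expected request latency with a cache in front of the database is just a weighted average. The latencies and hit rate below are illustrative assumptions, not benchmarks:

```javascript
// Expected per-request latency with a cache in front of the database.
// A hit costs one cache read; a miss costs the cache read plus the DB query.
function averageLatencyMs(hitRate, cacheMs, dbMs) {
  return hitRate * cacheMs + (1 - hitRate) * (cacheMs + dbMs);
}

// Illustrative numbers: 0.5 ms Redis GET, 50 ms DB query, 90% hit rate.
const withCache = averageLatencyMs(0.9, 0.5, 50);
console.log(`~${withCache.toFixed(1)} ms average, ~${(50 / withCache).toFixed(0)}x faster than 50 ms`);
```

Note how the hit rate dominates: at a 99% hit rate the average drops near the raw Redis latency, which is why caching the right keys matters more than micro-tuning the cache itself.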

Redis Benefits:

  • In-memory storage - extremely fast read/write operations
  • Rich data structures (strings, hashes, lists, sets, sorted sets)
  • Built-in TTL (time-to-live) for automatic cache expiration
  • Pub/Sub messaging for real-time features
  • Atomic operations and transactions
  • Persistence options for durability
  • Clustering and replication for scalability

Installing Redis

Install Redis on your system:

# macOS
brew install redis
brew services start redis

# Ubuntu/Debian
sudo apt update
sudo apt install redis-server
sudo systemctl start redis-server

# Verify installation
redis-cli ping
# Should return: PONG

Install Redis client for Node.js:

# Using ioredis (recommended)
npm install ioredis

# Or using redis (official client)
npm install redis

Connecting to Redis from Node.js

Create a Redis client using ioredis:

// config/redis.js
const Redis = require('ioredis');

const redis = new Redis({
  host: process.env.REDIS_HOST || 'localhost',
  port: process.env.REDIS_PORT || 6379,
  password: process.env.REDIS_PASSWORD,
  db: 0,
  retryStrategy: (times) => {
    const delay = Math.min(times * 50, 2000);
    return delay;
  },
  maxRetriesPerRequest: 3
});

redis.on('connect', () => {
  console.log('Redis client connected');
});

redis.on('error', (err) => {
  console.error('Redis error:', err);
});

redis.on('reconnecting', () => {
  console.log('Redis client reconnecting');
});

module.exports = redis;

Using the official redis client:

// config/redis.js
const redis = require('redis');

const client = redis.createClient({
  url: process.env.REDIS_URL || 'redis://localhost:6379',
  socket: {
    reconnectStrategy: (retries) => Math.min(retries * 50, 2000)
  }
});

client.on('error', (err) => console.error('Redis Client Error', err));
client.on('connect', () => console.log('Redis Client Connected'));

(async () => {
  await client.connect();
})();

module.exports = client;

Basic Redis Operations

Common Redis commands in Node.js:

const redis = require('./config/redis');

// String operations (ioredis syntax)
async function basicOperations() {
  // SET - store a value
  await redis.set('key', 'value');

  // SET with expiration (in seconds)
  await redis.setex('key', 60, 'value that expires in 60 seconds');
  // Equivalent: await redis.set('key', 'value', 'EX', 60);

  // GET - retrieve a value
  const value = await redis.get('key');
  console.log(value); // "value"

  // DEL - delete a key
  await redis.del('key');

  // EXISTS - check if key exists
  const exists = await redis.exists('key'); // 1 if exists, 0 if not

  // INCR - increment a number
  await redis.set('counter', 0);
  await redis.incr('counter'); // 1
  await redis.incrby('counter', 5); // 6

  // EXPIRE - set expiration on existing key
  await redis.set('temp', 'data');
  await redis.expire('temp', 300); // expires in 5 minutes

  // TTL - check time to live
  const ttl = await redis.ttl('temp'); // returns seconds until expiration

  // MSET/MGET - multiple operations
  await redis.mset('key1', 'value1', 'key2', 'value2');
  const values = await redis.mget('key1', 'key2'); // ['value1', 'value2']
}

Caching Strategies

1. Cache-Aside (Lazy Loading):

The most common caching pattern: application code manages both the cache and the database, reading the cache first and filling it on a miss:

// Cache-aside pattern
async function getUser(userId) {
  const cacheKey = `user:${userId}`;

  // Try to get from cache first
  const cached = await redis.get(cacheKey);
  if (cached) {
    console.log('Cache hit!');
    return JSON.parse(cached);
  }

  // Cache miss - fetch from database
  console.log('Cache miss - fetching from database');
  const user = await db.users.findById(userId);

  if (user) {
    // Store in cache for 1 hour
    await redis.setex(cacheKey, 3600, JSON.stringify(user));
  }

  return user;
}

// Usage
const user = await getUser(123);
Cache-Aside Pros: Simple to implement, cache only what's needed, resilient to cache failures (falls back to database).
Cons: Initial requests are slow (cache miss), cache can become stale if not invalidated properly.

2. Write-Through Cache:

Data is written to cache and database simultaneously:

// Write-through caching
async function updateUser(userId, updates) {
  const cacheKey = `user:${userId}`;

  // Update database
  const user = await db.users.findByIdAndUpdate(userId, updates, { new: true });

  // Update cache immediately
  await redis.setex(cacheKey, 3600, JSON.stringify(user));

  return user;
}

// Always consistent but slower writes
const updatedUser = await updateUser(123, { name: 'John Doe' });

3. Write-Behind (Write-Back) Cache:

Data is written to cache first, then asynchronously written to database:

// Write-behind caching
const pendingWrites = new Map();

async function updateUserWriteBehind(userId, updates) {
  const cacheKey = `user:${userId}`;

  // Update cache immediately
  const user = { id: userId, ...updates, updatedAt: Date.now() };
  await redis.setex(cacheKey, 3600, JSON.stringify(user));

  // Queue database write
  pendingWrites.set(userId, { user, timestamp: Date.now() });

  return user;
}

// Background process to flush to database
setInterval(async () => {
  for (const [userId, data] of pendingWrites.entries()) {
    try {
      await db.users.findByIdAndUpdate(userId, data.user);
      pendingWrites.delete(userId);
    } catch (error) {
      console.error('Failed to write to database:', error);
    }
  }
}, 5000); // Flush every 5 seconds
Warning: Write-behind caching is fast but risky. If the cache crashes before data is written to the database, you lose data. Use only when you can tolerate data loss or implement proper persistence.

Advanced Caching Patterns

Cache Invalidation:

// Cache invalidation utilities
class CacheManager {
  constructor(redis) {
    this.redis = redis;
  }

  // Invalidate specific key
  async invalidate(key) {
    await this.redis.del(key);
  }

  // Invalidate keys by pattern
  // Note: KEYS is O(N) and blocks the server; prefer SCAN in production
  async invalidatePattern(pattern) {
    const keys = await this.redis.keys(pattern);
    if (keys.length > 0) {
      await this.redis.del(...keys);
    }
  }

  // Invalidate multiple related keys
  async invalidateUser(userId) {
    await this.invalidatePattern(`user:${userId}:*`);
    await this.invalidate(`user:${userId}`);
  }

  // Touch (update TTL without changing value)
  async touch(key, ttl = 3600) {
    await this.redis.expire(key, ttl);
  }
}

// Usage
const cacheManager = new CacheManager(redis);

// When user updates profile
await db.users.update(userId, updates);
await cacheManager.invalidateUser(userId);

// When user posts something
await db.posts.create(postData);
await cacheManager.invalidate(`user:${userId}:posts`);

Cache Stampede Prevention:

Prevent multiple simultaneous requests from hitting the database when cache expires:

// Prevent cache stampede using a distributed lock
async function getWithLock(key, fetchFunction, ttl = 3600) {
  // Try cache first
  const cached = await redis.get(key);
  if (cached) {
    return JSON.parse(cached);
  }

  // Acquire lock
  const lockKey = `lock:${key}`;
  const lockId = Math.random().toString(36);

  // Try to set lock (NX = only if not exists, EX = expiration)
  const acquired = await redis.set(lockKey, lockId, 'NX', 'EX', 10);

  if (acquired) {
    try {
      // We have the lock - fetch data
      const data = await fetchFunction();
      await redis.setex(key, ttl, JSON.stringify(data));
      return data;
    } finally {
      // Release lock (only if we still own it)
      const script = `
        if redis.call("get", KEYS[1]) == ARGV[1] then
          return redis.call("del", KEYS[1])
        else
          return 0
        end
      `;
      await redis.eval(script, 1, lockKey, lockId);
    }
  } else {
    // Lock held by another request - wait and retry
    await new Promise(resolve => setTimeout(resolve, 50));
    return getWithLock(key, fetchFunction, ttl);
  }
}

// Usage
const user = await getWithLock(
  `user:${userId}`,
  () => db.users.findById(userId),
  3600
);

Redis Data Structures for Caching

Hashes - Perfect for storing objects:

// Store user as hash
async function cacheUserAsHash(userId, user) {
  const key = `user:${userId}`;

  // Store entire object
  await redis.hset(key, {
    id: user.id,
    name: user.name,
    email: user.email,
    role: user.role
  });

  // Set expiration
  await redis.expire(key, 3600);
}

// Get specific fields
async function getUserFields(userId, fields) {
  const key = `user:${userId}`;
  const values = await redis.hmget(key, ...fields);
  return values;
}

// Update single field
async function updateUserField(userId, field, value) {
  const key = `user:${userId}`;
  await redis.hset(key, field, value);
}

// Usage
await cacheUserAsHash(123, {
  id: 123,
  name: 'John',
  email: 'john@example.com',
  role: 'admin'
});

const userName = await redis.hget('user:123', 'name');
const [name, email] = await getUserFields(123, ['name', 'email']);

Lists - For queues and recent items:

// Cache recent posts
async function addRecentPost(userId, post) {
  const key = `user:${userId}:recent_posts`;

  // Add to start of list
  await redis.lpush(key, JSON.stringify(post));

  // Keep only last 10 posts
  await redis.ltrim(key, 0, 9);

  // Set expiration
  await redis.expire(key, 3600);
}

// Get recent posts
async function getRecentPosts(userId, limit = 10) {
  const key = `user:${userId}:recent_posts`;
  const posts = await redis.lrange(key, 0, limit - 1);
  return posts.map(p => JSON.parse(p));
}

Sets - For unique collections:

// Cache user's followers
async function cacheFollowers(userId, followerIds) {
  const key = `user:${userId}:followers`;
  await redis.sadd(key, ...followerIds);
  await redis.expire(key, 3600);
}

// Check if targetId is among userId's followers
async function isFollowing(userId, targetId) {
  const key = `user:${userId}:followers`;
  return await redis.sismember(key, targetId);
}

// Get mutual followers
async function getMutualFollowers(userId1, userId2) {
  const key1 = `user:${userId1}:followers`;
  const key2 = `user:${userId2}:followers`;
  return await redis.sinter(key1, key2);
}

Sorted Sets - For rankings and leaderboards:

// Leaderboard caching
async function updateScore(userId, score) {
  await redis.zadd('leaderboard', score, userId);
}

// Get top 10 users
async function getTopUsers(limit = 10) {
  const results = await redis.zrevrange('leaderboard', 0, limit - 1, 'WITHSCORES');

  // Convert flat [member, score, member, score, ...] reply to objects
  const users = [];
  for (let i = 0; i < results.length; i += 2) {
    users.push({
      userId: results[i],
      score: parseInt(results[i + 1])
    });
  }
  return users;
}

// Get user rank (1-based)
async function getUserRank(userId) {
  const rank = await redis.zrevrank('leaderboard', userId);
  return rank !== null ? rank + 1 : null;
}

Session Management with Redis

Store Express sessions in Redis for scalability:

// Install dependencies
// npm install express-session connect-redis

const session = require('express-session');
const RedisStore = require('connect-redis').default;
const redis = require('./config/redis');

app.use(session({
  store: new RedisStore({ client: redis }),
  secret: process.env.SESSION_SECRET,
  resave: false,
  saveUninitialized: false,
  cookie: {
    secure: process.env.NODE_ENV === 'production',
    httpOnly: true,
    maxAge: 1000 * 60 * 60 * 24 // 24 hours
  }
}));

// Use sessions
app.post('/login', async (req, res) => {
  const user = await authenticateUser(req.body);
  req.session.userId = user.id;
  req.session.role = user.role;
  res.json({ success: true });
});

app.get('/profile', (req, res) => {
  if (!req.session.userId) {
    return res.status(401).json({ error: 'Not authenticated' });
  }
  res.json({ userId: req.session.userId });
});

app.post('/logout', (req, res) => {
  req.session.destroy((err) => {
    if (err) {
      return res.status(500).json({ error: 'Logout failed' });
    }
    res.json({ success: true });
  });
});

Redis Pub/Sub

Use Redis for real-time messaging between services:

// Publisher
const Redis = require('ioredis');
const publisher = new Redis();

async function publishEvent(channel, data) {
  await publisher.publish(channel, JSON.stringify(data));
}

// Publish events
await publishEvent('user:created', { userId: 123, name: 'John' });
await publishEvent('order:completed', { orderId: 456, total: 99.99 });

// Subscriber - a connection in subscriber mode can only run subscribe
// commands, so use a separate connection from the publisher
const subscriber = new Redis();

subscriber.subscribe('user:created', 'order:completed', (err) => {
  if (err) {
    console.error('Subscribe error:', err);
  }
});

subscriber.on('message', (channel, message) => {
  console.log(`Received message on ${channel}:`, JSON.parse(message));

  // Handle events
  if (channel === 'user:created') {
    // Send welcome email, create user profile, etc.
  } else if (channel === 'order:completed') {
    // Send confirmation, update inventory, etc.
  }
});

Exercise: Build a Rate-Limited API with Redis

  1. Create an Express API with multiple endpoints
  2. Implement rate limiting using Redis (e.g., 100 requests per hour per IP)
  3. Cache API responses for GET requests with appropriate TTLs
  4. Implement cache invalidation when data is updated via POST/PUT
  5. Add Redis health check endpoint
  6. Track API usage statistics in Redis sorted sets
  7. Create a dashboard showing cached vs database hits
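For step 2, a fixed-window counter is the simplest approach. The sketch below assumes an ioredis-style client exposing `incr` and `expire`; a tiny in-memory stub (not real Redis) stands in so the logic can be exercised without a running server:

```javascript
// Fixed-window rate limiting: count requests per client per time window.
// Any object with Redis-style incr/expire works; this in-memory stub is a
// stand-in for a real ioredis client so the example is self-contained.
function makeMemoryRedisStub() {
  const store = new Map();
  return {
    async incr(key) {
      const next = (store.get(key) || 0) + 1;
      store.set(key, next);
      return next;
    },
    async expire(_key, _seconds) {
      return 1; // TTL is a no-op in this stub; real Redis evicts the window
    }
  };
}

// Express-style middleware: allow `limit` requests per `windowSeconds` per IP.
function rateLimiter(redis, { limit = 100, windowSeconds = 3600 } = {}) {
  return async (req, res, next) => {
    const windowId = Math.floor(Date.now() / 1000 / windowSeconds);
    const key = `ratelimit:${req.ip}:${windowId}`;

    const count = await redis.incr(key);
    if (count === 1) {
      // First request in this window - let Redis clean the counter up
      await redis.expire(key, windowSeconds);
    }

    if (count > limit) {
      return res.status(429).json({ error: 'Too many requests' });
    }
    next();
  };
}
```

Against a real connection you would wire it up with something like `app.use(rateLimiter(redis, { limit: 100, windowSeconds: 3600 }))`. A fixed window allows a burst at the window boundary; a sliding window over a sorted set (as in exercise step 6) avoids that at the cost of more memory.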

Best Practices

Redis Best Practices:
  • Use appropriate TTLs - not too short (defeats caching) or too long (stale data)
  • Namespace your keys: user:123:profile instead of 123profile
  • Use Redis data structures appropriately (hashes for objects, sets for collections)
  • Implement cache warming for frequently accessed data
  • Monitor memory usage and implement eviction policies
  • Use pipelining for multiple operations to reduce network round trips
  • Enable persistence (RDB/AOF) for important cached data
  • Set up replication for high availability
// Pipeline multiple operations
const pipeline = redis.pipeline();
pipeline.set('key1', 'value1');
pipeline.set('key2', 'value2');
pipeline.incr('counter');
pipeline.expire('key1', 60);
const results = await pipeline.exec();

// Transaction (atomic operations)
const result = await redis
  .multi()
  .incr('counter')
  .expire('counter', 60)
  .exec();
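Cache warming from the list above can be as simple as preloading hot keys at startup, pipelined so all writes go out in one round trip. This is a sketch under assumptions: `fetchPopularProducts` is a hypothetical data-layer call, and the client is assumed to be ioredis-compatible (`pipeline`/`set`):

```javascript
// Cache warming sketch: preload frequently accessed data at startup so the
// first real requests are already cache hits. `fetchPopularProducts` is a
// hypothetical data-layer function; `redis` is an ioredis-style client.
async function warmCache(redis, fetchPopularProducts, ttlSeconds = 3600) {
  const products = await fetchPopularProducts();

  const pipeline = redis.pipeline(); // batch all SETs into one round trip
  for (const product of products) {
    pipeline.set(`product:${product.id}`, JSON.stringify(product), 'EX', ttlSeconds);
  }
  await pipeline.exec();

  return products.length; // number of keys warmed
}

// Call once on boot, e.g.:
// warmCache(redis, () => db.products.findPopular(), 3600);
```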

Redis is a powerful tool that can dramatically improve your application's performance. Master these caching strategies and you'll build faster, more scalable applications.