Redis & Advanced Caching

Introduction to Caching

18 min Lesson 1 of 30

What is Caching?

Caching is the technique of storing frequently accessed data in a fast, temporary storage location (a cache) to reduce the time needed to access that data in the future. Instead of fetching data from a slow source (such as a database or external API) on every request, we retrieve it from the fast cache.

Key Concept: Caching trades memory for speed. By storing data in faster storage layers, we dramatically improve application performance.

Why Use Caching?

  • Performance: Reduce response times from seconds to milliseconds
  • Scalability: Handle more users with the same infrastructure
  • Cost Reduction: Decrease database load and API calls
  • Availability: Serve cached content even if backend services are slow or down
  • User Experience: Faster pages lead to happier users and better engagement

// Without caching - slow database query every time
$products = DB::table('products')->where('featured', true)->get();

// With caching - database query runs only once, then results are served from cache
$products = Cache::remember('featured_products', 3600, function () {
    return DB::table('products')->where('featured', true)->get();
});

Types of Caching

1. Browser Caching

The web browser stores static assets (CSS, JavaScript, images) locally on the user's device, controlled by HTTP headers such as Cache-Control and ETag.

// Setting cache headers in Laravel
return response($content)
    ->header('Cache-Control', 'public, max-age=86400');

2. CDN Caching

Content Delivery Networks cache your content on servers distributed globally, serving users from the nearest location. Ideal for static assets and media files.

3. Application/Server-Side Caching

The application server caches computed results, database queries, and rendered views. This is where Redis and Memcached operate.

4. Database Query Caching

The database itself caches query results and execution plans, for example the MySQL query cache (removed in MySQL 8.0) and PostgreSQL's shared buffers.

5. Object Caching

Caching entire objects or data structures in memory, allowing complex data to be retrieved without reconstruction.
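As an illustration, object caching can be as simple as serializing a structure once on write and rebuilding it on later reads. A minimal plain-PHP sketch (the ProductList class and the $store array standing in for a cache backend are hypothetical):

```php
<?php
// Hypothetical in-memory store standing in for a real cache backend.
$store = [];

class ProductList {
    public array $items;
    public function __construct(array $items) { $this->items = $items; }
}

// Cache the whole object: serialize once on write...
$store['featured_products'] = serialize(new ProductList(['Widget', 'Gadget']));

// ...and rebuild it on read, without reconstructing it from scratch.
$cached = unserialize($store['featured_products']);
echo implode(', ', $cached->items) . PHP_EOL; // prints "Widget, Gadget"
```

In a real application the serialization is typically handled for you by the cache driver; the point is that the entire object round-trips, not just scalar values.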

Cache Hit vs Cache Miss

Understanding these concepts is crucial for cache performance:

  • Cache Hit: Requested data is found in cache - fast retrieval
  • Cache Miss: Requested data is NOT in cache - must fetch from slow source
  • Hit Ratio: Percentage of requests served from cache (higher is better)

Optimization Goal: Maximize cache hits by caching frequently accessed data with appropriate expiration times.

// Example: Cache hit scenario
// 1. User requests product #123
// 2. Check cache for 'product:123'
// 3. Found in cache (CACHE HIT) - return in 0.5ms

// Example: Cache miss scenario
// 1. User requests product #456
// 2. Check cache for 'product:456'
// 3. Not in cache (CACHE MISS)
// 4. Query database - takes 50ms
// 5. Store result in cache for next time
// 6. Return to user
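The hit/miss flow above can be sketched in plain PHP as a cache-aside lookup that counts hits and misses and reports the hit ratio (the $cache array and the slowFetch() helper are hypothetical stand-ins for a real cache and database):

```php
<?php
// Minimal cache-aside lookup with hit/miss counters.
$cache = [];
$stats = ['hits' => 0, 'misses' => 0];

function slowFetch(string $id): string {
    return "product-data-for-$id"; // pretend this is a 50ms database query
}

function getProduct(string $id, array &$cache, array &$stats): string {
    $key = "product:$id";
    if (isset($cache[$key])) {      // cache hit: fast path
        $stats['hits']++;
        return $cache[$key];
    }
    $stats['misses']++;             // cache miss: fetch, then populate
    return $cache[$key] = slowFetch($id);
}

getProduct('123', $cache, $stats); // miss: first request populates the cache
getProduct('123', $cache, $stats); // hit: served from memory
getProduct('456', $cache, $stats); // miss: new key

$ratio = $stats['hits'] / ($stats['hits'] + $stats['misses']);
printf("hit ratio: %.2f\n", $ratio); // 1 hit out of 3 requests, prints 0.33
```

Tracking these counters in production (Redis exposes them as keyspace_hits and keyspace_misses) is how you know whether your caching strategy is actually working.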

Benefits of Caching

  1. Reduced Latency: In-memory access is often 100x or more faster than disk or database reads
  2. Lower Database Load: Fewer queries mean databases can handle more traffic
  3. API Rate Limit Protection: Cache external API responses to avoid hitting limits
  4. Improved Reliability: Serve stale cache if backend fails (graceful degradation)
  5. Cost Savings: Reduce need for expensive database scaling
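Benefit 4, graceful degradation, deserves a concrete shape: keep the last good value around and serve it when the backend throws. A plain-PHP sketch (fetchFresh(), the $backendUp flag, and the 'default-value' fallback are all illustrative):

```php
<?php
// Sketch of "serve stale on failure": remember the last good value and
// fall back to it when the fresh fetch fails.
$staleCache = [];

function fetchFresh(string $key, bool $backendUp): string {
    if (!$backendUp) {
        throw new RuntimeException('backend unavailable');
    }
    return "fresh-$key";
}

function getWithFallback(string $key, bool $backendUp, array &$staleCache): string {
    try {
        // Happy path: refresh the stale copy on every successful fetch.
        return $staleCache[$key] = fetchFresh($key, $backendUp);
    } catch (RuntimeException $e) {
        // Backend is down: degrade gracefully by serving the stale copy.
        return $staleCache[$key] ?? 'default-value';
    }
}

echo getWithFallback('homepage', true, $staleCache) . PHP_EOL;  // fresh-homepage
echo getWithFallback('homepage', false, $staleCache) . PHP_EOL; // fresh-homepage (stale)
```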

When NOT to Cache

Caching isn't always the right solution. Avoid caching in these scenarios:

Don't Cache:
  • Highly Dynamic Data: Data that changes on every request (e.g., real-time stock prices)
  • User-Specific Sensitive Data: Personal information that must be fresh and secure
  • Low-Frequency Access: Data rarely accessed doesn't benefit from caching
  • Data Consistency Critical: When stale data could cause serious issues (financial transactions)
  • Small Performance Gain: If the original operation is already fast (<10ms)

Cache Invalidation Strategies

Famously one of the two hard things in computer science (alongside naming things): knowing when to remove or update cached data.

  • Time-Based (TTL): Cache expires after X seconds/minutes/hours
  • Event-Based: Invalidate cache when underlying data changes
  • Manual: Explicitly clear cache when needed
  • LRU (Least Recently Used): Automatically remove oldest unused items when cache is full

// Time-based expiration
Cache::put('key', $value, 3600); // Expires in 1 hour

// Event-based invalidation
Product::updated(function ($product) {
    Cache::forget('product:' . $product->id);
});
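LRU eviction can be sketched in plain PHP by exploiting the insertion order of arrays: re-inserting a key on every access keeps recently used entries at the end, so the first key is always the eviction candidate. The LruCache class name and capacity of 2 are illustrative:

```php
<?php
// Tiny LRU cache: PHP arrays preserve insertion order, so moving a key
// to the end on each access leaves the least recently used key first.
class LruCache {
    private array $items = [];
    public function __construct(private int $capacity) {}

    public function get(string $key): mixed {
        if (!array_key_exists($key, $this->items)) {
            return null;
        }
        $value = $this->items[$key];
        unset($this->items[$key]);   // move to "most recently used" position
        $this->items[$key] = $value;
        return $value;
    }

    public function put(string $key, mixed $value): void {
        unset($this->items[$key]);
        $this->items[$key] = $value;
        if (count($this->items) > $this->capacity) {
            // Evict the least recently used entry (the first array key).
            unset($this->items[array_key_first($this->items)]);
        }
    }
}

$lru = new LruCache(2);
$lru->put('a', 1);
$lru->put('b', 2);
$lru->get('a');    // touch 'a', so 'b' becomes least recently used
$lru->put('c', 3); // over capacity: evicts 'b'
```

Redis implements an approximation of this behavior via its maxmemory-policy setting (e.g. allkeys-lru), which the next lesson touches on.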

Caching Best Practices

  1. Cache data that is expensive to compute or retrieve
  2. Use appropriate TTL values based on data volatility
  3. Monitor cache hit ratios and adjust strategy accordingly
  4. Have a cache invalidation strategy from day one
  5. Consider cache warming for critical data
  6. Use cache keys that are descriptive and namespaced
  7. Implement fallback mechanisms for cache failures
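Practice 6 (descriptive, namespaced keys) can be captured in a tiny helper; adding a version segment makes bulk invalidation as easy as bumping a number. The cacheKey() function, the "shop" namespace, and the version convention are illustrative, not a fixed API:

```php
<?php
// Sketch of a namespaced, versioned cache-key builder.
function cacheKey(string $namespace, string $entity, string|int $id, int $version = 1): string {
    // e.g. "shop:v1:product:123" - greppable, collision-free, easy to bump
    return sprintf('%s:v%d:%s:%s', $namespace, $version, $entity, $id);
}

echo cacheKey('shop', 'product', 123) . PHP_EOL;   // shop:v1:product:123
echo cacheKey('shop', 'user', 'abc', 2) . PHP_EOL; // shop:v2:user:abc
```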

Exercise: Identify three pages in your application that would benefit from caching. For each, determine:
  • What data should be cached?
  • How long should it be cached?
  • What event should invalidate the cache?
  • What is the fallback if cache fails?

In the next lesson, we'll dive into Redis, one of the most popular and powerful caching solutions available today.