Caching Strategies¶
Caching stores frequently accessed data in fast storage (usually RAM) to reduce latency and database load.
Key patterns: Cache-Aside (app manages the cache), Write-Through (write to cache and DB together), Write-Behind (write to cache, flush to DB asynchronously), Read-Through (cache loads from the DB on a miss).
Common eviction policies: LRU (Least Recently Used), LFU (Least Frequently Used), TTL (Time-To-Live).
Key Concepts¶
Deep Dive: Caching Patterns
1. Cache-Aside (Lazy Loading) — most common
Read: App → Cache → Hit? Return
→ Miss? App → DB → Store in Cache → Return
Write: App → DB → Invalidate Cache
```java
public User getUser(Long id) {
    User cached = cache.get("user:" + id);
    if (cached != null) return cached;            // cache hit
    User user = db.findById(id);                  // cache miss: load from DB
    if (user != null) {                           // avoid caching a miss as null
        cache.put("user:" + id, user, Duration.ofMinutes(30));
    }
    return user;
}
```
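The write path from the diagram above (update the DB, then invalidate the cache entry) can be sketched the same way. This is a minimal, self-contained illustration: plain `Map`s stand in for the real DB and cache client, and `CacheAsideStore` is a hypothetical name.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Cache-aside with delete-on-write: the DB is the source of truth,
// and the cache entry is invalidated so the next read repopulates it.
class CacheAsideStore {
    private final Map<Long, String> db = new ConcurrentHashMap<>();    // stand-in for the DB
    private final Map<String, String> cache = new ConcurrentHashMap<>(); // stand-in for Redis etc.

    public String getUser(Long id) {
        String key = "user:" + id;
        String cached = cache.get(key);
        if (cached != null) return cached;       // cache hit
        String user = db.get(id);                // cache miss: load from DB
        if (user != null) cache.put(key, user);  // only cache real values
        return user;
    }

    public void updateUser(Long id, String name) {
        db.put(id, name);            // 1. write to the source of truth
        cache.remove("user:" + id);  // 2. invalidate; next read repopulates
    }
}
```

Deleting rather than updating the cached value on write is what avoids the double-write inconsistency discussed below.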
2. Write-Through — write to cache AND DB synchronously
✅ Cache always consistent. ❌ Higher write latency.
3. Write-Behind (Write-Back) — write to cache, async to DB
✅ Fast writes. ❌ Risk of data loss if the cache crashes before the DB write.
4. Read-Through — cache itself loads from DB
✅ Simpler app code. ❌ Cache library must support it.
Deep Dive: Eviction Policies
| Policy | Description | Best For |
|---|---|---|
| LRU | Remove least recently used | General purpose |
| LFU | Remove least frequently used | Popular items stay |
| FIFO | Remove oldest entry | Simple use cases |
| TTL | Remove after time expires | Session data, API responses |
| Random | Remove random entry | Uniform access patterns |
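LRU from the table above fits in a few lines of standard-library Java: `LinkedHashMap` in access order drops the least recently used entry once capacity is exceeded. A sketch, not production code (no thread safety):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU eviction via LinkedHashMap's access-order mode.
class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true); // accessOrder = true → iteration order is LRU → MRU
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently used entry when full
    }
}
```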
Redis eviction policies:
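Redis selects among these via the `maxmemory-policy` setting in `redis.conf` (the `volatile-*` variants only consider keys that have a TTL set); the memory limit here is illustrative:

```conf
maxmemory 256mb
maxmemory-policy allkeys-lru
# alternatives: noeviction (default), volatile-lru, allkeys-lfu,
# volatile-lfu, volatile-ttl, allkeys-random, volatile-random
```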
Deep Dive: Cache Invalidation Problems
"There are only two hard things in CS: cache invalidation and naming things." — Phil Karlton
Stale data:
1. Cache has User{name: "John"}
2. DB updated to User{name: "Jane"}
3. Cache still returns "John" until TTL expires or invalidated
Cache stampede (thundering herd): a hot key expires and many concurrent requests all miss at once, hitting the database with the same query simultaneously.
Solutions:
- Lock: only one request fetches; the others wait for the result
- Stale-while-revalidate: return stale data, refresh asynchronously
- Probabilistic early expiration: each request has a small chance to refresh before expiry
Double-write inconsistency: updating the DB and the cache as two separate writes can leave them out of sync if one write fails or they interleave.
Solution: Delete the cache entry instead of updating it, and let the next read repopulate.
Deep Dive: Multi-Level Caching
Request → L1 (In-Process Cache, e.g. Caffeine)
→ L2 (Distributed Cache, e.g. Redis)
→ L3 (CDN, e.g. CloudFront)
→ Database
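The lookup order above (L1 → L2 → source) can be sketched as a small read-through chain. Plain `Map`s stand in for Caffeine and Redis, and `TwoLevelCache` is a hypothetical name; a real setup would also give each level its own TTL (shorter in L1 than L2).

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Multi-level read path: check L1 (in-process), then L2 (distributed),
// then fall back to the loader (e.g. a DB query), populating caches
// on the way back.
class TwoLevelCache<K, V> {
    private final Map<K, V> l1 = new ConcurrentHashMap<>(); // stand-in for Caffeine
    private final Map<K, V> l2 = new ConcurrentHashMap<>(); // stand-in for Redis
    private final Function<K, V> loader;                    // stand-in for the DB

    public TwoLevelCache(Function<K, V> loader) {
        this.loader = loader;
    }

    public V get(K key) {
        V v = l1.get(key);
        if (v != null) return v;        // L1 hit
        v = l2.get(key);
        if (v == null) {
            v = loader.apply(key);      // miss everywhere: load from source
            if (v == null) return null;
            l2.put(key, v);             // populate L2 on the way back
        }
        l1.put(key, v);                 // promote into L1
        return v;
    }
}
```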
Spring Boot caching:
```java
@EnableCaching
@Configuration
public class CacheConfig { }

@Service
public class UserService {

    @Cacheable(value = "users", key = "#id")
    public User findById(Long id) {
        return userRepo.findById(id).orElseThrow();
    }

    @CacheEvict(value = "users", key = "#id")
    public void updateUser(Long id, UserRequest req) { ... }

    @CacheEvict(value = "users", allEntries = true)
    public void clearAll() { }
}
```
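With Caffeine on the classpath, Spring Boot can size these caches and give them a TTL through configuration properties; a minimal sketch (the size and TTL values are illustrative):

```properties
spring.cache.type=caffeine
spring.cache.cache-names=users
spring.cache.caffeine.spec=maximumSize=500,expireAfterWrite=30m
```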
Common Interview Questions
- What are the different caching strategies?
- What is cache-aside vs write-through?
- How do you handle cache invalidation?
- What is a cache stampede? How do you prevent it?
- What eviction policies do you know?
- When would you use Redis vs an in-process cache?
- How does @Cacheable work in Spring?
- What is a CDN? Is it a form of caching?