Caching Patterns
Implement cache-aside, write-through, and TTL strategies in Redis for maximum performance.
Caching with Redis
Caching is the process of storing frequently accessed data in a fast, temporary store so you don't need to re-fetch or re-compute it on every request.
Redis is the de facto standard for application-level caching, and knowing the common caching patterns is what lets you apply it correctly.
Cache-Aside (Lazy Loading)
The most common pattern. The application checks the cache first. On a miss, it fetches from the database and populates the cache.
Request → Check Redis → Cache HIT → Return data
        → Cache MISS → Fetch from DB → Store in Redis → Return data

Pros: Only caches what is actually requested. Cache failure doesn't break the app.
Cons: The first request for each key is slow (cache miss). Data can go stale if database updates aren't reflected in the cache.
Write-Through
Write to the cache and the database simultaneously on every update. The cache is always in sync.
Pros: No stale reads. Cache is always warm.
Cons: Every write is slower. Cached data may never be read (wasting memory).
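A write-through update can be sketched with the same node-redis client used elsewhere on this page; the `db` handle and the `product:` key scheme are illustrative, not part of any particular library:

```javascript
// Write-through: update the database and the cache in the same operation,
// so a subsequent read always finds fresh data in the cache.
async function updateProduct(redis, db, productId, data) {
  // 1. Persist to the source of truth first
  const product = await db.products.update({ id: productId }, data);
  // 2. Write the same value to the cache (TTL kept as a safety net)
  await redis.setEx(`product:${productId}`, 600, JSON.stringify(product));
  return product;
}
```

Writing the database first means a cache failure leaves you with correct (if uncached) data, rather than a cached value the database never received.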
Write-Behind (Write-Back)
Write to the cache immediately. Write to the database asynchronously in the background.
Pros: Write latency is near-zero.
Cons: Risk of data loss if Redis goes down before the async DB write.
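A minimal write-behind sketch, assuming an in-process queue flushed on a timer (a production system would use a durable queue or Redis streams; the `db.scores.upsert` call is illustrative):

```javascript
// Write-behind: acknowledge the write as soon as the cache is updated,
// and flush pending writes to the database in the background.
const pendingWrites = new Map();

async function updateScore(redis, userId, score) {
  await redis.set(`score:${userId}`, String(score)); // 1. fast cache write
  pendingWrites.set(userId, score);                  // 2. queue the DB write
}

async function flushPending(db) {
  for (const [userId, score] of pendingWrites) {
    await db.scores.upsert({ id: userId, score }); // async DB write
    pendingWrites.delete(userId);
  }
}
// e.g. setInterval(() => flushPending(db), 1000);
```

The queue here lives in process memory, which is exactly where the data-loss risk mentioned above comes from: anything still in `pendingWrites` when the process dies never reaches the database.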
TTL Strategy
Every cached value should have an expiry time (TTL — Time To Live). This ensures:
- Stale data eventually expires
- Memory doesn't grow unbounded
- Changes in the underlying data propagate over time
Choosing a TTL depends on how often data changes and how stale is acceptable:
- User profile: 5–15 minutes
- Product listings: 1–5 minutes
- Static config: hours or days
- Session tokens: match session duration (e.g. 24 hours)
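With node-redis, the TTL is set at write time via `setEx`. The constants below simply mirror the guidelines above; the key names are illustrative:

```javascript
// TTLs in seconds, matched to how quickly each kind of data goes stale
const TTL = {
  userProfile: 15 * 60,    // 5–15 minutes
  productList: 5 * 60,     // 1–5 minutes
  staticConfig: 24 * 3600, // hours or days
  session: 24 * 3600,      // match session duration
};

async function cacheUserProfile(redis, userId, profile) {
  // Value expires automatically after TTL.userProfile seconds
  await redis.setEx(`user:${userId}`, TTL.userProfile, JSON.stringify(profile));
}
```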
Cache Invalidation
When data changes in your database, you need to invalidate (delete or update) the cached version.
// When user profile is updated
async function updateUserProfile(userId, data) {
await db.users.update({ id: userId }, data);
await redis.del(`user:${userId}`); // Invalidate cache
}

Cache Stampede (Dog-Pile Effect)
When a popular cached item expires, many simultaneous requests all miss the cache and hammer the database.
Solution: Probabilistic Early Recomputation or Mutex Lock
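Probabilistic early recomputation refreshes a hot key *before* it expires: each read recomputes with a probability that grows as the TTL runs down, so one caller usually refreshes the value ahead of a mass expiry. A minimal XFetch-style sketch (the `beta` tunable and parameter names are illustrative):

```javascript
// Decide whether this read should refresh the cached value early.
// ageSeconds: how long the value has been cached
// ttlSeconds: its total TTL
// computeCostSeconds: how long recomputation typically takes
// beta > 1 makes early refresh more aggressive
function shouldRecompute(ageSeconds, ttlSeconds, computeCostSeconds, beta = 1.0) {
  const remaining = ttlSeconds - ageSeconds;
  // Refresh when a randomized multiple of the recompute cost
  // exceeds the time left before expiry
  return computeCostSeconds * beta * -Math.log(Math.random()) >= remaining;
}
```

A caller that gets `true` recomputes and rewrites the key; everyone else keeps serving the still-valid cached value, so the database never sees a thundering herd.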
async function getCachedData(key, fetchFn, ttl) {
const cached = await redis.get(key);
if (cached) return JSON.parse(cached);
// Acquire lock to prevent stampede
const lockKey = `lock:${key}`;
const locked = await redis.set(lockKey, '1', { NX: true, EX: 5 });
if (!locked) {
// Another process is fetching, wait and retry
await new Promise(r => setTimeout(r, 100));
return getCachedData(key, fetchFn, ttl);
}
try {
const data = await fetchFn();
await redis.setEx(key, ttl, JSON.stringify(data));
return data;
} finally {
await redis.del(lockKey);
}
}

Caching LLM Responses (AI Use Case)
LLM API calls are expensive. Caching identical prompts saves cost and latency.
import { createClient } from 'redis';
import Anthropic from '@anthropic-ai/sdk';
const redis = createClient();
await redis.connect();
const anthropic = new Anthropic();
async function cachedCompletion(prompt) {
const cacheKey = `llm:${Buffer.from(prompt).toString('base64')}`;
const cached = await redis.get(cacheKey);
if (cached) {
console.log('Cache HIT');
return JSON.parse(cached);
}
const response = await anthropic.messages.create({
model: 'claude-opus-4-5',
max_tokens: 1024,
messages: [{ role: 'user', content: prompt }]
});
await redis.setEx(cacheKey, 3600, JSON.stringify(response));
return response;
}

Example
// Cache-aside pattern in Node.js
import { createClient } from 'redis';
const redis = createClient({ url: 'redis://localhost:6379' });
await redis.connect();
async function getUser(userId) {
const cacheKey = `user:${userId}`;
// 1. Check cache
const cached = await redis.get(cacheKey);
if (cached) return JSON.parse(cached);
// 2. Fetch from DB on cache miss
const user = await db.users.findById(userId);
// 3. Store in cache with 10 min TTL
await redis.setEx(cacheKey, 600, JSON.stringify(user));
return user;
}