Response Caching
ELI5 — The Vibe Check
Response caching is saving API responses so you don't have to recalculate them every time. If 1,000 people ask for the same product page, why hit the database 1,000 times? Cache the response and serve it instantly. It's leftovers for your API.
Real Talk
Response caching stores computed HTTP responses to serve subsequent identical requests without re-executing business logic or database queries. It can occur at multiple layers: application memory, Redis/Memcached, CDN, or browser cache. Cache-Control headers, ETags, and invalidation strategies determine freshness and consistency.
Show Me The Code
from fastapi_cache import FastAPICache
from fastapi_cache.backends.inmemory import InMemoryBackend
from fastapi_cache.decorator import cache

FastAPICache.init(InMemoryBackend())  # run once at app startup

@app.get('/products')
@cache(expire=300)  # cache the response for 5 minutes
async def list_products():
    return await db.fetch_all('SELECT * FROM products')
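The Cache-Control headers mentioned above are what let browsers and CDNs decide whether a stored response is still fresh. A minimal sketch of that freshness check, handling only the max-age directive (real parsers also handle no-store, no-cache, s-maxage, and more):

```python
import time

def is_fresh(cached_at: float, cache_control: str) -> bool:
    # Look for a max-age directive and compare it to the response's age.
    for directive in cache_control.split(','):
        directive = directive.strip()
        if directive.startswith('max-age='):
            max_age = int(directive.split('=', 1)[1])
            return (time.time() - cached_at) < max_age
    return False  # no max-age: treat as not cacheable in this sketch

# A response cached 100 seconds ago with max-age=300 is still fresh:
print(is_fresh(time.time() - 100, 'public, max-age=300'))  # True
```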
When You'll Hear This
"Add response caching to the product list endpoint — it barely changes." / "Cache invalidation is the other hard problem in computer science."
Related Terms
Caching
Caching is saving the result of a slow operation so you can reuse it quickly next time.
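The simplest form of this in Python is memoization with the standard-library lru_cache decorator, which stores each result keyed by the function's arguments:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def slow_square(n: int) -> int:
    # Imagine this takes seconds to compute; the decorator stores
    # each result so repeat calls with the same argument are instant.
    return n * n

slow_square(12)  # computed and stored
slow_square(12)  # served from the cache
print(slow_square.cache_info().hits)  # 1
```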
CDN (Content Delivery Network)
A CDN is a network of servers spread around the world that store copies of your files, so each user downloads them from a server physically close to them instead of from your origin.
ETag
An ETag is like a fingerprint for a response. The client sends it back on the next request (in an If-None-Match header), and if the fingerprint still matches, the server replies 304 Not Modified instead of resending the whole body.
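A minimal sketch of that fingerprint-and-compare flow, assuming the ETag is a hash of the response body (servers are free to compute it however they like):

```python
import hashlib

def make_etag(body: bytes) -> str:
    # Hash the response body into a short, quoted fingerprint.
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

def check_etag(if_none_match: str, body: bytes) -> int:
    # Client's cached fingerprint still matches -> 304 Not Modified
    # (empty response); otherwise -> 200 with the full body.
    return 304 if if_none_match == make_etag(body) else 200

body = b'{"products": []}'
tag = make_etag(body)
print(check_etag(tag, body))        # 304: client's copy is current
print(check_etag('"stale"', body))  # 200: resend the full response
```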
Redis
Redis is an incredibly fast database that lives entirely in memory (RAM). It's used as a cache, a session store, and a message queue.
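The usual pattern with Redis is cache-aside: check the cache first, and only hit the database on a miss. A sketch of that pattern with a plain dict standing in for Redis (real code would use a redis.Redis client, and the fake query here is purely illustrative):

```python
import time

store = {}  # stand-in for Redis; entries are (value, expiry_timestamp)

def cache_get(key):
    entry = store.get(key)
    if entry and entry[1] > time.time():
        return entry[0]
    return None  # missing or expired

def cache_set(key, value, ttl=300):
    # SETEX-style write: the value expires after ttl seconds.
    store[key] = (value, time.time() + ttl)

def get_product(pid):
    cached = cache_get(f'product:{pid}')
    if cached is not None:
        return cached             # cache hit: skip the database
    product = {'id': pid}         # pretend this came from a slow query
    cache_set(f'product:{pid}', product)
    return product
```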