This is caching!
Caching is storing data temporarily so future requests for that data are faster. Instead of doing expensive operations repeatedly, you do them once, save the result, and reuse it.
Think of it like keeping frequently used items on your desk instead of walking to the storage room every time you need them.
Without caching, your database gets hammered with the same queries over and over. Your server calculates the same results repeatedly. Your users wait unnecessarily.
With caching, common requests return instantly. Databases handle less load. Servers do less work. Users are happy.
Speed is not just nice to have - it is a business metric. Amazon found that every 100ms of latency costs them 1% in sales. Caching makes things fast.
Browser Cache: Your browser stores images, CSS, and JavaScript so repeat visits load instantly. This happens automatically.
CDN Cache: Content Delivery Networks cache your static files (images, videos, CSS) on servers worldwide. Users get files from nearby servers.
Database Query Cache: Database saves query results. Identical queries return cached results instead of hitting the database again.
Application Cache: Your app stores computed results in memory (Redis, Memcached) for quick access.
Page Cache: Entire HTML pages cached. User requests return pre-generated HTML instead of building it from scratch.
Without caching:
1. A request hits your server.
2. The server queries the database.
3. The database executes the query and returns results.
4. The server formats the response and sends it back.
The same request 1 second later repeats steps 1-4. Wasteful.
With caching:
1. A request hits your server.
2. The server returns the saved result straight from the cache.
60x faster. Less database load. Better user experience.
Redis: In-memory data store. Blazing fast. Most popular choice for application caching.
Memcached: Simpler than Redis but very fast. Good for basic key-value caching.
Varnish: HTTP cache that sits in front of web servers. Caches entire pages.
Cloudflare: CDN with automatic caching for static assets worldwide.
Service Workers: Browser-based caching for progressive web apps.
Cache-Aside: App checks cache. If miss, query database and populate cache. Simple and common.
Read-Through: Cache handles database queries automatically. App only talks to cache.
Write-Through: Write to cache and database simultaneously. Keeps them in sync.
Write-Behind: Write to cache immediately, update database asynchronously. Faster writes but riskier.
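The write-through pattern can be sketched in a few lines of JavaScript. Plain Maps stand in for Redis and the database here, and writeThrough and read are illustrative names, not a library API:

```javascript
// Write-through sketch: cache and backing store are updated together,
// so reads never see the cache lagging behind the database.
// `cache` and `db` are in-memory stand-ins for Redis and a real database.
const cache = new Map()
const db = new Map()

async function writeThrough(key, value) {
  // Write to both stores before acknowledging the write.
  cache.set(key, value)
  db.set(key, value) // a real implementation would await db.update(...)
  return value
}

async function read(key) {
  // Reads are served from the cache, which stays in sync with the db.
  if (cache.has(key)) return cache.get(key)
  const value = db.get(key)
  if (value !== undefined) cache.set(key, value)
  return value
}
```

The tradeoff: every write pays the cost of two updates, in exchange for a cache that is never stale.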
Phil Karlton famously said: "There are only two hard things in Computer Science: cache invalidation and naming things."
The challenge: when data changes, how do you update or remove stale cache entries?
Time-Based Expiration: Set TTL (Time To Live). Cache expires after X seconds. Simple but can serve stale data.
Event-Based Invalidation: When data updates, explicitly clear related cache. More accurate but more complex.
Cache Tags: Tag entries by entity. Invalidate all entries with specific tags when that entity changes.
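A minimal sketch of event-based invalidation, using an in-memory Map in place of Redis and a plain object as a stand-in database (getProduct and updateProduct are illustrative names):

```javascript
// Event-based invalidation sketch: when data changes, the writer
// explicitly deletes the cache entries derived from it.
const cache = new Map()
const db = { 1: { id: 1, name: "Widget", price: 10 } } // fake database

function getProduct(id) {
  const key = `product:${id}`
  if (cache.has(key)) return cache.get(key) // hit: serve cached copy
  const copy = { ...db[id] }                // miss: read the database
  cache.set(key, copy)
  return copy
}

function updateProduct(id, fields) {
  Object.assign(db[id], fields)
  // Explicitly drop stale entries instead of waiting for a TTL.
  cache.delete(`product:${id}`)
  cache.delete("products:all") // any cached list that includes it
}
```

Without the deletes, a read after an update would keep serving the old cached copy until it expired.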
Twitter: Caches tweets, timelines, and user profiles. Redis stores hot data for instant access.
Netflix: CDN caches video chunks worldwide. Users stream from nearby servers, not central ones.
E-commerce Sites: Product listings cached heavily. Cart operations hit the database but product browsing is lightning fast.
News Sites: Articles cached at CDN edge. Millions of readers do not all hit the origin server.
Expensive Computations: If calculating something takes time, cache the result.
Frequent Queries: Same database query runs thousands of times per minute? Cache it.
Rarely Changing Data: Product catalogs, blog posts, user profiles - perfect for caching.
External API Calls: Cache responses to avoid rate limits and reduce latency.
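The "expensive computations" case is just memoization: compute once per input, reuse forever after. A minimal sketch, with a hypothetical memoize helper rather than any particular library:

```javascript
// Memoization sketch: cache the result of a pure function per argument.
function memoize(fn) {
  const cache = new Map()
  return (arg) => {
    if (cache.has(arg)) return cache.get(arg) // hit: skip the work
    const result = fn(arg)                    // miss: do it once
    cache.set(arg, result)
    return result
  }
}

let calls = 0
const slowSquare = memoize((n) => {
  calls++ // counts how often the real computation runs
  return n * n
})
```

Calling slowSquare(4) twice runs the underlying function only once; the second call is a cache hit.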
Rapidly Changing Data: Stock prices, live sports scores - caching causes stale data.
User-Specific Data: Personalized content is harder to cache effectively.
Small Datasets: If a query is already fast, caching adds complexity without benefit.
Write-Heavy Workloads: If data changes constantly, cache invalidation overhead negates benefits.
Browsers and CDNs use HTTP headers to determine caching behavior:
Cache-Control: max-age=3600 means cache for 1 hour.
ETag: Fingerprint of content. Browser asks "is this still fresh?" before downloading again.
Expires: Hard deadline when cache becomes invalid.
Last-Modified: Timestamp of when content last changed.
Understanding these headers helps you optimize loading times.
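For illustration, a response for a cacheable stylesheet might carry headers like these (the values here are made up):

```http
HTTP/1.1 200 OK
Content-Type: text/css
Cache-Control: max-age=3600, public
ETag: "33a64df551425fcc"
Last-Modified: Tue, 04 Mar 2025 10:00:00 GMT
Expires: Tue, 04 Mar 2025 11:00:00 GMT
```

On a repeat visit after the hour is up, the browser sends the ETag back in an If-None-Match header; if the content has not changed, the server replies 304 Not Modified and skips sending the body again.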
Stale Data: Users see outdated information until cache expires or is invalidated.
Cache Stampede: Cache expires, 1000 requests hit database simultaneously. Solution: background refresh or locking.
Memory Limits: Caches consume RAM. Eviction policies (LRU - Least Recently Used) help manage this.
Cache Coherence: Multiple cache layers (browser, CDN, app) can get out of sync.
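One common stampede defense is request coalescing: concurrent misses for the same key share a single in-flight computation instead of each hitting the database. A minimal in-process sketch, with Maps standing in for Redis and a distributed lock:

```javascript
// Stampede protection sketch: track in-flight computations per key so
// only one caller recomputes while the rest await the same promise.
const cache = new Map()
const inFlight = new Map()

async function getOrCompute(key, compute) {
  if (cache.has(key)) return cache.get(key)        // hit
  if (inFlight.has(key)) return inFlight.get(key)  // someone is already computing
  const promise = (async () => {
    try {
      const value = await compute()
      cache.set(key, value)
      return value
    } finally {
      inFlight.delete(key) // always release, even if compute() throws
    }
  })()
  inFlight.set(key, promise)
  return promise
}
```

If 1000 requests arrive while the value is being recomputed, 999 of them simply await the first request's promise.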
Start simple:
Example with Redis:
import Redis from "ioredis"

const redis = new Redis() // connects to localhost:6379 by default
// db is assumed to be your existing database client

async function getProducts() {
  // Check cache first
  const cached = await redis.get("products")
  if (cached) return JSON.parse(cached)
  // Cache miss - query the database
  const products = await db.query("SELECT * FROM products")
  // Store in cache for 1 hour (3600 seconds)
  await redis.setex("products", 3600, JSON.stringify(products))
  return products
}
That is it. Caching, working, in a dozen lines.
Caching is one of the highest-impact optimizations you can make. It reduces costs (fewer database queries, less CPU), improves user experience (faster responses), and increases reliability (less load on systems).
Every major application uses multiple layers of caching. Master caching strategies, and you will build faster, more scalable systems.