Caching
Implementing caching in Medusa
In the world of e-commerce, performance isn’t just about speed—it’s about revenue. According to Deloitte’s research, improving site speed by just 0.1 seconds can have dramatic effects:
- Retail conversions increase by 8.4%
- Average order value jumps by 9.2%
- Travel conversions improve by 10.1%
- Luxury brand page views per session increase by 8.6%
While these metrics focus on mobile performance, the same principles apply to desktop experiences. In today’s omnichannel commerce landscape, optimizing performance across all devices is crucial for business success.
Why Caching Matters
Picture your e-commerce store during a flash sale. Every time a customer views a product, your server:
- Queries the database for product details
- Calculates current stock levels
- Applies active discounts
- Formats the response
- Sends it back to the customer
With mobile commerce expected to account for three-quarters of total e-commerce sales, and desktop users expecting equally snappy experiences, optimizing your store’s performance isn’t optional—it’s essential. Caching solves this by storing frequently accessed data in a fast-access storage layer, serving pre-computed results instantly instead of calculating everything from scratch for each request.
To illustrate the performance benefits of caching in a Medusa store: the first, uncached request takes more than 1s to complete, while the second, cached request is almost instant (7ms).
Caching Strategies
Medusa comes with two built-in cache modules: an "In-Memory" cache for development and a "Redis" cache for production environments. These modules can be leveraged in various ways throughout your application.
In this guide, we’ll explore practical implementations of caching strategies, focusing on implementing caching in your API routes.
Wherever you have access to the Medusa container, you can resolve the cache module and use it as needed. For example, you can implement caching in your workflows or API routes.
Custom API Route Caching
We’ll implement caching in a custom API route that fetches products. This example demonstrates how to check the cache before hitting the database, and how to store results in the cache for future requests:
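A minimal, self-contained sketch of this cache-aside pattern is shown below. The in-memory cache stub mirrors the `get`/`set` shape of Medusa's cache module, and `listProductsFromDb` is an illustrative stand-in for the real query; in an actual route you would resolve the cache module from the container instead of constructing one.

```typescript
// Shape mirroring the cache module's interface (get/set with optional TTL)
type CacheLike = {
  get<T>(key: string): Promise<T | null>
  set(key: string, data: unknown, ttl?: number): Promise<void>
}

// In-memory stand-in; in a Medusa route you would resolve the cache
// module from the container instead of building one by hand.
const store = new Map<string, unknown>()
const cache: CacheLike = {
  async get<T>(key: string) {
    return (store.get(key) as T) ?? null
  },
  async set(key, data) {
    store.set(key, data)
  },
}

let dbHits = 0
// Stand-in for the expensive database query
async function listProductsFromDb() {
  dbHits++ // track how often we actually hit the "database"
  return [{ id: "prod_1", title: "T-Shirt" }]
}

async function getProducts() {
  const cacheKey = "medusa:products"

  // 1. Check the cache before hitting the database
  const cached = await cache.get<object[]>(cacheKey)
  if (cached) {
    return cached
  }

  // 2. Cache miss: run the expensive query
  const products = await listProductsFromDb()

  // 3. Store the result for future requests (TTL in seconds)
  await cache.set(cacheKey, products, 60)
  return products
}
```

Calling `getProducts` twice hits the database only once; every subsequent call within the TTL is served straight from the cache.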
By using this approach, we can significantly reduce the number of database queries and improve the overall performance of our API.
Depending on your use case, you might want to define a unique cache key for your query. For example, you might want to cache different pages of products separately (e.g. `medusa:products:page=1` and `medusa:products:page=2`).
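A small, hypothetical helper can keep such keys consistent; the name and the exact key layout here are illustrative, not part of Medusa's API:

```typescript
// Build a per-page cache key so each page of a paginated product
// list is cached under its own entry.
function productsCacheKey(page: number, limit: number): string {
  return `medusa:products:page=${page}:limit=${limit}`
}
```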
Core API Route Caching
While custom API routes give us direct control over caching implementation, what about Medusa’s core API routes?
Since we don’t have direct access to modify these routes, we can leverage middlewares to implement caching. This approach allows us to “intercept” requests and responses to add caching functionality.
Let's implement caching for Medusa's product listing endpoint (`/store/products`):
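A self-contained sketch of such a middleware is shown below. The `Req`/`Res` shapes and the in-memory `responseCache` are simplified stand-ins; in a real Medusa project the middleware would live in your middlewares file and resolve the cache module from the container.

```typescript
// Simplified express-style shapes, standing in for Medusa's
// request/response types.
type Req = { path: string }
type Res = { json: (body: unknown) => void }
type Next = () => void

// In-memory stand-in for the cache module
const responseCache = new Map<string, unknown>()

async function cacheProductsMiddleware(req: Req, res: Res, next: Next) {
  const cacheKey = `medusa:${req.path}`

  // 1. Check cache first: on a hit, return the stored response
  //    immediately without running the route handler
  const cached = responseCache.get(cacheKey)
  if (cached) {
    return res.json(cached)
  }

  // 2. Cache miss: intercept res.json to capture the outgoing body
  const originalJson = res.json.bind(res)
  res.json = (body: unknown) => {
    // 3. Store the response for future requests, then send it
    responseCache.set(cacheKey, body)
    originalJson(body)
  }

  next()
}
```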
This middleware implementation uses a technique called “response interception”. Here’s how it works:
- Check Cache First: When a request hits the `/store/products` endpoint, we first check if we have a cached response
- Return Cached Data: If cached data exists, we return it immediately without hitting the database
- Intercept Response: If no cache exists, we override the `res.json` method to:
  - Capture the response data
  - Store it in cache for future requests
  - Send the response to the client
This approach is particularly powerful because:
- It works with any core API route
- Requires no modification to the original route handlers
- Can be easily extended to handle query parameters
Good to know: You can also set a TTL (time-to-live) on cached entries, to make sure the cached data expires after a certain amount of time, depending on your use case.
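To make the effect of a TTL concrete, here is a sketch of expiry logic with an in-memory store and explicit timestamps. Medusa's cache module accepts the TTL on `set` and handles expiry for you; the helpers below only exist to illustrate the behavior.

```typescript
// Each entry remembers when it expires.
type Entry = { data: unknown; expiresAt: number }
const ttlStore = new Map<string, Entry>()

// Store a value that expires after `ttlSeconds`.
function setWithTtl(key: string, data: unknown, ttlSeconds: number, now = Date.now()) {
  ttlStore.set(key, { data, expiresAt: now + ttlSeconds * 1000 })
}

// Return the value if it is still fresh, otherwise evict it.
function getWithTtl(key: string, now = Date.now()): unknown | null {
  const entry = ttlStore.get(key)
  if (!entry || now > entry.expiresAt) {
    ttlStore.delete(key)
    return null
  }
  return entry.data
}
```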