Rate limiting is a traffic control mechanism that maintains system stability and security by capping how many requests each client can make within a given time interval. It ensures fair resource distribution and protects against abuse and unintended heavy usage.
When we build an e-commerce API, we’re essentially creating a digital storefront that needs to handle various types of traffic. Without proper controls, we might face scenarios like:
- Bots aggressively scraping our product catalog
- Brute force attempts on our authentication endpoints
- Legitimate users experiencing slowdowns due to excessive requests from others
Rate limiting solves these challenges by implementing a simple rule:
`X` number of requests per `Y` time window
For example, we might allow 100 requests per minute per IP address.
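Conceptually, this is a fixed-window counter: each client gets a counter that resets every `Y` seconds, and requests are rejected once the counter reaches `X`. Below is a minimal, in-memory sketch of that idea (illustrative only; the plugin we use next tracks these counters in Redis instead):

```ts
// Illustrative fixed-window limiter (in-memory, single process).
type RequestWindow = { count: number; resetAt: number }

const windows = new Map<string, RequestWindow>()

function isAllowed(clientKey: string, limit = 100, windowMs = 60_000): boolean {
  const now = Date.now()
  const current = windows.get(clientKey)

  // Start a fresh window if none exists or the previous one expired
  if (!current || now >= current.resetAt) {
    windows.set(clientKey, { count: 1, resetAt: now + windowMs })
    return true
  }

  // Reject once the per-window budget is exhausted
  if (current.count >= limit) {
    return false
  }

  current.count += 1
  return true
}
```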
We’ll implement rate limiting using the @perseidesjs/medusa-plugin-rate-limit plugin, which uses Redis under the hood to track and limit requests. Redis acts as our request tracking system, maintaining a real-time count of requests from each client and their timestamps to enforce our rate limiting rules.

First, let’s install our rate limiting plugin:
```bash
yarn add @perseidesjs/medusa-plugin-rate-limit@1
```
Then, we’ll add it to our Medusa configuration. We can start with some sensible defaults:
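A sketch of that registration in `medusa-config.js` follows; the `limit` and `window` option names are assumptions here, so double-check them against the plugin’s README.

```js
// medusa-config.js
module.exports = {
  // ...projectConfig, featureFlags, etc.
  plugins: [
    // ...other plugins
    {
      resolve: `@perseidesjs/medusa-plugin-rate-limit`,
      // Assumed option names: requests allowed per window, and window length in seconds
      options: {
        limit: 10,
        window: 60,
      },
    },
  ],
}
```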
The plugin defaults are 10 requests per 60 seconds.
Now, let’s create a middleware that will enforce these rate limits:
src/api/middlewares/rate-limit.ts
```ts
import { type MedusaRequest, type MedusaResponse } from '@medusajs/medusa'
import type { NextFunction } from 'express'
import type { RateLimitService } from '@perseidesjs/medusa-plugin-rate-limit'

export default async function rateLimit(
  req: MedusaRequest,
  res: MedusaResponse,
  next: NextFunction
) {
  const rateLimitService = req.scope.resolve<RateLimitService>('rateLimitService')

  // Create a unique key for each client (using IP address)
  const clientKey = `rate_limit:${req.ip}`

  // Check if the client has exceeded their limit
  const allowed = await rateLimitService.limit(clientKey)

  if (!allowed) {
    // If not allowed, tell the client when they can try again
    const retryAfter = await rateLimitService.ttl(clientKey)
    res.set('Retry-After', String(retryAfter))
    res.status(429).json({ message: "Whoa there! Please slow down a bit." })
    // Stop here so we don't fall through and call next() after responding
    return
  }

  // Add rate limit info to response headers
  const remaining = await rateLimitService.getRemainingAttempts(clientKey)
  res.set('X-RateLimit-Remaining', String(remaining))

  next()
}
```
Finally, we’ll apply this middleware to our routes. We can be selective about which endpoints we want to protect:
src/api/middlewares.ts
```ts
import { MiddlewaresConfig } from '@medusajs/medusa'
import rateLimit from './middlewares/rate-limit'

export const config: MiddlewaresConfig = {
  routes: [
    {
      matcher: '/store/auth/*', // Protect all auth endpoints, for example
      middlewares: [rateLimit],
    },
    {
      matcher: '/store/products', // Protect product listing, for example
      middlewares: [rateLimit],
    },
  ],
}
```
For development, Medusa provides a fake Redis instance out of the box. For production, make sure to configure a real Redis instance by setting the redis_url in your project config.
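A minimal sketch of that part of `medusa-config.js`, assuming the Redis URL is supplied through an environment variable:

```js
// medusa-config.js
module.exports = {
  projectConfig: {
    // ...other project config
    redis_url: process.env.REDIS_URL, // e.g. redis://localhost:6379
  },
  // ...plugins
}
```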
Alternatively, the plugin provides a default middleware that you can use to protect your routes. This middleware is configured with the default rate limits specified in the plugin options:
src/api/middlewares.ts
```ts
import { type MiddlewaresConfig } from '@medusajs/medusa'
import { rateLimitRoutes } from '@perseidesjs/medusa-plugin-rate-limit'

export const config: MiddlewaresConfig = {
  routes: [
    {
      matcher: '/store/*',
      middlewares: [rateLimitRoutes],
    },
  ],
}
```