Implementing Redis-based Throttling in NestJS
When developing APIs, controlling the rate of incoming requests is crucial to maintaining server stability and preventing abuse. NestJS provides a built-in throttling mechanism through the @nestjs/throttler package. While this works well in most cases, it uses in-memory storage by default, which means that if the server restarts, any tracking data, such as request counts and time-to-live (TTL) values used for rate limiting, is lost.
To solve this, Redis can be used as an external store for throttling data. Redis, a fast, in-memory data store, ensures persistence across server restarts and enables distributed rate-limiting in microservices or multi-instance setups. This document will walk you through setting up Redis-based throttling in a beginner-friendly way.
Key Benefits of Using Redis for Throttling:
Persistence: With Redis, a server restart won't reset the rate limits. If a user hits the limit and the server restarts, Redis retains that state and the limit still applies.
Distributed Throttling: Redis allows multiple instances of your NestJS service to share the same rate-limit information. This is ideal for horizontally scaled applications (e.g., a load-balanced microservice architecture).
Performance: Redis is designed to handle high-throughput operations, making it well-suited for maintaining fast access to request counters.
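Under the hood, Redis-backed throttlers typically rely on an atomic increment plus a key expiry: each client gets a counter key that is incremented per request and discarded once the time window elapses. The sketch below simulates that fixed-window pattern in plain TypeScript to show the idea; the `FixedWindowCounter` class and its method names are illustrative, not part of any library.

```typescript
// Simulates the INCR + EXPIRE pattern used for fixed-window throttling.
// In production the counter lives in Redis, so every server instance
// sees the same hit counts and expiry times.
class FixedWindowCounter {
  private counters = new Map<string, { hits: number; expiresAt: number }>();

  constructor(private limit: number, private ttlMs: number) {}

  // Returns true if the request is allowed, false if the client is throttled.
  hit(clientKey: string, now: number = Date.now()): boolean {
    const entry = this.counters.get(clientKey);
    if (!entry || entry.expiresAt <= now) {
      // First request in a new window: start a fresh counter with a TTL.
      this.counters.set(clientKey, { hits: 1, expiresAt: now + this.ttlMs });
      return true;
    }
    entry.hits += 1;
    return entry.hits <= this.limit;
  }
}
```

With Redis, the `hit` logic maps onto an atomic `INCR` (plus `EXPIRE` on first use), which is why concurrent instances cannot race past the limit the way independent in-memory maps can.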
Prerequisites
NestJS application set up.
A Redis instance, installed locally, running in Docker, or hosted on a cloud service like RedisLabs.
Step 1: Install Required Packages
Start by installing the necessary libraries. You'll need @nestjs/throttler, @nest-lab/throttler-storage-redis, and ioredis (a Redis client for Node.js).
Run the following command:
```shell
npm install @nestjs/throttler @nest-lab/throttler-storage-redis ioredis
```
Step 2: Update the AppModule
Next, configure the Redis-based storage for the throttling mechanism in your AppModule.
```typescript
import { Module } from '@nestjs/common';
import { ThrottlerGuard, ThrottlerModule } from '@nestjs/throttler';
import { ThrottlerStorageRedisService } from '@nest-lab/throttler-storage-redis';
import { Redis } from 'ioredis';
import { APP_GUARD } from '@nestjs/core';
import { AppController } from './app.controller';
import { AppService } from './app.service';

@Module({
  imports: [
    ThrottlerModule.forRoot({
      throttlers: [
        {
          ttl: 20000, // time window in milliseconds (20 seconds)
          limit: 2,   // max requests per client within the window
        },
      ],
      storage: new ThrottlerStorageRedisService(
        new Redis({
          host: 'localhost', // Redis server host
          port: 6379, // Redis server port
        }),
      ),
    }),
  ],
  controllers: [AppController],
  providers: [
    AppService,
    {
      provide: APP_GUARD, // apply the ThrottlerGuard globally
      useClass: ThrottlerGuard,
    },
  ],
})
export class AppModule {}
```
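In a real deployment you will usually read the Redis connection details from environment variables rather than hard-coding them. Here is a minimal sketch of that approach; the variable names `REDIS_HOST` and `REDIS_PORT` and the helper `redisOptionsFromEnv` are assumptions for illustration, not part of any library.

```typescript
// Build ioredis connection options from environment variables,
// falling back to sensible local defaults. REDIS_HOST and REDIS_PORT
// are illustrative names; pick whatever convention your team uses.
function redisOptionsFromEnv(env: Record<string, string | undefined>) {
  return {
    host: env.REDIS_HOST ?? 'localhost',
    port: env.REDIS_PORT ? Number(env.REDIS_PORT) : 6379,
  };
}

// In AppModule, this would be passed straight to the client:
// storage: new ThrottlerStorageRedisService(new Redis(redisOptionsFromEnv(process.env)))
```

This keeps the module definition unchanged between local development and production; only the environment differs.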
Step 3: Understanding the Configurations
ttl: The time window in milliseconds during which the rate limit applies (20000 ms = 20 seconds in the configuration above). Note that since @nestjs/throttler v5, ttl is expressed in milliseconds, not seconds.
limit: The maximum number of requests a client can make within the ttl window (2 requests in the configuration above).
Redis configuration: The Redis connection is configured through the ioredis library, with details such as the Redis server's host and port.
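Because ttl is in milliseconds, a value like 20000 is easy to misread. A tiny conversion helper makes the intent explicit; `seconds` below is a local function written for this sketch (recent versions of @nestjs/throttler ship similar time helpers, so check what your installed version exports before rolling your own).

```typescript
// Converts a human-readable duration into the milliseconds ttl expects.
const seconds = (n: number): number => n * 1000;

// ttl: seconds(20) reads as "a 20-second window" instead of a bare 20000.
const throttlerConfig = {
  throttlers: [{ ttl: seconds(20), limit: 2 }],
};
```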
Step 4: Apply Throttling to Specific Routes
Use the @Throttle decorator in your controllers to apply rate limiting to specific routes.
```typescript
import { Controller, Get } from '@nestjs/common';
import { Throttle } from '@nestjs/throttler';
import { AppService } from './app.service';

@Controller()
export class AppController {
  constructor(private readonly appService: AppService) {}

  // Override the global settings: 5 requests per 100 seconds for this route.
  @Throttle({
    default: {
      limit: 5,
      ttl: 100000, // 100 seconds, in milliseconds
    },
  })
  @Get()
  getHello(): string {
    return this.appService.getHello();
  }
}
```
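If different routes need different windows, @nestjs/throttler (v5+) also supports multiple named throttler definitions, which route-level decorators can then override by name. The sketch below assumes that feature; the names 'short' and 'long', and the `ThrottlingModule`/`SearchController` classes, are arbitrary examples.

```typescript
import { Controller, Get, Module } from '@nestjs/common';
import { Throttle, ThrottlerModule } from '@nestjs/throttler';

// Two named throttlers with different windows, both applied globally.
@Module({
  imports: [
    ThrottlerModule.forRoot({
      throttlers: [
        { name: 'short', ttl: 1000, limit: 3 },   // 3 requests per second
        { name: 'long', ttl: 60000, limit: 100 }, // 100 requests per minute
      ],
      // storage: new ThrottlerStorageRedisService(...) as in Step 2
    }),
  ],
})
export class ThrottlingModule {}

@Controller()
export class SearchController {
  // Tighten only the 'short' throttler for this route; 'long' still applies.
  @Throttle({ short: { ttl: 2000, limit: 1 } })
  @Get('search')
  search(): string {
    return 'throttled search';
  }
}
```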
Step 5: Skip Throttling for Certain Routes
You may want certain routes to be exempt from throttling, such as public health-check or authentication routes. You can skip throttling with the @SkipThrottle decorator.
```typescript
import { Controller, Get } from '@nestjs/common';
import { SkipThrottle } from '@nestjs/throttler';

@Controller()
export class AppController {
  @SkipThrottle() // Throttling disabled for this route
  @Get('health-check')
  healthCheck(): string {
    return 'OK';
  }
}
```
Step 6: Run Redis and Test Throttling
Run Redis locally or via Docker:
```shell
docker run -d -p 6379:6379 redis
```
Start your NestJS server:
```shell
npm run start
```
Test the rate limit by making repeated requests to the throttled route (/ in this case). After hitting the limit (5 requests in 100 seconds), the server will return a 429 Too Many Requests error for further hits.
Redis persistence: Even if the server is restarted, Redis retains the hit counts and TTLs, ensuring that throttling is applied correctly across server restarts.
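You can exercise the limit from the command line. Assuming the server runs on the default port 3000 with the route-level limit above (5 requests per 100 seconds), a short curl loop makes the 429 visible:

```shell
# Fire 6 requests; with a limit of 5 per 100 seconds, the sixth
# should come back as HTTP 429 (Too Many Requests).
for i in $(seq 1 6); do
  curl -s -o /dev/null -w "request $i -> %{http_code}\n" http://localhost:3000/
done
```

Restart the server mid-loop and repeat: because the counters live in Redis, the remaining budget carries over instead of resetting.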
Conclusion
By integrating Redis with the NestJS Throttler, you ensure robust and persistent rate limiting across server instances and restarts. Redis provides a powerful mechanism to manage rate-limiting state, offering both scalability and reliability for high-traffic APIs.
This setup ensures:
Persistence: No loss of rate limit data on server restart.
Scalability: Distributed rate limiting across multiple server instances.
Efficiency: Redis handles high-throughput, low-latency operations efficiently.
Incorporating Redis is a critical step for production-ready APIs, especially in distributed or cloud-based environments.
Written by Muhammad Sufiyan