Kafka with NestJS: Building Robust Producers and Consumers for Event-Driven Microservices


As event-driven architectures gain traction, the combination of Apache Kafka and NestJS has emerged as a compelling solution for building scalable, decoupled microservices.
In this guide, we’ll explore how to integrate Kafka with NestJS using the built-in microservices module, focusing on how to build and configure both Kafka producers and Kafka consumers in a real-world Node.js application.
1. Why Kafka with NestJS?
NestJS brings structure and scalability to Node.js applications through modularity and strong TypeScript support. Kafka, on the other hand, provides:
High-throughput, fault-tolerant message streaming
Real-time data pipelines
Decoupled communication between services
When combined, they enable the development of reactive systems that are both performant and maintainable.
2. Setting Up the NestJS Project
Start by creating a new NestJS project:
$ npm i -g @nestjs/cli
$ nest new kafka-microservice
Install required Kafka dependencies:
$ npm install kafkajs @nestjs/microservices
Note: NestJS uses KafkaJS under the hood when working with Kafka microservices.
3. Creating a Kafka Producer in NestJS
Let’s start with a producer that sends messages to a Kafka topic.
3.1 Kafka Config Module
Create a file kafka.config.ts. This config is reused in main.ts (section 4.1) to bootstrap the consumer microservice, so it also declares a consumer group ID:
import { KafkaOptions, Transport } from '@nestjs/microservices';

export const kafkaConfig: KafkaOptions = {
  transport: Transport.KAFKA,
  options: {
    client: {
      clientId: 'nestjs-consumer',
      brokers: ['localhost:9092'],
    },
    consumer: {
      // Consumers sharing this groupId split the topic's partitions between them
      groupId: 'nestjs-consumer-group',
    },
  },
};
3.2 Producer Service
Create producer.service.ts:
import { Inject, Injectable, OnModuleInit } from '@nestjs/common';
import { ClientKafka } from '@nestjs/microservices';

@Injectable()
export class ProducerService implements OnModuleInit {
  constructor(@Inject('KAFKA_SERVICE') private readonly kafkaClient: ClientKafka) {}

  async onModuleInit() {
    await this.kafkaClient.connect();
  }

  async sendMessage(topic: string, message: any) {
    return this.kafkaClient.emit(topic, message);
  }
}
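Note that emit() is fire-and-forget, which is exactly what you want for events. If you ever need request-response semantics instead, ClientKafka also exposes send(), which requires subscribing to the reply topic before connecting. A minimal sketch (the service name and topic are placeholders, not part of the example above):

import { Inject, Injectable, OnModuleInit } from '@nestjs/common';
import { ClientKafka } from '@nestjs/microservices';
import { firstValueFrom } from 'rxjs';

@Injectable()
export class RequestReplyService implements OnModuleInit {
  constructor(@Inject('KAFKA_SERVICE') private readonly kafkaClient: ClientKafka) {}

  async onModuleInit() {
    // The client must know which reply topics to consume before connecting
    this.kafkaClient.subscribeToResponseOf('my-topic');
    await this.kafkaClient.connect();
  }

  async requestReply(topic: string, message: any) {
    // send() returns an Observable that emits the value returned by a
    // @MessagePattern() handler on the consumer side
    return firstValueFrom(this.kafkaClient.send(topic, message));
  }
}

Keep in mind that the consumer would then have to use @MessagePattern() rather than @EventPattern() to produce a reply.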
3.3 Registering the Kafka Client
In app.module.ts:
import { Module } from '@nestjs/common';
import { ClientsModule, Transport } from '@nestjs/microservices';
import { ProducerService } from './producer.service';

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'KAFKA_SERVICE',
        transport: Transport.KAFKA,
        options: {
          client: {
            clientId: 'nestjs-producer',
            brokers: ['localhost:9092'],
          },
        },
      },
    ]),
  ],
  providers: [ProducerService],
  exports: [ProducerService],
})
export class AppModule {}
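In a real deployment you probably don't want broker addresses hard-coded. One option, assuming @nestjs/config is installed and recent versions of @nestjs/microservices (which provide ClientsModule.registerAsync), is to read the broker list from the environment. The KAFKA_BROKERS variable below is an assumption for illustration:

import { Module } from '@nestjs/common';
import { ConfigModule, ConfigService } from '@nestjs/config';
import { ClientsModule, Transport } from '@nestjs/microservices';
import { ProducerService } from './producer.service';

@Module({
  imports: [
    ConfigModule.forRoot(),
    ClientsModule.registerAsync([
      {
        name: 'KAFKA_SERVICE',
        imports: [ConfigModule],
        inject: [ConfigService],
        useFactory: (config: ConfigService) => ({
          transport: Transport.KAFKA,
          options: {
            client: {
              clientId: 'nestjs-producer',
              // e.g. KAFKA_BROKERS="broker1:9092,broker2:9092"
              brokers: (config.get<string>('KAFKA_BROKERS') ?? 'localhost:9092').split(','),
            },
          },
        }),
      },
    ]),
  ],
  providers: [ProducerService],
  exports: [ProducerService],
})
export class AppModule {}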
4. Creating a Kafka Consumer in NestJS
Let’s set up a consumer that listens to a Kafka topic and processes messages.
4.1 Kafka Microservice Configuration
In main.ts, configure the app to act as a Kafka microservice:
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { kafkaConfig } from './kafka.config';

async function bootstrap() {
  const app = await NestFactory.createMicroservice(AppModule, kafkaConfig);
  await app.listen();
}
bootstrap();
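createMicroservice() starts a Kafka-only process. If the same application should also expose HTTP endpoints (for example, to trigger the producer as in section 5), a hybrid application is one way to do it; a sketch:

import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { kafkaConfig } from './kafka.config';

async function bootstrap() {
  // Regular HTTP application...
  const app = await NestFactory.create(AppModule);
  // ...with the Kafka microservice attached to it
  app.connectMicroservice(kafkaConfig);

  await app.startAllMicroservices();
  await app.listen(3000);
}
bootstrap();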
4.2 Kafka Consumer Controller
Create consumer.controller.ts:
import { Controller } from '@nestjs/common';
import { Ctx, EventPattern, KafkaContext, Payload } from '@nestjs/microservices';

@Controller()
export class ConsumerController {
  @EventPattern('my-topic')
  handleMessage(@Payload() message: any, @Ctx() context: KafkaContext) {
    // @Payload() already holds the deserialized message value
    console.log('Received message:', message);
    // Topic, partition, offset and headers are available via the context
    console.log('Partition:', context.getPartition());
    // Process the message here
  }
}
Add this controller to your AppModule.
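For reference, the updated app.module.ts from section 3.3 only gains a controllers entry:

import { Module } from '@nestjs/common';
import { ClientsModule, Transport } from '@nestjs/microservices';
import { ConsumerController } from './consumer.controller';
import { ProducerService } from './producer.service';

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'KAFKA_SERVICE',
        transport: Transport.KAFKA,
        options: {
          client: {
            clientId: 'nestjs-producer',
            brokers: ['localhost:9092'],
          },
        },
      },
    ]),
  ],
  controllers: [ConsumerController],
  providers: [ProducerService],
  exports: [ProducerService],
})
export class AppModule {}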
5. Testing the Kafka Integration
Make sure Kafka is running locally (or use Docker):
# Start Kafka using Confluent or local setup
$ kafka-topics.sh --create --topic my-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
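If you'd rather use Docker, a single-node broker is enough for local testing. One way to start it (the image and tag are an assumption; check the Kafka quickstart for the current recommendation):

# Single-node Kafka broker listening on localhost:9092
$ docker run -d --name kafka -p 9092:9092 apache/kafka:latest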
Then, inject ProducerService into a controller or another service and trigger sendMessage():
this.producerService.sendMessage('my-topic', { userId: 123, action: 'signup' });
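For example, a small HTTP controller (the route and DTO shape are illustrative, and it assumes the hybrid bootstrap sketched in section 4.1) could publish the event on each request:

import { Body, Controller, Post } from '@nestjs/common';
import { ProducerService } from './producer.service';

@Controller('events')
export class EventsController {
  constructor(private readonly producerService: ProducerService) {}

  @Post('signup')
  publishSignup(@Body() body: { userId: number }) {
    // emit() dispatches the event; no reply is expected
    this.producerService.sendMessage('my-topic', { userId: body.userId, action: 'signup' });
    return { status: 'queued' };
  }
}

Like the consumer controller, this one has to be registered in the module's controllers array.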
Check the consumer logs for the incoming message.
6. Best Practices
Use DTOs and validation for strong message contracts (see the sketch after this list).
Enable retry and error handling in consumer logic.
Avoid tight coupling — topics should be semantically meaningful.
Use Schema Registry if you're enforcing Avro or Protobuf formats.
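As a sketch of the first point, assuming class-validator and class-transformer are installed, the consumer can validate incoming payloads with a DTO and the standard ValidationPipe (the DTO name and fields are illustrative):

import { Controller, UsePipes, ValidationPipe } from '@nestjs/common';
import { EventPattern, Payload } from '@nestjs/microservices';
import { IsInt, IsString } from 'class-validator';

class UserEventDto {
  @IsInt()
  userId: number;

  @IsString()
  action: string;
}

@Controller()
export class ValidatedConsumerController {
  @EventPattern('my-topic')
  @UsePipes(new ValidationPipe({ whitelist: true }))
  handleMessage(@Payload() message: UserEventDto) {
    // Reaching this point means the payload matched the contract
    console.log('Valid event received:', message);
  }
}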
Conclusion
Integrating Kafka into your NestJS application unlocks powerful real-time capabilities and decouples your architecture for better scalability and maintainability. Whether you’re building event-driven services, handling high-velocity logs, or reacting to user events, Kafka and NestJS provide a solid foundation.