⚡ Real-Time Data Streaming with Apache Kafka and C#


🧭 Introduction
In today’s fast-paced digital landscape, software systems need to communicate at scale, with minimal latency and rock-solid reliability. That’s exactly what Apache Kafka delivers—a distributed streaming platform optimized for high-throughput messaging across services.
This guide explores what Kafka is, how it works under the hood, and how to integrate it using C# and ASP.NET Core, including setup, examples, and interaction with Kafka Connect.
Let’s unlock the power of real-time event streaming!
📡 What Is Apache Kafka?
Apache Kafka is an open-source, distributed platform for real-time data streaming. Originally built at LinkedIn and later donated to the Apache Software Foundation, Kafka has become a foundational tool in large-scale event-driven architectures.
🧠 Deep Dive: How Kafka Works
Kafka uses a publish–subscribe model where systems interact asynchronously and reliably.
Producers send messages (events) to Kafka.
Events are routed into topics—logical channels of communication.
Topics are split into partitions, enabling parallelism and scalability.
Consumers read data from one or more partitions, in either real-time or batch processing.
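The topic/partition model above can be seen first-hand from the Kafka CLI bundled in the Confluent image. A quick sketch, assuming the Docker Compose setup from the setup section below is running:

```shell
# Create a topic with 3 partitions so multiple consumers can read in parallel
docker compose exec kafka kafka-topics --create \
  --topic demo-topic --partitions 3 --replication-factor 1 \
  --bootstrap-server localhost:9092

# Describe it to see the partition layout
docker compose exec kafka kafka-topics --describe \
  --topic demo-topic --bootstrap-server localhost:9092
```

With three partitions, up to three consumers in the same group can each own one partition and process messages concurrently.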
🔄 Key Features
⚡ High throughput with low latency
💾 Durable, replayable storage via an append-only commit log
📈 Horizontal scalability through partitioning
🛡️ Fault tolerance through replication
🔢 Ordered delivery within each partition
🌍 Common Use Cases
📈 Website and app behavior tracking
🕵️ Fraud detection in financial systems
🏬 Real-time inventory updates
📣 Microservice communication via events
📡 IoT sensor data collection
🔧 System Requirements
To build and run Kafka applications with C#, you’ll need:
🖥️ The .NET SDK (6.0 or later)
🐳 Docker and Docker Compose (to run a local Kafka broker)
📦 The Confluent.Kafka NuGet package
🐳 Kafka Setup with Docker
Create a file named docker-compose.yml:
version: '3.8'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.5.0
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
Then launch Kafka locally:
docker-compose up -d
🧪 Kafka Producer with C#
Create a new console project and install the client:
dotnet new console -n KafkaProducer
cd KafkaProducer
dotnet add package Confluent.Kafka
Sample producer code:
using Confluent.Kafka;

var config = new ProducerConfig { BootstrapServers = "localhost:9092" };
using var producer = new ProducerBuilder<Null, string>(config).Build();

for (int i = 0; i < 10; i++)
{
    var result = await producer.ProduceAsync("demo-topic", new Message<Null, string>
    {
        Value = $"Message {i}"
    });
    Console.WriteLine($"Sent: {result.Value} → {result.TopicPartitionOffset}");
}
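Awaiting ProduceAsync per message is simple but pays a broker round-trip each iteration. For higher throughput, the Confluent client's non-awaited Produce overload takes a delivery handler instead; a sketch under the same broker config:

```csharp
using Confluent.Kafka;

var config = new ProducerConfig { BootstrapServers = "localhost:9092" };
using var producer = new ProducerBuilder<Null, string>(config).Build();

for (int i = 0; i < 10; i++)
{
    // Fire-and-forget: the handler runs when the broker acknowledges (or rejects) the message.
    producer.Produce("demo-topic", new Message<Null, string> { Value = $"Message {i}" },
        report =>
        {
            if (report.Error.IsError)
                Console.WriteLine($"Delivery failed: {report.Error.Reason}");
            else
                Console.WriteLine($"Delivered to {report.TopicPartitionOffset}");
        });
}

// Block until all outstanding deliveries complete (or the timeout elapses).
producer.Flush(TimeSpan.FromSeconds(10));
```

The Flush call at the end matters: without it, the process can exit while messages are still buffered client-side.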
📥 Kafka Consumer with C#
Create a new console consumer app:
dotnet new console -n KafkaConsumer
cd KafkaConsumer
dotnet add package Confluent.Kafka
Consumer code:
using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = "my-consumer-group",
    AutoOffsetReset = AutoOffsetReset.Earliest
};

using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
consumer.Subscribe("demo-topic");

Console.WriteLine("Waiting for messages...");
while (true)
{
    var msg = consumer.Consume();
    Console.WriteLine($"Received: {msg.Message.Value}");
}
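The infinite loop above never leaves the consumer group cleanly, so on exit the broker waits for a session timeout before rebalancing partitions. A slightly fuller sketch handles Ctrl+C and calls Close():

```csharp
using Confluent.Kafka;

var cts = new CancellationTokenSource();
Console.CancelKeyPress += (_, e) => { e.Cancel = true; cts.Cancel(); };

using var consumer = new ConsumerBuilder<Ignore, string>(new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = "my-consumer-group",
    AutoOffsetReset = AutoOffsetReset.Earliest
}).Build();

consumer.Subscribe("demo-topic");
try
{
    while (true)
    {
        // Throws OperationCanceledException when Ctrl+C is pressed.
        var msg = consumer.Consume(cts.Token);
        Console.WriteLine($"Received: {msg.Message.Value} @ {msg.TopicPartitionOffset}");
    }
}
catch (OperationCanceledException) { }
finally
{
    consumer.Close(); // commit final offsets and leave the group immediately
}
```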
🌐 Kafka with ASP.NET Core
📤 Producer with Web API
In Program.cs, register the Kafka producer:
builder.Services.AddSingleton<IProducer<string, string>>(
    new ProducerBuilder<string, string>(
        new ProducerConfig { BootstrapServers = "localhost:9092" }).Build());
In your controller:
using System.Text.Json;
using Confluent.Kafka;
using Microsoft.AspNetCore.Mvc;

// Minimal example model; adjust the fields to your domain.
public record Order(int Id, string Product, decimal Price);

[ApiController]
[Route("api/[controller]")]
public class OrderController : ControllerBase
{
    private readonly IProducer<string, string> _producer;

    public OrderController(IProducer<string, string> producer)
    {
        _producer = producer;
    }

    [HttpPost]
    public async Task<IActionResult> Post(Order order)
    {
        var json = JsonSerializer.Serialize(order);
        await _producer.ProduceAsync("orders-topic", new Message<string, string>
        { Key = order.Id.ToString(), Value = json });
        return Ok("Order sent to Kafka.");
    }
}
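The endpoint can then be exercised with curl. The port and JSON fields below are illustrative (only Id is actually used by the controller, as the message key); adjust them to your launch profile and model:

```shell
curl -X POST http://localhost:5000/api/order \
  -H "Content-Type: application/json" \
  -d '{"id": 1, "product": "Keyboard", "price": 49.90}'
```

Each POST publishes one message to orders-topic, keyed by the order id so all events for the same order land on the same partition.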
📥 Consumer with Hosted Service
Create KafkaConsumerService.cs:
public class KafkaConsumerService : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        // Consume() blocks, so yield first to avoid stalling application startup.
        await Task.Yield();

        var config = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092",
            GroupId = "orders-group",
            AutoOffsetReset = AutoOffsetReset.Earliest
        };

        using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
        consumer.Subscribe("orders-topic");

        try
        {
            while (!stoppingToken.IsCancellationRequested)
            {
                var result = consumer.Consume(stoppingToken);
                Console.WriteLine($"Order received: {result.Message.Value}");
            }
        }
        catch (OperationCanceledException)
        {
            // Expected on application shutdown.
        }
        finally
        {
            consumer.Close(); // leave the group cleanly
        }
    }
}
Register it in Program.cs:
builder.Services.AddHostedService<KafkaConsumerService>();
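If at-least-once processing matters, auto-commit can be disabled so offsets are committed only after a message has actually been handled. A hedged variant of the consumption step, using the same broker and topic names as above:

```csharp
using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = "orders-group",
    AutoOffsetReset = AutoOffsetReset.Earliest,
    EnableAutoCommit = false // commit manually, only after successful processing
};

using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
consumer.Subscribe("orders-topic");

// Consume(TimeSpan) returns null if no message arrives within the timeout.
var result = consumer.Consume(TimeSpan.FromSeconds(5));
if (result != null)
{
    Console.WriteLine($"Order received: {result.Message.Value}");
    consumer.Commit(result); // mark this offset as processed
}
```

If the process crashes between Consume and Commit, the message is redelivered on restart, which is the trade-off at-least-once semantics make.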
🔌 Kafka Connect + C# Integration
Kafka Connect allows Kafka to integrate with databases, REST APIs, cloud storage, and more—using pre-built connectors that can be managed via HTTP requests.
📡 Connect Example via C#
Here’s how to configure a MySQL source connector using C# and the Kafka Connect REST API:
using System.Net.Http;
using System.Text;
using System.Text.Json;

var client = new HttpClient();

// Kafka Connect expects dotted property names (e.g. "connector.class"),
// so a dictionary is used for the config instead of an anonymous object.
var connector = new
{
    name = "mysql-source-connector",
    config = new Dictionary<string, string>
    {
        ["connector.class"] = "io.confluent.connect.jdbc.JdbcSourceConnector",
        ["connection.url"] = "jdbc:mysql://localhost:3306/mydb",
        ["connection.user"] = "admin",
        ["connection.password"] = "secret123",
        ["topic.prefix"] = "mysql-",
        ["mode"] = "incrementing",
        ["incrementing.column.name"] = "id"
    }
};

var json = JsonSerializer.Serialize(connector);
var content = new StringContent(json, Encoding.UTF8, "application/json");
var response = await client.PostAsync("http://localhost:8083/connectors", content);
Console.WriteLine(await response.Content.ReadAsStringAsync());
Kafka Connect will start streaming data from your MySQL DB into Kafka—zero infrastructure coding needed.
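Once created, connector health can be checked through the same REST API. A sketch assuming Kafka Connect is listening on its default port 8083:

```csharp
using System.Net.Http;

var client = new HttpClient();

// GET /connectors/{name}/status returns the connector and task states as JSON.
var status = await client.GetStringAsync(
    "http://localhost:8083/connectors/mysql-source-connector/status");
Console.WriteLine(status);
```

A healthy connector reports a "RUNNING" state for itself and each of its tasks; a "FAILED" task includes a stack trace in the same response.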
✅ Conclusion
Apache Kafka is the backbone of modern data pipelines. Whether you're sending events from an ASP.NET Core API, integrating a backend system with Kafka Connect, or consuming data in C# microservices, this platform lets you build responsive, scalable, and decoupled applications.
This guide gives you everything you need—from concept to code—to master Kafka in .NET ecosystems.
#ApacheKafka #KafkaStreaming #DotNetKafka #RealTimeData #KafkaConnect #CSharpDev #ASPNetCore #Microservices #EventDrivenArchitecture #KafkaInProduction
Written by Johnny Hideki Kinoshita de Faria