Try Kafka with Django

Bisesh Adhikari

When building real-time applications, especially those that involve streaming data like live GPS location tracking, we need a system that is fast, scalable, and reliable. One powerful tool for this is Apache Kafka.

In this blog, I’ll explain what Kafka is, why it makes sense for a Django-based location tracking project, and how to integrate it step-by-step.

What is Apache Kafka?

Apache Kafka is a distributed event streaming platform. It is widely used to build real-time data pipelines and streaming applications. At its core, Kafka works like a high-performance message broker.

Kafka enables applications to produce and consume streams of messages in real-time. It’s built for scalability, fault tolerance, and high throughput.

Key Concepts:

  • Producer: An application that sends (produces) messages to Kafka.

  • Topic: A named stream to which messages are sent by producers and from which messages are read by consumers.

  • Consumer: An application that reads (consumes) messages from Kafka.

  • Broker: A Kafka server that stores and distributes messages.

Kafka is highly suitable for decoupling data producers from data consumers and handling high volumes of real-time data.
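
To make these concepts concrete, here is a minimal round trip using the kafka-python client (the same client used later in this post). It assumes a broker is already running on localhost:9092 and uses a hypothetical topic named demo:

import json
from kafka import KafkaConsumer, KafkaProducer

# Producer: send one JSON message to the "demo" topic on the broker
producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    value_serializer=lambda v: json.dumps(v).encode('utf-8')
)
producer.send('demo', {'hello': 'kafka'})
producer.flush()

# Consumer: read messages back from the same topic
consumer = KafkaConsumer(
    'demo',
    bootstrap_servers='localhost:9092',
    auto_offset_reset='earliest',
    value_deserializer=lambda m: json.loads(m.decode('utf-8')),
    consumer_timeout_ms=5000  # stop iterating if nothing arrives for 5 seconds
)
for message in consumer:
    print(message.value)  # {'hello': 'kafka'}
    break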

Why Use Kafka in a Django Real-Time Location Tracker?

In a real-time GPS tracking system:

  • Users (drivers) continuously send their current GPS location.

  • Other users (clients, admin) should see these updates in real time.

While Django Channels can handle WebSockets for real-time communication, adding Kafka brings several advantages:

  • Buffering: Kafka can act as a buffer between location producers and consumers.

  • Scalability: Kafka can handle thousands of concurrent location streams, and you can scale consumers independently of the Django app.

  • Decoupling: Producers and consumers are not directly connected, allowing more flexible and maintainable architecture.

Prerequisites

Before implementing, ensure you have:

  1. Kafka and Zookeeper running (locally or via Docker)

  2. Redis server installed and running (it backs the Django Channels channel layer)

  3. A Django project with channels and kafka-python installed:

pip install django channels channels-redis kafka-python
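
The Kafka consumer shown later uses group_send, which needs a Redis-backed channel layer configured in settings.py. A minimal sketch of the relevant settings, assuming a project named myproject (a placeholder) and Redis on its default port:

# settings.py (relevant parts only)
INSTALLED_APPS = [
    # ... your apps ...
    'channels',
]

ASGI_APPLICATION = 'myproject.asgi.application'  # "myproject" is a placeholder

CHANNEL_LAYERS = {
    'default': {
        'BACKEND': 'channels_redis.core.RedisChannelLayer',
        'CONFIG': {
            'hosts': [('127.0.0.1', 6379)],
        },
    },
}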

Kafka Setup Using Docker Compose

Kafka requires a metadata service for coordination. Traditionally, this has been Zookeeper. Newer Kafka releases also support KRaft mode, which removes the Zookeeper dependency, but in this example we’ll use the standard Docker Compose setup with Zookeeper.

# docker-compose.yml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.6.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - "2181:2181"

  kafka:
    image: confluentinc/cp-kafka:7.6.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

To run the containers:

docker-compose up -d
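
With the default broker settings, the driver_locations topic used later will normally be auto-created on first use, but you can create and inspect it explicitly from inside the container (assuming the service name kafka from the compose file above):

docker-compose exec kafka kafka-topics --bootstrap-server localhost:9092 \
  --create --topic driver_locations --partitions 1 --replication-factor 1

docker-compose exec kafka kafka-topics --bootstrap-server localhost:9092 --list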

How Your Real-Time Location Tracker Works (Overview)

  1. A user visits / and enters their name.

  2. Their browser starts sending their GPS coordinates to the backend via fetch() every 3 seconds.

  3. Django receives this location and sends it to Kafka (driver_locations topic).

  4. A Kafka consumer thread running inside Django consumes these messages.

  5. Based on driver_id, the location is broadcast to a Django Channels group.

  6. WebSocket clients listening for that driver receive the real-time location.

  7. Leaflet updates the marker on the map in real time.

Step-by-Step Code Explanation


1. home.html – Starting Point for the Driver

<input type="text" id="driverName" />
<button onclick="startTracking()">Start Sending Location</button>

<script>
function startTracking() {
    const driverId = document.getElementById('driverName').value;

    // Every 3 seconds, read the current GPS position and POST it to the Django API
    setInterval(() => {
        navigator.geolocation.getCurrentPosition((pos) => {
            fetch('/api/send-location/', {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({
                    driver_id: driverId,
                    latitude: pos.coords.latitude,
                    longitude: pos.coords.longitude
                })
            });
        });
    }, 3000);

    alert("Share /location/" + driverId);
}
</script>

This page allows a driver to:

  • Enter their name (driver_id)

  • Send their GPS location every 3 seconds using fetch() and navigator.geolocation


2. views.py – Kafka Producer (Sending to Kafka)

import json
import time

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from kafka import KafkaProducer

# Producer is created once at import time and reused for every request
producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    value_serializer=lambda v: json.dumps(v).encode('utf-8')
)

@csrf_exempt
def send_location(request):
    if request.method == 'POST':
        data = json.loads(request.body)
        data['timestamp'] = time.time()
        producer.send('driver_locations', data)
        producer.flush()
        return JsonResponse({'status': 'sent to Kafka'})
    return JsonResponse({'error': 'POST required'}, status=405)

Breakdown:

  • A KafkaProducer (from the kafka-python client) is initialized, pointing to the Kafka broker at localhost:9092.

  • The send_location view accepts POST requests, appends a timestamp, and sends it to the topic driver_locations.
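
You can sanity-check the endpoint without a browser. A quick test with curl, assuming the Django dev server is running on port 8000:

curl -X POST http://localhost:8000/api/send-location/ \
  -H "Content-Type: application/json" \
  -d '{"driver_id": "bisesh", "latitude": 27.7172, "longitude": 85.3240}'

A {"status": "sent to Kafka"} response means the message reached the producer.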


3. kafka_consumer.py – Kafka Consumer (Listening to Kafka)

import json

from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from kafka import KafkaConsumer


def start_kafka_consumer():
    consumer = KafkaConsumer(
        'driver_locations',
        bootstrap_servers='localhost:9092',
        value_deserializer=lambda m: json.loads(m.decode('utf-8'))
    )
    channel_layer = get_channel_layer()

    for message in consumer:
        data = message.value
        driver_id = data.get("driver_id")
        if not driver_id:
            continue

        # Forward each location to the Channels group for that driver
        async_to_sync(channel_layer.group_send)(
            f"location_{driver_id}",
            {
                "type": "send_location",
                "data": data
            }
        )

What’s happening here:

  • The consumer subscribes to driver_locations topic.

  • On receiving a message:

    • It retrieves the driver_id

    • Broadcasts the message to a Django Channels group location_<driver_id>.

We start this as a background thread from apps.py:

def ready(self):
    from .kafka_consumer import start_consumer_thread
    start_consumer_thread()

This ensures the consumer starts when Django starts.
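
The start_consumer_thread helper isn't shown in this post; a minimal sketch of how it could wrap start_kafka_consumer in a daemon thread (the guard avoids starting duplicate consumers if ready() runs more than once, e.g. under the auto-reloader):

# kafka_consumer.py (continued) -- hypothetical helper
import threading

_consumer_started = False

def start_consumer_thread():
    global _consumer_started
    if _consumer_started:
        return
    _consumer_started = True
    # Daemon thread so it does not block Django from shutting down
    threading.Thread(target=start_kafka_consumer, daemon=True).start()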


4. consumers.py – Django Channels WebSocket Handler

import json

from channels.generic.websocket import AsyncWebsocketConsumer


class LocationConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        self.driver_id = self.scope['url_route']['kwargs']['driver_id']
        self.group_name = f"location_{self.driver_id}"
        await self.channel_layer.group_add(self.group_name, self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard(self.group_name, self.channel_name)

    # Called for every "send_location" event broadcast to the group
    async def send_location(self, event):
        await self.send(text_data=json.dumps(event['data']))

Purpose:

  • When someone opens /location/<driver_id>/, they connect to this WebSocket.

  • They join a group named location_<driver_id>.

  • When a message is received from Kafka, it is forwarded to their WebSocket.


5. map.html – Viewing Live Location

<!-- Leaflet's CSS/JS are assumed to be loaded elsewhere on this page -->
<div id="map" style="height: 100vh;"></div>

<script>
const map = L.map('map').setView([0, 0], 2);
L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png').addTo(map);
let marker = null;

// Use wss:// on HTTPS pages and ws:// during local HTTP development
const scheme = window.location.protocol === 'https:' ? 'wss' : 'ws';
const socket = new WebSocket(`${scheme}://${window.location.host}/ws/location/{{ driver_id }}/`);

socket.onmessage = function(event) {
    const data = JSON.parse(event.data);
    const { latitude, longitude } = data;

    if (marker) {
        marker.setLatLng([latitude, longitude]);
    } else {
        marker = L.marker([latitude, longitude]).addTo(map);
    }

    map.setView([latitude, longitude], 18);
};
</script>

This WebSocket:

  • Connects to Django Channels

  • Receives driver location updates

  • Updates the marker position using Leaflet.js


6. WebSocket Routing (routing.py)

from django.urls import re_path
from .consumers import LocationConsumer

websocket_urlpatterns = [
    re_path(r'ws/location/(?P<driver_id>\w+)/$', LocationConsumer.as_asgi()),
]

Defines the URL pattern for WebSocket connections.
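
For this route to be reachable, the project's asgi.py has to send WebSocket traffic through these patterns. A minimal sketch, assuming a project named myproject and an app named tracker (both placeholders):

# asgi.py
import os

from django.core.asgi import get_asgi_application

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
django_asgi_app = get_asgi_application()

from channels.routing import ProtocolTypeRouter, URLRouter
from tracker import routing

application = ProtocolTypeRouter({
    'http': django_asgi_app,
    'websocket': URLRouter(routing.websocket_urlpatterns),
})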


7. Django URLs (urls.py)

from django.urls import path

from . import views

urlpatterns = [
    path('', views.home, name='home'),
    path('location/<str:driver_id>/', views.track_driver, name='track_driver'),
    path('api/send-location/', views.send_location),
]
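
The home and track_driver views referenced here aren't shown in the post; they only need to render the two templates. A minimal sketch, assuming the templates are named home.html and map.html:

from django.shortcuts import render

def home(request):
    return render(request, 'home.html')

def track_driver(request, driver_id):
    # map.html uses {{ driver_id }} to open the right WebSocket
    return render(request, 'map.html', {'driver_id': driver_id})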

🧪 Testing the Project

  1. Run Kafka + Zookeeper

  2. Start Redis

  3. Run Django app

  4. Open /, enter name (e.g., bisesh), and allow location sharing

  5. Open /location/bisesh/ in another tab to see live tracking
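
In command form, the first three steps might look like this (assuming a local Redis install; with Channels 4 you either add 'daphne' to INSTALLED_APPS or run Daphne directly):

docker-compose up -d                      # Kafka + Zookeeper
redis-server                              # channel layer backend
python manage.py runserver                # or: daphne myproject.asgi:application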

