Kafka quick start using Docker

Apache Kafka, a distributed streaming platform, is a powerful tool for building real-time data pipelines and streaming applications.

In this tutorial we will use Docker to run a single-node Kafka cluster, create a topic, and produce and consume messages from the command line.

Prerequisites

Make sure Docker and Docker Compose are installed on your system.
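You can confirm both tools are available by checking their versions (the exact output varies by installation):

```shell
# Verify the Docker CLI and the Compose CLI are on the PATH
docker --version
docker-compose --version
```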

First, create a docker-compose.yml file that defines the Kafka and ZooKeeper services:
version: '3'

services:
  zookeeper:
    image: wurstmeister/zookeeper:latest
    ports:
      - "2181:2181"

  kafka:
    image: wurstmeister/kafka:latest
    ports:
      - "9092:9092"
    expose:
      - "9093"
    environment:
      KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9093,OUTSIDE://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_LISTENERS: INSIDE://0.0.0.0:9093,OUTSIDE://0.0.0.0:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_CREATE_TOPICS: "topic_name:1:1"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
Run the compose file
docker-compose up -d

Since the services run inside Docker, you can verify that the containers are up with:

docker ps
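If the containers are not listed, the broker may have failed to start; inspecting the service logs usually shows why (a sketch, assuming the compose file above):

```shell
# Tail the Kafka broker logs to check that it started cleanly
docker-compose logs --tail=50 kafka

# ZooKeeper logs can be checked the same way
docker-compose logs --tail=50 zookeeper
```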

Create a Kafka topic

Run this command to create a topic inside the Kafka cluster:

docker exec -it <container-id-of-kafka> /opt/kafka/bin/kafka-topics.sh --create --zookeeper zookeeper:2181 --replication-factor 1 --partitions 1 --topic my-topic

A replication factor of 3 is typical in production, but since this cluster has only one broker, it must be 1 here.
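To confirm the topic was created, you can list and describe topics with the same script (assuming the container paths used in the commands above):

```shell
# List all topics known to the cluster
docker exec -it <container-id-of-kafka> /opt/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181

# Show partition count and replication details for one topic
docker exec -it <container-id-of-kafka> /opt/kafka/bin/kafka-topics.sh --describe --zookeeper zookeeper:2181 --topic my-topic
```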

Produce a message

Run this command to produce messages (type a message and press Enter to send it):

docker exec -it <container-id-of-kafka> /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my-topic

Consume messages

docker exec -it <container-id-of-kafka> /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --from-beginning
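The two console tools can also be combined non-interactively for a quick end-to-end check; this sketch pipes a single message in and reads it back (--max-messages and --timeout-ms are standard kafka-console-consumer options):

```shell
# Produce one message without opening an interactive prompt
# (note -i without -t, since input comes from the pipe)
echo "hello kafka" | docker exec -i <container-id-of-kafka> /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my-topic

# Read it back, exiting after one message or a 10-second timeout
docker exec -it <container-id-of-kafka> /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --from-beginning --max-messages 1 --timeout-ms 10000
```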

Watch Demo Video


Written by Harshit Tripathi