Produce and consume messages from a Kafka topic using Docker

Deen
3 min read · May 1, 2022

Hello Sudo! Docker is an easy way to run applications in containers, so in this article let us run an Apache Kafka server using Docker, and then produce and consume messages from inside the Docker container.

Copy the following and save it as a “docker-compose.yml” file in a local folder.

version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    container_name: kafka
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: localhost
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181

After saving the file, navigate to that folder and start Kafka inside Docker containers with the following command.

docker-compose up

If you want to run the Kafka server as a daemon (detached) container, run

docker-compose up -d
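When you are finished, the containers can be stopped and removed with the standard docker-compose command below. Note that topics and messages live inside the containers, so they are lost on removal unless you mount volumes:

```shell
docker-compose down
```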

To create and delete topics, we need a shell inside the Kafka container. With the Kafka server running through Docker, open a new terminal window and attach to the Kafka container using the following command.

docker exec -it kafka /bin/sh

Once inside the container, navigate to Kafka’s bin folder:

cd /opt/kafka/bin/

Your shell prompt will now show /opt/kafka/bin as the current working directory.

If you list the directory, you will see the shell scripts that ship with Kafka.

To create a topic execute

kafka-topics.sh --bootstrap-server localhost:9092 --topic myFirstTopic --create
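By default the topic is created with the broker’s default partition count and replication factor. As a sketch, you can set both explicitly; on this single-broker setup the replication factor cannot exceed 1:

```shell
kafka-topics.sh --bootstrap-server localhost:9092 --topic myFirstTopic --create \
  --partitions 3 --replication-factor 1
```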

To list all topics execute

kafka-topics.sh --list --bootstrap-server localhost:9092

To delete a topic execute

kafka-topics.sh --bootstrap-server localhost:9092 --delete --topic myFirstTopic

In order to produce messages on a particular topic, execute

kafka-console-producer.sh --bootstrap-server localhost:9092 --topic myFirstTopic

The producer then shows a “>” prompt where you can type messages.
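If you also want to attach a key to each message (keys become useful once topics have multiple partitions), the console producer supports key parsing through its standard properties. This is a sketch where each input line has the form key:value:

```shell
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic myFirstTopic \
  --property parse.key=true --property key.separator=:
```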

To consume messages from a particular topic, open a new terminal, get into the Docker container, change to the bin folder as before, and execute the following command.

kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic myFirstTopic --from-beginning
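As a small preview of group-ids (the subject of the next post), the console consumer can also join a named consumer group; Kafka then tracks that group’s offsets, so restarting the consumer resumes where it left off. The group name myGroup here is just an example:

```shell
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic myFirstTopic --group myGroup
```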

Now you can type messages in the producer terminal and they will appear in the consumer terminal. For example:

Producer:

> hello
> world

Consumer:

hello
world

Thanks for reading my first Medium post; your feedback is most welcome. Stay tuned for the next post, “Understanding topics, partitions and group-ids in Kafka”.
