Apache Kafka (Part 2)

CLI Implementation of Kafka

In the previous blog, we discussed the introduction, architecture, and use cases of Apache Kafka. One of the most common tasks when working with Kafka is producing and consuming messages from Kafka topics, so in this post we will dive into the practical implementation of Kafka using the CLI.

Getting Started with Kafka CLI

The Kafka CLI is a collection of shell scripts that interact with the Kafka server using the Kafka protocol. The scripts are located in the Kafka installation directory under the "bin" folder. Before using the Kafka CLI, you must download and install Kafka on your system.

Step 1: Start a Zookeeper service

As we know from the previous blog, ZooKeeper tracks the status of the nodes in the Kafka cluster and maintains cluster metadata such as the list of Kafka topics. So, first of all, we need to start the ZooKeeper service by running the command below from inside the Kafka folder.

$ bin/zookeeper-server-start.sh config/zookeeper.properties
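
If you do not want to keep a terminal occupied, the ZooKeeper script in most Kafka distributions also accepts a "-daemon" option to run the service in the background:

$ bin/zookeeper-server-start.sh -daemon config/zookeeper.properties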

Step 2: Start a Kafka broker service

After starting ZooKeeper, we need to start the Kafka server (this runs a Kafka broker) in another terminal:

$ bin/kafka-server-start.sh config/server.properties

This command starts the Kafka server and loads the configuration from the "config/server.properties" file.
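
If you want to change a setting without editing the properties file, kafka-server-start.sh in most Kafka distributions also accepts "--override" arguments. For example, to point the broker at a different log directory (the path below is only an illustration):

$ bin/kafka-server-start.sh config/server.properties --override log.dirs=/tmp/kafka-logs-1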

Step 3: Managing Kafka Topics

One of the primary use cases of the Kafka CLI is managing Kafka topics. Topics are used to organize data in Kafka, and each topic can be split into partitions that are distributed across the Kafka brokers in the cluster.

To create a new Kafka topic, you can use the following command:

$ bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic tano_topic

This command creates a new Kafka topic named "tano_topic" with one partition and a replication factor of one. The "--bootstrap-server" flag specifies the Kafka broker to connect to for topic creation.
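
To verify that the topic was created with the expected settings, you can describe it. The output lists the partition count, the replication factor, and the leader broker for each partition:

$ bin/kafka-topics.sh --describe --bootstrap-server localhost:9092 --topic tano_topic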

Step 4: Producing messages

To produce messages to a Kafka topic ("tano_topic"), you can use the following command:

$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic tano_topic

This opens a console where you can type messages to be produced to the "tano_topic" topic. Every new message is appended to the topic and assigned an increasing offset. Publishing messages to a topic is also known as writing events to a topic. (On newer Kafka versions, the console producer also accepts the "--bootstrap-server" flag in place of "--broker-list".)
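
By default, each line you type becomes the value of a message with no key. If you want to attach keys (which determine the partition a message lands in), the console producer supports per-run properties such as "parse.key" and "key.separator"; the key and value below are just an example:

$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic tano_topic --property parse.key=true --property key.separator=:

You would then type lines like "user1:hello", where everything before the ":" is treated as the key.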

Step 5: Consuming messages

To consume messages from a Kafka topic ("tano_topic"), you can use the following command:

$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic tano_topic --from-beginning

This opens a console where you can consume messages from the "tano_topic" topic starting from the beginning. You can also start consuming from a specific offset by specifying the "--offset" flag together with "--partition".
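
For example, to read messages starting from offset 5 of partition 0 (the offset and partition numbers here are arbitrary):

$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic tano_topic --partition 0 --offset 5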

You can list all the topics in a Kafka cluster using the following command:

$ bin/kafka-topics.sh --list --bootstrap-server localhost:9092
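
When a topic is no longer needed, it can be deleted with the same script (provided topic deletion is enabled on the broker):

$ bin/kafka-topics.sh --delete --bootstrap-server localhost:9092 --topic tano_topic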

Conclusion

In this blog, we explored the Kafka CLI and how it can be used to manage Kafka clusters. We covered the basics of managing Kafka topics, producing and consuming messages, and described some of the commonly used Kafka CLI commands.

The Kafka CLI is a powerful tool for managing Kafka clusters, and it is an essential tool for developers working with Kafka. By mastering the Kafka CLI, developers can easily manage Kafka topics, produce and consume messages, and troubleshoot Kafka clusters.
