How to Install Kafka on the Latest Kali Linux
Apache Kafka is a distributed event-streaming platform used to build real-time streaming data pipelines, and it runs on most operating systems. In this tutorial, we will learn how to install Kafka on the latest Kali Linux release.
Prerequisites
- The latest Kali Linux release
- Java 8 or later installed on your machine
- Kafka downloaded from https://kafka.apache.org/downloads (choose the latest stable release)
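Before proceeding, it is worth confirming the Java prerequisite. A minimal check might look like this (the `default-jre` package name is one common option on Debian-based systems such as Kali, not the only one):

```shell
# Confirm a suitable Java runtime is on the PATH (Kafka needs Java 8+).
if command -v java >/dev/null 2>&1; then
  java -version
else
  echo "Java not found; install one first, e.g.: sudo apt install default-jre"
fi
```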
Step 1: Extract the Kafka Archive
After downloading Kafka, navigate to the directory where it was downloaded and extract the archive by running:
tar -xzf kafka_<version>.tgz
Note: Replace <version> with the version string of the Kafka release you downloaded; the archive name includes both the Scala build version and the Kafka version.
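As a concrete illustration, with a hypothetical download of Kafka 3.7.0 built for Scala 2.13 (your file name will differ), the extraction step would look like:

```shell
# Hypothetical file name -- substitute the release you actually downloaded.
tar -xzf kafka_2.13-3.7.0.tgz
ls kafka_2.13-3.7.0
```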
Step 2: Navigate to the Kafka Directory
cd kafka_<version>
Step 3: Start ZooKeeper
Kafka uses ZooKeeper to store cluster metadata and coordinate brokers. (Note: recent Kafka releases, 3.3 and later, can instead run in KRaft mode without ZooKeeper, and Kafka 4.0 drops ZooKeeper support entirely; this tutorial assumes a ZooKeeper-based release.) Start ZooKeeper by running the following command:
bin/zookeeper-server-start.sh config/zookeeper.properties
This will start ZooKeeper, which listens on port 2181 by default. Leave this terminal running.
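The shipped config/zookeeper.properties file is intentionally minimal; in recent releases its defaults look roughly like this (exact contents may vary between versions):

```properties
# Directory where ZooKeeper snapshots are stored (temporary; lost on reboot).
dataDir=/tmp/zookeeper
# Port on which clients (the Kafka broker) connect.
clientPort=2181
# Disable the per-IP connection limit for this non-production setup.
maxClientCnxns=0
```

For anything beyond local experimentation, you would point dataDir at a persistent directory instead of /tmp.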
Step 4: Start Kafka Server
In a new terminal tab/window, navigate to the kafka_<version> directory and start the Kafka server by running the following command:
bin/kafka-server-start.sh config/server.properties
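A few settings in config/server.properties are worth knowing about; in recent releases the relevant defaults look roughly like this (exact contents may vary between versions):

```properties
# Unique ID of this broker within the cluster.
broker.id=0
# Where the broker stores its log segments (temporary; lost on reboot).
log.dirs=/tmp/kafka-logs
# Where to find ZooKeeper; must match the ZooKeeper started in Step 3.
zookeeper.connect=localhost:2181
# The broker listens on port 9092 by default (the listeners line is
# commented out in the stock file):
#listeners=PLAINTEXT://:9092
```

Leave this terminal running as well; the broker must stay up for the remaining steps.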
Step 5: Create a Kafka Topic
Kafka organizes streams of records into topics. To create a topic, run the following command:
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test
This will create a topic named test with one partition and a replication factor of 1.
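With the broker still running, you can confirm the topic was created. Both of these are standard kafka-topics.sh subcommands:

```shell
# List all topics on the broker; "test" should appear in the output.
bin/kafka-topics.sh --list --bootstrap-server localhost:9092
# Show partition count, replication factor, and leader for the topic.
bin/kafka-topics.sh --describe --topic test --bootstrap-server localhost:9092
```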
Step 6: Produce and Consume Messages
To produce messages to the topic, run the following command:
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test
This will start an interactive producer console: each line you type and submit with Enter is sent to the topic as a message.
To consume messages from the topic, run the following command in a new terminal tab/window:
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
This will start the consumer console and display all the messages produced to the test topic, including those sent before the consumer started (because of --from-beginning).
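For a quick non-interactive smoke test of the whole pipeline, you can pipe a message into the producer and ask the consumer to read exactly one message before exiting:

```shell
# Send a single message to the topic without opening an interactive console.
echo "hello kafka" | bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test
# Read at most one message, giving up after 10 seconds if none arrives.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test \
  --from-beginning --max-messages 1 --timeout-ms 10000
```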
Conclusion
Congratulations! You have successfully installed Kafka on Kali Linux and produced and consumed messages with it.