
Setup Apache Kafka Environment

Introduction

This article covers configuring and starting an Apache Kafka server on Windows and Linux. The guide also provides instructions for setting up Java and Apache Zookeeper, and after the setup we will create a simple pipeline to test our installation.

Kafka on Windows

Make sure you have the prerequisites in place before starting the installation steps: a Java installation (checked in step 2 below) and the Kafka binaries downloaded and unzipped.

For this tutorial, we assume that Kafka is unzipped on the C: drive and that the folder name is kafka_2.13-2.5.0.

Installation Steps:

  1. Start a Command Prompt and change the directory to your Kafka directory.
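For example, with the assumed install location:

cd C:\kafka_2.13-2.5.0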

2. Check whether Java is installed by running the "java -version" command.
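For example:

java -version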

3. Edit the Windows environment variables so that Kafka commands can be run from anywhere
a. Search for "environment variables" in the Windows search bar and open the environment variables dialog
b. Click on "Path" and then "Edit"
c. Add a new entry containing the path to the "bin\windows" folder inside your Kafka directory
d. Click OK

Then you can run Kafka commands such as kafka-topics.bat from any directory; an example of the entry to add is shown below.
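With the assumed install location, the entry to add would look like:

C:\kafka_2.13-2.5.0\bin\windows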

4. In the Kafka directory we will add a new folder named "data", and inside it we will create two more folders, one named "zookeeper" and the other named "kafka", to store the Zookeeper and Kafka metadata (see the commands below).
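From a Command Prompt in the Kafka directory, one way to create these folders (assuming the layout above) is:

mkdir data\zookeeper
mkdir data\kafka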

5. Now we will edit the Zookeeper properties to use the data folder. Go to your Kafka directory, then under the "config" directory first edit the "zookeeper.properties" file and change "dataDir" to dataDir=C:/kafka_2.13-2.5.0/data/zookeeper

Then we will edit the Kafka properties to use the data folder. In the same "config" directory, edit the "server.properties" file and change "log.dirs" to log.dirs=C:/kafka_2.13-2.5.0/data/kafka

6. Let's now start our Zookeeper server by running the below command in your Kafka directory

zookeeper-server-start.bat config/zookeeper.properties

You will find in the output that Zookeeper is running on port 2181.
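If you want to double-check that Zookeeper is listening, one option is:

netstat -an | findstr 2181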

7. Open another terminal and start your Kafka server by running the below command in your Kafka directory

kafka-server-start.bat config/server.properties

In the command output you will find KafkaServer id=0, and this is our broker id.
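As an optional check that the broker registered itself in Zookeeper, you can use the zookeeper-shell script that ships with Kafka:

zookeeper-shell.bat 127.0.0.1:2181 ls /brokers/ids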

8. Congratulations! Kafka is now installed successfully on your Windows machine.


Installing Kafka on Linux

For this tutorial, we assume that Kafka is unzipped in the home directory, which is /home/ec2-user/.
Your home directory path may be different; you can check it with the "pwd" command.

Installation Steps:

  1. Download and install Java by running
sudo yum install java-1.8.0-openjdk.x86_64 java-1.8.0-openjdk-devel.x86_64

2. Run the "java -version" command to make sure Java is correctly installed

3. Download Kafka with this command

sudo wget https://downloads.apache.org/kafka/2.5.0/kafka_2.12-2.5.0.tgz

4. If the file was downloaded to another location (for example a Downloads folder), move it to your current directory to start installing it

mv Downloads/kafka_2.12-2.5.0.tgz .

5. Extract the downloaded file by running

tar -xvf kafka_2.12-2.5.0.tgz

6. Edit the .bashrc file in your home directory to add the environment variable

nano .bashrc

and at the very bottom of this file add "export PATH=/home/ec2-user/kafka_2.12-2.5.0/bin:$PATH", then save and exit.
Then you can run Kafka commands such as kafka-topics.sh from any directory.
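To apply the change in your current shell without logging out and back in, you can reload the file:

source ~/.bashrc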

7. In the Kafka directory we will add a new folder named "data", and inside it we will create two more folders, one named "zookeeper" and the other named "kafka", to store the Kafka and Zookeeper metadata

cd kafka_2.12-2.5.0
mkdir data
mkdir data/zookeeper
mkdir data/kafka

8. Now go to the Kafka directory; under the "config" directory we will edit the Zookeeper properties to use the data folder, so open the zookeeper.properties file for editing

nano config/zookeeper.properties

Then we will change the "dataDir" property to "dataDir=/home/ec2-user/kafka_2.12-2.5.0/data/zookeeper/"

Then in the same "config" directory we will edit the server properties to use the data folder, so open the server.properties file for editing

nano config/server.properties

then we will change the "log.dirs" property to "log.dirs=/home/ec2-user/kafka_2.12-2.5.0/data/kafka/"

9. Start your Zookeeper server by running the below command in your Kafka directory; you will see in the output that Zookeeper is running on port 2181

zookeeper-server-start.sh config/zookeeper.properties

10. Open another terminal and start your Kafka server by running the below command in your Kafka directory. In the command output you will find KafkaServer id=0, and this is our broker id

kafka-server-start.sh config/server.properties

11. Congratulations! Kafka is now installed successfully on your Linux machine.

Creating topics, producers and consumers

So, let's test our environment. The commands below omit the script extension; on Windows use the .bat scripts (for example kafka-topics.bat) and on Linux the .sh scripts (for example kafka-topics.sh).

Creating topics

To create a topic you initially need four inputs:

  • Zookeeper address and port (which appeared when we started Zookeeper)
  • Topic name
  • Number of partitions
  • Replication factor

So, let's create our first topic

kafka-topics --zookeeper 127.0.0.1:2181 --topic first_topic --create --partitions 3 --replication-factor 1

You can list your topics with

kafka-topics --zookeeper 127.0.0.1:2181 --list

or show details about a specific topic by using the describe option

kafka-topics --zookeeper 127.0.0.1:2181 --topic first_topic --describe

and you can delete a topic by using the delete option

kafka-topics --zookeeper 127.0.0.1:2181 --topic second_topic --delete

Note: Windows has a known bug where deleting a topic can crash the Kafka server.

Produce and consume from your topic

Now we will create a simple pipeline consisting of one producer and one consumer to test our setup.

You can produce to your topic with this command

kafka-console-producer --broker-list 127.0.0.1:9092 --topic first_topic

Let's now create our consumer with the following command

kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic first_topic

Okay, well done! Now let's type "Hello world!" in our producer and press Enter.

Then, if we go to our consumer, we will find the "Hello world!" message we produced.

Interesting, isn't it?

We can read everything we have produced to our topic by using the from-beginning option with our consumer, as follows

kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic first_topic --from-beginning

And that's it! You can ask anything in the comments.

If you are not familiar with Kafka components, you can check our previous article about Apache Kafka Architecture and Components:

https://blog.datavalley.technology/2020/04/14/apache-kafka-components/