Monitor Apache Kafka messages using Elasticsearch/Logstash/Kibana stack

Set up an ELK dashboard to view your Apache Kafka messages.

What will we do?

In this article, we will learn how to set up the ELK stack to monitor Apache Kafka messages on the Kibana dashboard in real-time.

Please note that I am not covering how to install Apache Kafka or the ELK stack. I assume that both are already installed on your machine.


This tutorial is based on the following versions.

  • Apache Kafka — 2.8.0
  • Elasticsearch — 7.14.0
  • Logstash — 7.14.0
  • Kibana — 7.14.0

I am running this entire stack on macOS Catalina. The steps should be similar on most platforms.

Steps to follow

For simplicity, you can install everything on a single machine. Alternatively, you can run Kafka on a separate server, and even split the ELK components across individual servers.

We will need a Kafka topic. Any messages written to this topic will be sent to the ELK stack for monitoring.

Here, I am creating a topic called myTopic.

shashank@MBP ~> kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic myTopic

We will need to edit the Logstash configuration to tell it which topic to monitor.

For this, create a new file apache-kafka.conf (you can choose any name) and add the following contents.

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["myTopic"]
    codec => json
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
  geoip {
    source => "clientip"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

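To see roughly what the grok and date filters do with an Apache access-log line, here is a minimal Python sketch. The regex below is an illustrative approximation of the COMBINEDAPACHELOG pattern, not Logstash's actual grok definition, and the sample log line is made up.

```python
import re
from datetime import datetime

# Simplified approximation of the COMBINEDAPACHELOG grok pattern.
# Logstash's real pattern is more permissive; this is for illustration only.
COMBINED = re.compile(
    r'(?P<clientip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) \S+" '
    r'(?P<response>\d{3}) (?P<bytes>\d+|-)'
    r'(?: "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)")?'
)

# A made-up Apache combined-log line.
line = ('93.184.216.34 - - [10/Oct/2021:13:55:36 +0000] '
        '"GET /index.html HTTP/1.1" 200 2326 '
        '"http://example.com/" "Mozilla/5.0"')

m = COMBINED.match(line)
fields = m.groupdict()
print(fields["clientip"])   # the field the geoip filter would look up
print(fields["response"])

# The date filter's "dd/MMM/yyyy:HH:mm:ss Z" roughly corresponds to:
ts = datetime.strptime(fields["timestamp"], "%d/%b/%Y:%H:%M:%S %z")
print(ts.isoformat())
```

In the real pipeline, grok turns the raw `message` into named fields, date sets the event timestamp from the `timestamp` field, and geoip enriches `clientip` with location data.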
You can see how I have specified the bootstrap server & topic name. Since I am running Kafka locally, I used localhost. You have to change it to the IP address/hostname of your Kafka broker if it is running on a different server.

Now start Logstash by specifying the location of apache-kafka.conf.

shashank@MBP ~> logstash -f /usr/local/Cellar/logstash-full/7.14.0/libexec/config/apache-kafka.conf

Make sure Elasticsearch and Kibana are also running. On macOS, you can start them by typing…

shashank@MBP ~> elasticsearch
shashank@MBP ~> kibana

Now, produce Kafka messages so that they can be viewed on your Kibana dashboard. For this, enter the below command.

shashank@MBP ~> kafka-console-producer --topic myTopic --bootstrap-server localhost:9092

Now start typing your messages.

>I am sending a few messages to Kafka topic myTopic.
>You should see it on Kibana.
>Filter using Available fields to only see the messages.
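One caveat: because the config above sets `codec => json`, Logstash expects each message to be valid JSON; plain-text lines like the ones typed above will still be indexed, but tagged with `_jsonparsefailure`. As a sketch, you could format each message as a single JSON line before pasting it into the console producer. The field names below are arbitrary examples, not anything Logstash requires.

```python
import json

# Each line pasted into kafka-console-producer becomes one Kafka message,
# so the payload must be a single line of JSON. Field names are arbitrary.
event = {
    "message": "I am sending a few messages to Kafka topic myTopic",
    "level": "INFO",
    "service": "demo-producer",
}

line = json.dumps(event)
print(line)  # one line, safe to paste into the console producer
```

Logstash will then parse the JSON and index `message`, `level`, and `service` as separate fields instead of one opaque string.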

To view our messages, let’s open the browser & point to http://localhost:5601/.

Then, click on the hamburger menu on the left side & select Discover.

Kibana Dashboard home-page.

Now, select logstash-* as the Index pattern. If you don’t see logstash-*, then create one by navigating to http://localhost:5601/app/management/kibana/indexPatterns/create.

Once the logstash-* index is created, go back to the Discover link on the left side & select this index.

Hurray!! You should now see your Kafka messages here.

Apache Kafka messages on Kibana Dashboard.

You can also apply a filter by selecting message from the Available fields; just click the + icon next to it.

Now we can monitor all our Kafka messages.

Apache Kafka messages.

DevOps Architect, Music/Book/Photography/Fitness lover & Blogger