Kafka Hello World

Alex Woods

January 05, 2020


Apache Kafka is one of the most groundbreaking technologies we software developers get to work with these days. It’s used for streaming applications, as well as for piping data all throughout a system.

It’s likely to become ubiquitous in systems above a certain level of complexity. So, it’s hands down one of the best things you can spend time learning.

I intend to use this article as a snowball and grow this hello world example into a full Kafka Streams application, demonstrating the power and idiosyncrasies of Kafka and, especially, Kafka Streams.

Prerequisites

You will need to install Docker, as well as the Kafka Command Line Tools.
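
If you’re on a Mac, one convenient way to install the command line tools is with Homebrew (this assumes you already have Homebrew; any of the installation options linked above work just as well):

brew install kafka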

Step One: Run Kafka

To run Kafka we’ll need a Kafka broker and a Zookeeper instance. Kafka uses Zookeeper to manage the cluster of brokers, but there are plans to get away from it.

This step is simplified because we’ll use Docker. There’s a fantastic project that has already set up all the Docker Compose YAML for us, so we can run everything with a single command. Clone the project, and run:

docker-compose -f zk-single-kafka-single.yml up

This will run Kafka on port 9092. In a follow-up article we’ll fork this project and build a streaming application on top of it.
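
Before moving on, it’s worth a quick sanity check that both containers came up (this assumes you’re still in the cloned project’s directory):

docker-compose -f zk-single-kafka-single.yml ps

You should see a Zookeeper container and a Kafka container, both in the Up state.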

Step Two: Topics

In Kafka, a topic is the fundamental abstraction; it represents a stream of messages. It’s on a similar level to a table in a relational database. Systems that use Kafka are made up of many topics, and Kafka itself uses a lot of topics internally.

Imagine the real-time update of an Uber car moving across your phone as you’re waiting for a ride. That might be the result of a single Kafka message with the car’s GPS coordinates arriving at a topic, which then unleashes other messages, until eventually the data pipeline delivers an update to your phone.
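
You can see the topics in your local cluster with the command line tools (assuming the broker from step one is running on localhost:9092):

kafka-topics --bootstrap-server localhost:9092 --list

On a fresh cluster the list will be short, but Kafka’s own internal topics, like __consumer_offsets, will show up here as well once they exist.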

Create a Topic

The easiest way to quickly create and write to a topic is with the Kafka Command Line Tools (see the prerequisites above for installation).

kafka-console-producer --broker-list localhost:9092 --topic foo

We have to specify a list of brokers and the topic name, which is foo in this case. A shell will open up; type a message to put on the topic.

> hello world

You can hit ctrl+c to exit the shell.
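
If you want to play with this a bit more, the console producer can also send keyed messages. The parse.key and key.separator properties below are standard console producer options; using a colon as the separator is just a choice for this example:

kafka-console-producer --broker-list localhost:9092 --topic foo --property parse.key=true --property key.separator=:

> some-key:hello again

Everything before the colon becomes the message key, and everything after it becomes the value.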

Consume from a Topic

It’s the sophisticated chains of producers to topics to consumers (and very advanced consumers, like Kafka Streams) that show the real power of Kafka. But for now, we’ll start with the most basic consumer possible.

With the Kafka Command Line Tools, this is pretty simple.

kafka-console-consumer --bootstrap-server localhost:9092 --topic foo --from-beginning

hello world

It’ll sit there waiting for new messages, just like a normal Kafka consumer, so you’ll have to hit ctrl+c again to exit it.
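
If you produced keyed messages earlier, the console consumer can also print the keys alongside the values (print.key and key.separator are options of the console consumer’s default message formatter):

kafka-console-consumer --bootstrap-server localhost:9092 --topic foo --from-beginning --property print.key=true --property key.separator=:

Messages that were produced without a key will show up with a null key.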

Congratulations, you just wrote to and consumed from your first topic!
