Getting started with Kafka
Kafka is a popular open-source platform for building real-time data pipelines and streaming applications. In this section, we will learn how to get a basic Kafka environment running locally using docker-compose so that you can start building Kafka producers and consumers.
docker-compose is a tool for defining and running multi-container Docker applications. With Compose, you describe your application's services in a YAML file and then spin everything up with a single command, instead of running and connecting the containers manually. To run our Kafka cluster, we will define a set of nodes using docker-compose. First, create a folder called multinode (just to keep our code organized) and, inside it, create a new file called docker-compose.yaml. This is the default file that docker-compose expects in order to set up the containers (much as a Dockerfile is for Docker). To improve readability, we will not show the entire code (it is available at https://github...).
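To give a rough idea of what such a file contains, here is a minimal sketch of a two-broker cluster coordinated by a single ZooKeeper node. The image tags, listener names, and port numbers below are illustrative assumptions and may differ from the full configuration in the repository:

```yaml
# multinode/docker-compose.yaml -- illustrative sketch, not the book's exact file
version: '3'

services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0   # assumed image/version
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka-1:
    image: confluentinc/cp-kafka:7.4.0       # assumed image/version
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"                          # expose broker 1 to the host
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Two listeners: one for broker-to-broker traffic inside the Docker
      # network, one for clients connecting from the host machine.
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:29092,EXTERNAL://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka-1:29092,EXTERNAL://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
      # Only two brokers, so the offsets topic cannot use the default factor of 3.
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 2

  kafka-2:
    image: confluentinc/cp-kafka:7.4.0       # assumed image/version
    depends_on:
      - zookeeper
    ports:
      - "9093:9093"                          # expose broker 2 on a different host port
    environment:
      KAFKA_BROKER_ID: 2
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:29092,EXTERNAL://0.0.0.0:9093
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka-2:29092,EXTERNAL://localhost:9093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 2
```

With a file along these lines in place, running docker-compose up -d from inside the multinode folder starts ZooKeeper and both brokers in the background, and docker-compose down tears everything back down.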