Streaming from a database with Kafka Connect
In this section, we will read all data that is generated in a Postgres table in real time with Kafka Connect. First, it is necessary to build a custom image of Kafka Connect that can connect to Postgres. Follow these steps:
- Let’s create a different folder for this new exercise. First, create a folder named `connect` and another folder inside it named `kafka-connect-custom-image`. Inside the custom image folder, we will create a new Dockerfile with the following content:

```dockerfile
FROM confluentinc/cp-kafka-connect-base:7.6.0
RUN confluent-hub install --no-prompt confluentinc/kafka-connect-jdbc:10.7.5 \
    && confluent-hub install --no-prompt confluentinc/kafka-connect-s3:10.5.8
```
This Dockerfile is based on the Confluent Kafka Connect base image and installs two connectors: a JDBC source/sink connector and a sink connector for Amazon S3. The former is necessary to connect to a database, while the latter will be very handy for delivering...
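Once the custom image is built and the Connect worker is running, a JDBC source connector can be registered to stream rows from the Postgres table into a Kafka topic. The following is a minimal sketch of such a connector configuration — the database host, credentials, table name, incrementing column, and topic prefix are all placeholder assumptions to adapt to your setup:

```json
{
  "name": "postgres-source-connector",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://postgres:5432/mydb",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "table.whitelist": "my_table",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "postgres-"
  }
}
```

With `mode` set to `incrementing`, the connector polls the table and emits only rows whose `id` is greater than the last value it has seen, which is what gives us the "read new data in real time" behavior. The configuration can then be submitted to the Connect REST API (by default on port 8083), for example with `curl -X POST -H "Content-Type: application/json" --data @connector.json http://localhost:8083/connectors`.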