Practical lab
In these practical labs, we will work through common tasks that involve integrating Kafka and Spark to build Delta tables:
- Connect to Confluent Kafka from Spark and use Schema Registry to build a Spark job that ingests data from a topic and writes it to a Delta table.
- Use the Delta Sink Connector to finalize the table.
Your task is to write a Spark job that ingests events from a Confluent Kafka topic and writes them out to a Delta table.
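As a starting point, the skeleton below sketches one way such a job could look in PySpark. It is a minimal sketch, not a reference solution: the bootstrap server, topic name, credentials, event schema, and output paths are all hypothetical placeholders, and it parses JSON payloads with `from_json` for simplicity (with Schema Registry and Avro you would instead strip Confluent's 5-byte header from each message and decode with `from_avro`). Running it requires a `SparkSession` with the Kafka and Delta Lake packages on the classpath.

```python
def kafka_read_options(bootstrap_servers, topic, api_key, api_secret):
    """Build source options for reading a Confluent Cloud topic with Spark.

    Confluent Cloud uses SASL_SSL with the PLAIN mechanism, where the
    API key and secret act as username and password in the JAAS config.
    """
    jaas = (
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="{api_key}" password="{api_secret}";'
    )
    return {
        "kafka.bootstrap.servers": bootstrap_servers,
        "subscribe": topic,
        "startingOffsets": "earliest",
        "kafka.security.protocol": "SASL_SSL",
        "kafka.sasl.mechanism": "PLAIN",
        "kafka.sasl.jaas.config": jaas,
    }


def run(spark):
    """Stream events from Kafka into a Delta table (hypothetical names/paths)."""
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import LongType, StringType, StructField, StructType

    # Hypothetical event schema -- replace with the schema of your topic.
    event_schema = StructType([
        StructField("order_id", StringType()),
        StructField("amount", LongType()),
        StructField("ts", StringType()),
    ])

    opts = kafka_read_options(
        "pkc-xxxxx.confluent.cloud:9092",  # placeholder bootstrap server
        "orders",                          # placeholder topic
        "API_KEY", "API_SECRET",           # placeholder credentials
    )

    # Kafka delivers the payload as bytes in the `value` column;
    # cast to string and parse the JSON into typed columns.
    raw = spark.readStream.format("kafka").options(**opts).load()
    parsed = (
        raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
           .select("e.*")
    )

    # Append the parsed events to a Delta table; the checkpoint location
    # lets the stream resume from where it left off after a restart.
    return (
        parsed.writeStream
              .format("delta")
              .option("checkpointLocation", "/tmp/checkpoints/orders")
              .outputMode("append")
              .start("/tmp/delta/orders")
    )
```

The option-building step is kept as a separate pure function so the connection settings can be unit-tested without a running cluster, which is a useful habit when the Kafka credentials differ between environments.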