Reading data from Event Hubs for Kafka
In this recipe, you will learn how Event Hubs for Kafka provides a Kafka-compatible endpoint that can be used as a streaming source, and how Azure Databricks can consume data from Event Hubs for Apache Kafka.
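As a preview of what the consuming side can look like, the following is a minimal Structured Streaming sketch for reading from the Event Hubs Kafka endpoint on Azure Databricks. The namespace, event hub name, and connection string are placeholders assumed for illustration, not values from this recipe:

```python
# Minimal sketch: read from Event Hubs for Kafka with Structured Streaming on
# Azure Databricks (placeholders below are assumptions, not values from the recipe).
connection_string = "<EVENT_HUBS_CONNECTION_STRING>"

df = (
    spark.readStream
    .format("kafka")
    # Event Hubs exposes its Kafka endpoint on port 9093 of the namespace host.
    .option("kafka.bootstrap.servers",
            "<EVENT_HUBS_NAMESPACE>.servicebus.windows.net:9093")
    .option("subscribe", "<EVENT_HUB_NAME>")  # the event hub acts as the Kafka topic
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    # On Databricks the Kafka client classes are shaded, hence the kafkashaded prefix.
    .option(
        "kafka.sasl.jaas.config",
        'kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="$ConnectionString" password="{connection_string}";',
    )
    .option("startingOffsets", "latest")
    .load()
)

# The Kafka value column is binary; cast it to a string for downstream parsing.
events = df.selectExpr("CAST(value AS STRING) AS body")
display(events)
```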
Getting ready
Before starting, ensure that you have Contributor access to the subscription or are an Owner of the resource group, and that you have gone through the Technical requirements section:
- You can create an Event Hubs namespace and an event hub within that namespace by following the Microsoft documentation available at https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-create.
- You can use the Python script at https://github.com/PacktPublishing/Azure-Databricks-Cookbook/blob/main/Chapter04/PythonCode/Kafkaeventhub_producer.py, which pushes data to Event Hubs for Kafka as a streaming data producer (a minimal sketch of this producer logic follows this list).
- To run the Python script, you need the Event Hub connection string, which we will...
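For orientation, here is a minimal sketch of what such a producer can look like, assuming the kafka-python library and placeholder values for the namespace, event hub name, and connection string; the actual script in the repository may differ in its details:

```python
# Minimal sketch of a producer sending JSON events to Event Hubs for Kafka
# (assumes the kafka-python library; placeholders are not from the recipe).
import json
import ssl
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="<EVENT_HUBS_NAMESPACE>.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    # Event Hubs accepts "$ConnectionString" as the SASL username and the
    # Event Hubs connection string as the password.
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="<EVENT_HUBS_CONNECTION_STRING>",
    ssl_context=ssl.create_default_context(),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Send a small stream of JSON events, one per second.
for i in range(100):
    producer.send("<EVENT_HUB_NAME>", {"id": i, "ts": time.time()})
    time.sleep(1)

producer.flush()
producer.close()
```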