Streaming reads and writes to Delta tables
In this recipe, you will learn how to write streaming data coming from event hubs to Delta tables, and how to later read data from Delta tables as a streaming source that can be consumed by other downstream consumers. You will also learn how data from multiple event hubs can be written to the same Delta table.
Getting ready
Before starting, you need to ensure that you have Contributor access to the subscription or are the owner of the resource group.
- You can follow along by running the cells in the 6.2-Streaming Read & Writes to Delta Table notebook in the https://github.com/PacktPublishing/Azure-Databricks-Cookbook/tree/main/Chapter06/ folder.
- You should set up Azure Event Hubs for Kafka with two event hubs named eventhubsource1 and eventhubsource2. The following link has steps on how to create an event hub in Azure: https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-create
- You can use the Python script at https://github.com...
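Because the recipe reads from Event Hubs through its Kafka-compatible endpoint, it may help to see the connection options that Spark's Kafka source expects for such a namespace. This is a minimal sketch under stated assumptions, not the book's own code: the helper name `eventhubs_kafka_options`, the namespace `mynamespace`, and the connection string are hypothetical placeholders you would replace with your own values.

```python
def eventhubs_kafka_options(namespace: str, connection_string: str, topic: str) -> dict:
    """Build the options dict for reading an Event Hubs Kafka endpoint with Spark.

    Event Hubs exposes a Kafka endpoint on port 9093 using SASL_SSL with the
    PLAIN mechanism, where the username is the literal string "$ConnectionString"
    and the password is the namespace's connection string.
    """
    eh_sasl = (
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="$ConnectionString" password="{connection_string}";'
    )
    return {
        "kafka.bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        "subscribe": topic,  # the event hub name acts as the Kafka topic
        "kafka.security.protocol": "SASL_SSL",
        "kafka.sasl.mechanism": "PLAIN",
        "kafka.sasl.jaas.config": eh_sasl,
    }

# Example: options for the eventhubsource1 hub used in this recipe
# (namespace and key below are placeholders, not real credentials).
opts = eventhubs_kafka_options(
    "mynamespace",
    "Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...",
    "eventhubsource1",
)
# On a Databricks cluster these would be applied as:
# df = spark.readStream.format("kafka").options(**opts).load()
```

The same helper can be called once per event hub (eventhubsource1 and eventhubsource2) to create two streaming DataFrames that are then written to a single Delta table.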