Scaling resources
Let's look at how to scale resources in Event Hubs, ASA, and Azure Databricks Spark.
Scaling in Event Hubs
There are two ways in which Event Hubs supports scaling:
- Partitioning: We have already learned how partitioning can help scale our Event Hubs instance by increasing the parallelism with which event consumers can process data. Partitioning reduces contention when many producers and consumers access the hub concurrently, which, in turn, improves overall throughput.
- Auto-inflate: This is an automatic scale-up feature of Event Hubs. As usage increases, Event Hubs adds more throughput units to your namespace, thereby increasing its capacity. You can enable this feature if you have already maximized parallelism using the partitioning technique that we explored earlier, in the Processing across partitions section.
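To make the partitioning idea concrete, here is a minimal sketch of how events with the same partition key land on the same partition, so per-key ordering is preserved while different partitions are consumed in parallel. Note that this is an illustrative stable-hash scheme, not the actual hashing algorithm Event Hubs uses internally, and the partition count and sensor names are hypothetical:

```python
import hashlib

NUM_PARTITIONS = 4  # hypothetical partition count for our Event Hubs instance

def assign_partition(partition_key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a partition key to a partition index using a stable hash."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_partitions

# Simulated event stream: (partition key, payload)
events = [("sensor-1", 21.5), ("sensor-2", 19.0), ("sensor-1", 22.1)]

partitions: dict[int, list[tuple[str, float]]] = {}
for key, value in events:
    partitions.setdefault(assign_partition(key), []).append((key, value))

# Every event for "sensor-1" ends up in the same partition, in arrival order,
# while other partitions can be read by separate consumers in parallel.
```

Because each consumer reads only its own partitions, adding partitions lets you add consumers, which is exactly how partitioning increases processing parallelism.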
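Auto-inflate is typically enabled on the namespace, for example through the Azure CLI. The sketch below assumes a hypothetical resource group and namespace name, and caps automatic scale-up at 10 throughput units; check the current `az eventhubs` reference for your CLI version:

```shell
# Enable auto-inflate on an existing namespace (names are placeholders).
# --maximum-throughput-units caps how far Event Hubs may scale up.
az eventhubs namespace update \
    --resource-group my-resource-group \
    --name my-eventhubs-namespace \
    --enable-auto-inflate true \
    --maximum-throughput-units 10
```

With this in place, Event Hubs scales up automatically as load grows, but note that auto-inflate does not scale back down; you lower the throughput units yourself once the peak has passed.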
Next, let's explore the concept of throughput units.
What are throughput units?
Throughput units are units of capacity...