DStream best practices
- Setting the right batch interval is the most crucial tuning decision in Spark Streaming. The time it takes to process a batch should be less than the batch interval itself. Monitor the end-to-end delay of each batch; if the delays stay consistent and are comparable to the batch interval, the system can be considered stable. If batch processing time is consistently greater than the batch interval, unprocessed data queues up and the application will eventually run out of memory. You can use spark.streaming.receiver.maxRate to limit the rate at which each receiver ingests records (see the first sketch after this list).
- Transformations determine the amount of memory used by Spark Streaming. If you are maintaining a large key table with updateStateByKey, account for the memory that state requires (see the second sketch after this list).
- Each Spark receiver runs inside an executor and occupies one core. If you configure parallel reads with multiple receivers, make sure spark.cores.max is set with those receiver slots taken into account, so that cores remain available for processing (see the third sketch after this list).
- Spark Streaming groups the data a receiver collects into blocks, generating a new block every spark.streaming.blockInterval milliseconds (200 ms by default), so each batch contains roughly batch interval / block interval blocks. For example, with a 5-second batch interval and the default 200 ms block interval, each batch consists of 25 blocks, and each block becomes one task when the batch is processed (see the fourth sketch after this list).
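
The first sketch below illustrates the rate-limiting point: capping each receiver with spark.streaming.receiver.maxRate while choosing a batch interval. The socket source, host, port, and the 1,000 records-per-second cap are illustrative assumptions, not values from this text.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Cap each receiver at 1,000 records per second (illustrative value).
val conf = new SparkConf()
  .setAppName("RateLimitedStream")
  .set("spark.streaming.receiver.maxRate", "1000")

// 5-second batch interval; batch processing time should stay below this.
val ssc = new StreamingContext(conf, Seconds(5))

// Assumed socket source for illustration.
val lines = ssc.socketTextStream("localhost", 9999)
lines.count().print()   // monitor per-batch volume while tuning

ssc.start()
ssc.awaitTermination()
```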
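The second sketch shows a minimal updateStateByKey pipeline; the word-count logic and checkpoint path are assumptions for illustration. The point is that the entire key-to-state table is kept in executor memory across batches, which is why a large or growing key space must be budgeted for.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("StatefulWordCount")
val ssc = new StreamingContext(conf, Seconds(5))
ssc.checkpoint("/tmp/streaming-checkpoint")   // required for stateful transformations

// Running count per word; the whole key -> count table lives in memory,
// so the size of the key space directly drives memory usage.
def updateCount(newValues: Seq[Int], runningCount: Option[Long]): Option[Long] =
  Some(runningCount.getOrElse(0L) + newValues.sum)

val words = ssc.socketTextStream("localhost", 9999).flatMap(_.split(" "))
val counts = words.map((_, 1)).updateStateByKey(updateCount)
counts.print()

ssc.start()
ssc.awaitTermination()
```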
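The third sketch runs multiple receivers in parallel while leaving cores free for processing; the number of receivers, the ports, and the spark.cores.max value of 6 are assumed for illustration (2 receiver cores plus 4 processing cores).

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Each receiver occupies one core full-time, so size spark.cores.max
// to cover receiver slots plus cores left over for processing.
val conf = new SparkConf()
  .setAppName("MultiReceiverStream")
  .set("spark.cores.max", "6")

val ssc = new StreamingContext(conf, Seconds(5))

// Parallel reads: one receiver per input stream, then union them.
val streams = (1 to 2).map(i => ssc.socketTextStream("localhost", 9998 + i))
val unified = ssc.union(streams)
unified.count().print()

ssc.start()
ssc.awaitTermination()
```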
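The fourth sketch makes the batch interval / block interval arithmetic visible; the 250 ms block interval and the repartition target are assumptions chosen for illustration, not recommendations.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Blocks per batch = batch interval / block interval.
// With a 5 s batch and the 250 ms block interval below, each receiver
// produces 5000 / 250 = 20 blocks (hence ~20 tasks) per batch.
val conf = new SparkConf()
  .setAppName("BlockIntervalTuning")
  .set("spark.streaming.blockInterval", "250ms")

val ssc = new StreamingContext(conf, Seconds(5))
val lines = ssc.socketTextStream("localhost", 9999)

// If the resulting parallelism is still too low, repartition explicitly.
lines.repartition(20).count().print()

ssc.start()
ssc.awaitTermination()
```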