Hopping windows are tumbling windows that overlap. They let you express queries such as "every 5 minutes, give me the count of the sensor readings over the last 10 minutes." To make a hopping window behave the same as a tumbling window, set the hop size equal to the window size, as shown in the following diagram:
Stream Analytics
The following Stream Analytics example counts messages over a 10-minute window, emitting the count every 5 minutes:
SELECT System.Timestamp() AS WindowEnd, COUNT(*) AS Count
FROM DeviceStream TIMESTAMP BY CreatedAt
GROUP BY HoppingWindow(minute, 10, 5)
Spark
In PySpark, this is done with the window function. The following example groups a Spark DataFrame by a hopping window, producing a row in the resulting DataFrame every 5 minutes, each covering a 10-minute period:
from pyspark.sql.functions import window

# Count events per 10-minute window, with a new window starting every 5 minutes
windowedDF = eventsDF.groupBy(window("eventTime", "10 minutes", "5 minutes")).count()
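To see the hopping window end to end, here is a minimal sketch of a complete Structured Streaming job. The rate source, the rowsPerSecond setting, the renamed eventTime column, and the console sink are assumptions made for illustration; they are not part of the original example, which assumes an existing eventsDF stream:

from pyspark.sql import SparkSession
from pyspark.sql.functions import window

spark = SparkSession.builder.appName("HoppingWindowExample").getOrCreate()

# The rate source generates rows with a timestamp column; renaming it to
# eventTime stands in for the device stream used in the example above.
eventsDF = (
    spark.readStream.format("rate")
    .option("rowsPerSecond", 10)
    .load()
    .withColumnRenamed("timestamp", "eventTime")
)

# Count events per 10-minute window, sliding every 5 minutes
windowedDF = eventsDF.groupBy(window("eventTime", "10 minutes", "5 minutes")).count()

# Emit the running counts to the console; complete mode rewrites the
# full result table on every trigger
query = (
    windowedDF.writeStream
    .outputMode("complete")
    .format("console")
    .option("truncate", False)
    .start()
)
query.awaitTermination()

In a long-running job you would typically also call withWatermark on eventTime so that Spark can discard state for windows that are too old to receive further data.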