Multi-hop pipelines
A multi-hop pipeline is an architecture that chains a series of streaming jobs together so that each job in the chain progressively improves the quality of the data. A typical data analytics pipeline consists of multiple stages: data ingestion, data cleansing and integration, and data aggregation. These are often followed by data science and machine learning steps such as feature engineering, model training, and scoring. Each stage refines the data further until it is ready for end-user consumption.
With Structured Streaming, all of these stages can be chained together into a directed acyclic graph (DAG) of streaming jobs. New raw data continuously enters one end of the pipeline and is progressively processed by each stage, and end-user-ready data exits from the tail end. A typical multi...
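The hop-by-hop refinement can be illustrated without any Spark machinery. The plain-Python sketch below (the record shapes, stage names, and sample batch are all made up for illustration) chains three stages — ingest, cleanse, aggregate — into a linear DAG and pushes one micro-batch of raw records through them in order, so that each hop's output is the next hop's input.

```python
from collections import defaultdict

def ingest(raw_lines):
    """Hop 1: parse raw text lines into records (some may be malformed)."""
    records = []
    for line in raw_lines:
        parts = line.split(",")
        records.append({"user": parts[0],
                        "amount": parts[1] if len(parts) > 1 else None})
    return records

def cleanse(records):
    """Hop 2: drop malformed records and normalize types."""
    clean = []
    for r in records:
        if r["amount"] is None:
            continue
        try:
            clean.append({"user": r["user"].strip(),
                          "amount": float(r["amount"])})
        except ValueError:
            continue  # skip records whose amount isn't numeric
    return clean

def aggregate(records):
    """Hop 3: per-user totals, ready for end-user consumption."""
    totals = defaultdict(float)
    for r in records:
        totals[r["user"]] += r["amount"]
    return dict(totals)

def run_pipeline(micro_batch):
    # Each stage's output feeds the next: a linear DAG of hops.
    return aggregate(cleanse(ingest(micro_batch)))

batch = ["alice,10.0", "bob,oops", "alice,5.5", "carol"]
print(run_pipeline(batch))  # {'alice': 15.5}
```

In a real Structured Streaming deployment, each hop would instead be a streaming query that reads from the previous stage's output table and writes to its own, with the engine re-running the chain as new micro-batches of raw data arrive.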