Before writing code, let's review the project requirements for the stream processing application. Recall that BTC price events occur in the customer's web browser and are dispatched to Kafka via an HTTP event collector. Because these events are created in an environment outside Doubloon's control, the first step is to validate that incoming events have the correct structure. Defective events can produce bad data downstream (most data scientists agree that a great deal of time would be saved if input data arrived clean).
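As a rough sketch of the kind of structural validation this first step calls for, the snippet below checks that a raw message parses as JSON and carries the expected top-level fields. The field names and types here are illustrative assumptions, not the project's actual schema:

```python
import json

# Hypothetical schema for a BTC price event: these field names
# are assumptions for illustration, not the project's real schema.
REQUIRED_FIELDS = {
    "event": str,      # e.g. the event type seen in the browser
    "customer": dict,  # who saw the price
    "currency": dict,  # which currency was shown
}

def is_valid_event(raw: str) -> bool:
    """Return True if raw parses as a JSON object containing every
    required field with the expected type."""
    try:
        event = json.loads(raw)
    except json.JSONDecodeError:
        return False
    if not isinstance(event, dict):
        return False
    return all(
        name in event and isinstance(event[name], expected)
        for name, expected in REQUIRED_FIELDS.items()
    )

good = '{"event": "SEE_BTC_PRICE", "customer": {}, "currency": {}}'
bad = '{"event": 42}'
print(is_valid_event(good))  # True
print(is_valid_event(bad))   # False
```

In a stream application, a check like this would run on every message read from the input topic, with invalid events routed aside rather than passed downstream.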
Setting up the project
Getting ready
Putting it all together, the specification is to create a stream application that does the following:
- Reads individual events from a Kafka topic called raw...