All of these scenarios require intelligent vehicle health telemetry analysis, which you can implement using Azure Stream Analytics.
The Stream Analytics tools for Visual Studio help you author, build, and deploy real-time event-processing jobs on Azure. The tools also let you test a streaming job locally, using sample data as the job input, and monitor it through real-time job metrics, the job diagram view, and so on. This provides a complete development environment for implementing and deploying real-world Azure Stream Analytics jobs from Visual Studio.
1. After installing the Stream Analytics tools, a new Stream Analytics job can be created in Visual Studio.
2. Next, provide the job name, project, and solution location. From the Solution drop-down menu, you can also select Add to solution or Create new instance instead of Create new solution while creating the Stream Analytics job in Visual Studio:
3. Once the ASA job is created, the job topology folder structure can be viewed in Solution Explorer: Inputs (job inputs), Outputs (job outputs), JobConfig.json, Script.asaql (the Stream Analytics query file), Azure Functions (optional), and so on:
4. Next, provide the job topology data input and output event source settings by selecting Input.json and Output.json from the Inputs and Outputs directories, respectively.
5. For the Vehicle Telemetry Predictive Analysis demo using an Azure Stream Analytics job, we need two different job inputs. One should be a Stream type for the unbounded sequence of real-time events processed through Azure Event Hub, configured with the Event Hub policy name, policy key, event serialization format, and so on. The other should be a Reference type for the static vehicle model data stored in Azure Blob storage:
To assign the Stream Analytics query definition, select the Script.asaql file from the ASA project and specify the join operation between the data stream and the reference stream, writing the analyzed output to Blob storage as configured in the job properties.
For connected cars and real-time predictive Vehicle Telemetry Analysis, there is an opportunity for new solutions: within the next few years, cars can be shipped globally with the smart hardware required to connect to the internet. These embedded connections can define a vehicle's predictive telemetry health status, so automotive companies will be able to collect data on the performance of cars, send interactive updates and patches to a car's instrumentation remotely, and avoid equipment damage through precautionary measures and prior notification, using intelligent vehicle health telemetry analysis with Azure Stream Analytics.
The solution architecture of the Connected Car-Vehicle Telemetry Analysis case study used in this demo, with Azure Stream Analytics for real-time predictive analysis, is as follows:
2. To run the Stream Analytics job query locally, right-click on the ASA project in the Visual Studio Solution Explorer and choose Add Local Input:
3. Define the local input for each Event Hub data stream and the Blob storage reference data, and execute the job query locally before publishing it to Azure:
4. After adding the local input test data, you can test the Stream Analytics job query locally in the Visual Studio editor by clicking the Run Locally button in the top-left corner of the VS IDE:
Specify the following schema when designing the connected car streaming job query with Stream Analytics, using parameters such as the vehicle identification number (VIN), model, outside temperature, engine temperature, speed, fuel, tire pressure, and brake status, and defining an INNER JOIN between the Event Hub data stream and the Blob storage reference stream containing vehicle model information:
SELECT
    input.vin, BlobSource.Model, input.timestamp, input.outsideTemperature,
    input.engineTemperature, input.speed, input.fuel, input.engineoil,
    input.tirepressure, input.odometer, input.city, input.accelerator_pedal_position,
    input.parking_brake_status, input.headlamp_status, input.brake_pedal_status,
    input.transmission_gear_position, input.ignition_status, input.windshield_wiper_status,
    input.abs
INTO output
FROM input
JOIN BlobSource
    ON input.vin = BlobSource.VIN
The query can be further customized for complex event processing by defining windowing functions such as the Tumbling window, which groups the event stream into equal-length, non-overlapping series of events over a fixed time slice.
The following Vehicle Telemetry analytics query computes smart-car health index parameters over the complex event streams, aggregated at a specified two-second timestamp interval as fixed-length series of events:
SELECT
    BlobSource.Model, input.city, COUNT(vin) AS cars,
    AVG(input.engineTemperature) AS engineTemperature, AVG(input.speed) AS Speed,
    AVG(input.fuel) AS Fuel, AVG(input.engineoil) AS EngineOil,
    AVG(input.tirepressure) AS TirePressure, AVG(input.odometer) AS Odometer
INTO EventHubOut
FROM input
JOIN BlobSource
    ON input.vin = BlobSource.VIN
GROUP BY BlobSource.Model, input.city, TumblingWindow(second, 2)
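Building on the same windowed aggregation, a simple health-alert condition could be layered on top with a HAVING clause. The following is only a minimal sketch, not part of the original demo: the output alias AlertOut and the threshold values (an average engine temperature above 110 or an average tire pressure below 30) are hypothetical and would need to be tuned to the actual telemetry ranges and units:

SELECT
    BlobSource.Model, input.city, System.Timestamp AS WindowEnd,
    AVG(input.engineTemperature) AS EngineTemperature,
    AVG(input.tirepressure) AS TirePressure
INTO AlertOut -- hypothetical output alias bound to an alerting sink
FROM input
JOIN BlobSource
    ON input.vin = BlobSource.VIN
GROUP BY BlobSource.Model, input.city, TumblingWindow(second, 2)
-- hypothetical thresholds: flag two-second windows with an overheating engine or low tire pressure
HAVING AVG(input.engineTemperature) > 110 OR AVG(input.tirepressure) < 30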
5. The query can be executed locally or submitted to Azure. While running the job locally, a Command Prompt window appears showing the local Stream Analytics job's running status and the output data folder location:
6. If run locally, the job output is written to an ASALocalRun directory in the project's disk location, named with the current date and timestamp. It contains two output files, in .csv and .json formats respectively:
If the job is submitted to Azure from the Stream Analytics project in Visual Studio, it offers a rich job dashboard providing an interactive job diagram view, a job metrics graph, and errors (if any).
The Vehicle Telemetry Predictive Health Analytics job dashboard in Visual Studio provides a job diagram with real-time insights into events, with the display refreshed every 30 minutes at a minimum:
The Stream Analytics job metrics graph provides interactive insights into input and output events, out-of-order events, late events, runtime errors, and data conversion errors related to the job:
For Connected Car-Predictive Vehicle Telemetry Analytics, you may process the data input streams as complex events over a fixed timestamp interval in a non-overlapping mode, such as a Tumbling window over a two-second time slice. The output sink should be configured as a Service Bus Event Hub with 32 data partitions and a maximum message retention period of 7 days. The processed events written to the Event Hub output sink can also be archived to Azure Blob storage for long-term, infrequent access:
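The archival mentioned above is typically handled on the Event Hub side (for example, through its capture/archive capability). As an alternative sketch, assuming a second Blob storage output alias, here hypothetically named BlobArchive, were defined under the job's Outputs, the same windowed results could also be written to both sinks from the query itself using a WITH step and two SELECT ... INTO statements:

WITH AggregatedTelemetry AS (
    SELECT
        BlobSource.Model, input.city,
        AVG(input.engineTemperature) AS EngineTemperature,
        AVG(input.speed) AS Speed
    FROM input
    JOIN BlobSource
        ON input.vin = BlobSource.VIN
    GROUP BY BlobSource.Model, input.city, TumblingWindow(second, 2)
)
-- send the aggregated events to the Event Hub output sink for downstream consumers
SELECT * INTO EventHubOut FROM AggregatedTelemetry

-- also write the same events to a hypothetical Blob storage output alias for long-term archival
SELECT * INTO BlobArchive FROM AggregatedTelemetry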
The Azure Service Bus Event Hub job output metrics dashboard view configured for vehicle telemetry analysis is as follows:
Optionally, a complete clone of the Stream Analytics project can be generated by clicking the Generate Project icon in the top menu of the job dashboard.
This article is an excerpt from the book Stream Analytics with Microsoft Azure, written by Anindita Basak, Krishna Venkataraman, Ryan Murphy, and Manpreet Singh. The book provides lessons on real-time data processing for quick insights using Azure Stream Analytics.