Practical Real-time Data Processing and Analytics

IoT – thoughts and possibilities

The Internet of Things (IoT), a term coined in 1999 by Kevin Ashton, has become one of the most promising door openers of the decade. Although we had IoT precursors in the form of M2M and instrumentation control for industrial automation, the way IoT and the era of connected smart devices have arrived is something that has never happened before. The following figure gives you a bird's-eye view of the vastness and variety of IoT applications:

We are all surrounded by devices that are smart and connected; they have the ability to sense, process, transmit, and even act based on their processing. The age of machines that was science fiction a few years ago has become reality. I have connected vehicles that sense my keys and unlock or lock themselves as I walk towards or away from them. I have proximity-sensing beacons in my supermarket that detect how close I am to a shelf and flash offers to my cell phone. I have smart ACs that regulate the temperature based on the number of people in the room. My smart office saves electricity by switching off the lights and ACs in empty conference rooms. The list seems endless and is growing every second.

At its heart, IoT is nothing but an ecosystem of connected devices that have the ability to communicate over the internet. Here, a device/thing could be anything: a sensor device, a person with a wearable, a place, a plant, an animal, a machine; virtually any physical item you could think of on this planet can be connected today. There are predominantly seven layers to any IoT platform; these are depicted and described in the following figure:

The following is a quick description of the seven IoT application layers (a minimal code sketch of the data flow through them follows the list):

  • Layer 1: Devices, sensors, controllers, and so on
  • Layer 2: Communication channels, network protocols and network elements; the communication and routing hardware (telecom, Wi-Fi, and satellite)
  • Layer 3: Infrastructure, which could be in-house or on the cloud (public, private, or hybrid)
  • Layer 4: The big data ingestion layer, the landing platform where the data from things/devices is collected for the next steps
  • Layer 5: The processing engine that does the cleansing, parsing, massaging, and analysis of the data, using complex processing, machine learning, artificial intelligence, and so on, to generate insights in the form of reports, alerts, and notifications
  • Layer 6: Custom apps and pluggable secondary interfaces, such as visualization dashboards and downstream applications, form part of this layer
  • Layer 7: This layer has the people and processes that actually act on the insights and recommendations generated by the preceding layers
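
To make these layers a little more concrete, here is a purely illustrative Python sketch of a single reading travelling from a device (layer 1) through ingestion (layer 4) and processing (layer 5) to an insight. All names, fields, and the threshold rule are hypothetical and do not come from any specific platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class SensorReading:                      # layer 1: a device emits a reading
    device_id: str
    metric: str
    value: float
    ts: datetime

def ingest(reading: SensorReading) -> dict:
    """Layer 4: land the raw reading as a normalised record."""
    return {
        "device_id": reading.device_id,
        "metric": reading.metric,
        "value": reading.value,
        "ts": reading.ts.isoformat(),
    }

def analyse(record: dict, threshold: float = 30.0) -> Optional[dict]:
    """Layer 5: a trivial rule that turns a record into an insight (an alert)."""
    if record["metric"] == "temperature" and record["value"] > threshold:
        return {"alert": "high_temperature", **record}
    return None

if __name__ == "__main__":
    reading = SensorReading("ac-01", "temperature", 31.5, datetime.now(timezone.utc))
    print(analyse(ingest(reading)))       # layers 6/7 would visualise or act on this
```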

At an architectural level, a basic reference architecture of an IoT application is depicted in the following figure:

In the preceding figure, taking a bottom-up approach, the lowest layer consists of the devices: sensors, or sensors backed by computational units such as a Raspberry Pi or an Arduino. Communication and data transfer at this level are generally governed by lightweight options such as Message Queuing Telemetry Transport (MQTT) and the Constrained Application Protocol (CoAP), which are fast replacing legacy options such as HTTP. This layer works in conjunction with the aggregation or bus layer, which is essentially a Mosquitto broker and forms the event-transfer layer from the source, that is, from the device to the processing hub. Once we reach the processing hub, we have all the data at the compute engine, ready to spring into action, and we can analyse and process the data to generate useful, actionable insights. These insights are then exposed through web service API layers for consumption by downstream applications. Apart from these horizontal layers, there are cross-cutting layers that handle device provisioning and device management, as well as identity and access management.
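
As a rough illustration of the device-to-broker leg of this architecture, the following sketch publishes a JSON reading to a Mosquitto broker over MQTT using the Eclipse paho-mqtt Python client. The broker address, topic name, and device ID are assumptions made purely for the example.

```python
import json
import time

from paho.mqtt import publish                  # pip install paho-mqtt

BROKER_HOST = "localhost"                      # assumed Mosquitto broker address
TOPIC = "sensors/room-42/temperature"          # hypothetical topic name

def read_temperature() -> float:
    """Stand-in for a real sensor read (for example, from a Raspberry Pi)."""
    return 24.7

payload = json.dumps({
    "device_id": "rpi-room-42",                # hypothetical device ID
    "value": read_temperature(),
    "ts": time.time(),
})

# QoS 1 asks the broker to acknowledge the message (at-least-once delivery).
publish.single(TOPIC, payload, qos=1, hostname=BROKER_HOST, port=1883)
```

In this reference architecture, the processing hub would subscribe to the same topic on the broker to receive the event stream.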

Now that we understand the high-level architecture and layers of a standard IoT application, the next step is to understand the key aspects where an IoT solution is constrained, and what the implications are for the overall solution design:

  • Security: This is a key concern for the entire data-driven solution segment, but big data and devices connected to the internet make the whole system more susceptible to hacking and more vulnerable in terms of security. It is therefore a strategic concern that has to be addressed while designing the solution, at all layers, for data at rest and in motion.
  • Power consumption/battery life: We are devising solutions for devices, not human beings; thus, the solutions we design for them should have very low overall power consumption, without taxing or draining battery life.
  • Connectivity and communication: Devices, unlike humans, are always connected and can be very chatty. Here again, we need a lightweight protocol for overall communication and low-latency data transfer.
  • Recovery from failures: These solutions are designed to process billions of events in a self-sustaining, 24/7 mode. They should be built with the capability to diagnose failures, apply back pressure, and then self-recover from the situation with minimal data loss (a small back-pressure sketch follows this list). Today, IoT solutions are being designed to handle sudden spikes of data by detecting latency/bottlenecks and auto-scaling up and down elastically.
  • Scalability: The solutions need to be designed so that they scale linearly, without the base framework or design having to be re-architected, because this domain is exploding with an unprecedented and unpredictable number of connected devices and a whole plethora of future use cases that are just waiting to happen.
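
The following toy Python sketch illustrates the back-pressure idea mentioned above: a bounded queue sits between the producer (ingestion) and the consumer (processing), so a slow consumer naturally throttles the producer instead of being overwhelmed. The queue size and timings are arbitrary assumptions.

```python
import queue
import threading
import time

events = queue.Queue(maxsize=1000)        # bounded buffer = back pressure

def producer():
    """Ingestion side: put() blocks once the queue is full, throttling the source."""
    for i in range(10_000):
        events.put({"seq": i, "value": i * 0.1})
    events.put(None)                      # sentinel: no more data

def consumer():
    """Processing side: drains the queue at its own pace."""
    while True:
        item = events.get()
        if item is None:
            break
        time.sleep(0.0001)                # simulate per-event processing cost

threading.Thread(target=producer, daemon=True).start()
consumer()
```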

Next come the implications of the preceding constraints on the IoT application framework, which surface in the choice of communication channels, communication protocols, and processing adapters.

In terms of communication channel providers, the IoT ecosystem is evolving from telecom channels and LTE to options like:

  • Direct Ethernet/Wi-Fi/3G
  • LoRa
  • Bluetooth Low Energy (BLE)
  • RFID/Near Field Communication (NFC)
  • Medium-range radio mesh networks like Zigbee

For communication protocols, the de facto standard as of now is MQTT, and the reasons for its wide usage are evident:

  • It is extremely lightweight
  • It has a very low footprint in terms of network utilization, thus making communication fast and less taxing
  • It comes with a guaranteed delivery mechanism, ensuring that the data will eventually be delivered, even over fragile networks
  • It has low power consumption
  • It optimizes the flow of data packets over the wire to achieve low latency and a smaller footprint
  • It is a bi-directional protocol, and thus is suited both to transferring data from the device and to transferring data to the device (see the subscriber sketch after this list)
  • It is better suited to situations in which we have to transmit a high volume of short messages over the wire
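
To show the QoS-based guaranteed delivery and the bi-directional nature of MQTT in practice, here is a minimal subscriber sketch using the paho-mqtt client. The broker host and topic names are illustrative assumptions, and the "command" reply is purely hypothetical.

```python
from paho.mqtt import subscribe           # pip install paho-mqtt

def on_message(client, userdata, message):
    """Called for every message delivered on the subscribed topics."""
    print(f"{message.topic}: {message.payload.decode()} (qos={message.qos})")
    # Bi-directional: the same connection can publish a command back to the device.
    client.publish("commands/room-42", b"ack", qos=1)   # hypothetical command topic

# Blocks forever, invoking on_message with QoS 1 (at-least-once) delivery.
subscribe.callback(on_message, "sensors/#", qos=1, hostname="localhost")
```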

Edge analytics

Following the IoT evolution and revolution, edge analytics is another significant game changer. If you look at IoT applications, the data from the sensors and devices needs to be collated and has to travel all the way to the distributed processing unit, which is either on premises or in the cloud. This lift and shift of data leads to significant network utilization and makes the overall solution prone to transmission delays.

These considerations led to the development of a new kind of solution and, in turn, a new arena of IoT computation: edge analytics. As the name suggests, it is all about pushing the processing to the edge, so that the data is processed at its source.

The following figure shows the bifurcation of IoT analytics into:

  • Edge analytics
  • Core analytics

As depicted in the preceding figure, IoT computation is now divided into the following segments:

  • Sensor-level edge analytics: Data is processed and some insights are derived at the device level itself
  • Edge analytics: Data is processed and insights are derived at the gateway level
  • Core analytics: This flavour of analytics requires all the data to arrive at a common compute engine (distributed storage and distributed computation), where high-complexity processing is done to derive actionable insights for people and processes (a toy sketch of the edge/core split follows this list)
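
The following toy sketch illustrates the edge/core split described above: a gateway-side function aggregates raw readings locally and forwards only compact summaries (or anomalies) to the core engine. The forward_to_core function, the window size, and the threshold are placeholders, not part of any particular product; in practice the forwarding might publish to Kafka or an MQTT broker.

```python
from statistics import mean

WINDOW = 60              # readings per summary window (arbitrary assumption)
ALERT_THRESHOLD = 75.0   # hypothetical anomaly threshold

def forward_to_core(message: dict) -> None:
    """Placeholder transport to the core analytics engine."""
    print("-> core:", message)

def edge_process(readings: list) -> None:
    """Runs on the gateway: summarise locally, escalate only what matters."""
    for start in range(0, len(readings), WINDOW):
        window = readings[start:start + WINDOW]
        summary = {"count": len(window), "avg": mean(window), "max": max(window)}
        if summary["max"] > ALERT_THRESHOLD:
            summary["alert"] = "threshold_exceeded"
        forward_to_core(summary)          # one message instead of WINDOW raw readings

if __name__ == "__main__":
    edge_process([70.0 + (i % 10) for i in range(180)])
```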

Some of the typical use cases for sensor/edge analytics are:

  • Industrial IoT (IIoT): Sensors are embedded in various pieces of equipment, machinery, and sometimes even in shop floors. The sensors generate data, and the devices have the innate capability to process that data and generate alerts/recommendations to improve performance or yield.
  • IoT in health care: Smart devices can participate in edge processing and can contribute to the detection of early warning signs, raising alerts in appropriate medical situations
  • Wearable devices: In the world of wearables, edge processing can make tracking and safety very easy

Today, if I look around, my world is surrounded by connected devices, like my smart AC, my smart refrigerator, and my smart TV; they all send their data to a central hub or to my mobile phone and are easily controllable from there. Now, things are actually getting smart; they are evolving from being merely connected to being smart enough to compute, process, and predict. For instance, my coffee maker is smart enough to be connected to my car's traffic data and my office timings, so that it predicts my daily routine and arrival time and has hot, fresh coffee ready the moment I need it.
