
Edge computing - Trick or Treat?

  • 4 min read
  • 31 Oct 2018


According to IDC’s Digital Universe update, the number of connected devices is projected to grow to 30 billion by 2020 and 80 billion by 2025. IDC also estimates that the amount of data created and copied annually will reach 180 zettabytes (180 trillion gigabytes) by 2025, up from less than 10 zettabytes in 2015.

Thomas Bittman, vice president and distinguished analyst at Gartner Research, predicted in a session on edge computing at the recent Gartner IT Infrastructure, Operations Management and Data Center Conference: “In the next few years, you will have edge strategies; you’ll have to.” The prediction was consistent with a real-time poll conducted at the conference, in which 25% of the audience said they already use edge computing technology and more than 50% said they plan to implement it within two years.

How does Edge computing work?


2018 marked the era of edge computing, driven by the growing number of smart devices and the massive amounts of data they generate. Edge computing allows data produced by Internet of Things (IoT) devices to be processed near the edge of a user’s network. Instead of relying on the shared resources of large data centers in a cloud-based environment, edge computing places more demands on endpoint devices and on intermediary devices like gateways, edge servers and other new computing elements that together make up a complete edge computing environment.
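To make the idea concrete, here is a minimal, hypothetical sketch in Python (the sensor, thresholds and uplink below are assumptions for illustration, not any real deployment): an edge gateway aggregates raw readings locally and sends only a compact summary upstream, which is where the latency and bandwidth savings come from.

```python
import random
import statistics
import time

# Hypothetical sketch: an edge gateway summarises raw IoT sensor readings
# locally and forwards only compact aggregates upstream, instead of streaming
# every reading to a cloud data center.

READINGS_PER_BATCH = 60    # e.g. one reading per second for a minute
ALERT_THRESHOLD = 85.0     # arbitrary temperature threshold for the example


def read_sensor() -> float:
    """Stand-in for a real sensor driver; here it just simulates a temperature."""
    return random.uniform(60.0, 95.0)


def send_to_cloud(payload: dict) -> None:
    """Stand-in for an MQTT/HTTPS uplink a real gateway would publish to."""
    print("uplink:", payload)


def run_edge_loop(batches: int = 3) -> None:
    for _ in range(batches):
        batch = [read_sensor() for _ in range(READINGS_PER_BATCH)]
        summary = {
            "timestamp": time.time(),
            "mean": round(statistics.mean(batch), 2),
            "max": round(max(batch), 2),
            "alerts": sum(1 for r in batch if r > ALERT_THRESHOLD),
        }
        # Only this small summary crosses the network; raw readings stay at the edge.
        send_to_cloud(summary)


if __name__ == "__main__":
    run_edge_loop()
```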

Some use cases of Edge computing


The complex architecture of devices today demands a more comprehensive computing model to support its infrastructure. Edge computing caters to this need and reduces the latency, overhead and cost issues associated with centralized computing options like the cloud.

A good example of this is the launch of the world’s first digital drilling vessel, the Noble Globetrotter I, by London-based offshore drilling company Noble Drilling. The vessel uses data to create virtual versions of some of the key equipment on board. If the drawworks on this digitized rig begins to fail prematurely, information based on a “digital twin” of that asset notifies a team of experts onshore. The digital twin is a virtual model of the device that lives inside the edge processor and can point out tiny performance discrepancies that human operators may easily miss.

By keeping watch over all pertinent data on a dashboard, the onshore team can collaborate with the rig’s crew to plan repairs before a failure occurs.
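As a rough illustration of how such a check might look on an edge processor (the asset names, baseline numbers and tolerance below are invented for the sketch, not Noble’s actual system), the idea is simply to compare live telemetry against the twin’s expected values and flag small deviations:

```python
# Hypothetical sketch of a digital-twin style check at the edge: live telemetry
# is compared against the twin's expected values, and deviations beyond a
# tolerance are flagged to an onshore team. All numbers and names are invented.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class TwinBaseline:
    expected_torque_nm: float
    expected_vibration_mm_s: float
    tolerance: float  # fractional deviation considered normal


def discrepancy(measured: float, expected: float) -> float:
    """Fractional deviation of a live reading from the twin's prediction."""
    return abs(measured - expected) / expected


def check_asset(telemetry: Dict[str, float], baseline: TwinBaseline) -> List[str]:
    alerts = []
    for key, expected in (
        ("torque_nm", baseline.expected_torque_nm),
        ("vibration_mm_s", baseline.expected_vibration_mm_s),
    ):
        dev = discrepancy(telemetry[key], expected)
        if dev > baseline.tolerance:
            alerts.append(f"{key} deviates {dev:.1%} from digital twin baseline")
    return alerts


if __name__ == "__main__":
    drawworks_twin = TwinBaseline(expected_torque_nm=1200.0,
                                  expected_vibration_mm_s=2.5,
                                  tolerance=0.05)
    live = {"torque_nm": 1190.0, "vibration_mm_s": 2.9}
    for alert in check_asset(live, drawworks_twin):
        print("notify onshore team:", alert)
```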

Noble believes that this move towards edge computing will lead to more efficient, cost-effective offshore drilling. By predicting potential failures in advance, Noble can avert breakdowns and spare the expense of replacing or repairing equipment.

Another piece of news that caught our attention was Microsoft’s $5 billion investment in IoT to empower the intelligent cloud and the intelligent edge. Azure Sphere is one of Microsoft’s intelligent edge solutions to power and protect connected microcontroller unit (MCU)-powered devices. MCU-powered devices run everything from household stoves and refrigerators to industrial equipment, and considering that around 9 billion MCU-powered devices ship every year, we need all the help we can get on the security front!

That’s the intelligent edge for you on the consumer end of the application spectrum.
2018 also saw progress in edge computing tools and solutions across the board, from hardware to software.

Take, for instance, OpenStack Rocky, the latest release of one of the most widely deployed open source cloud infrastructure platforms. It is designed to accommodate edge computing requirements by deploying containers directly on bare metal. OpenStack Ironic brings improved management and automation capabilities to bare metal infrastructure, so users can manage physical machines just like they manage VMs, especially with the new Ironic features introduced in Rocky.
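A minimal sketch of that “manage physical infrastructure like VMs” idea, using the openstacksdk Python client (the cloud name, flavor, image and network IDs below are placeholders, and details will vary by deployment):

```python
# Sketch only: with Ironic behind the scenes, booting a bare-metal machine uses
# the same Compute API call as booting a virtual machine; the flavor determines
# whether the request lands on a hypervisor or on a physical node.

import openstack

conn = openstack.connect(cloud="my-edge-cloud")  # reads credentials from clouds.yaml

server = conn.compute.create_server(
    name="edge-baremetal-node",
    flavor_id="baremetal-flavor-id",             # placeholder: flavor mapped to Ironic nodes
    image_id="deploy-image-id",                  # placeholder deploy image
    networks=[{"uuid": "provisioning-net-id"}],  # placeholder network
)
server = conn.compute.wait_for_server(server)
print(server.status)
```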

Intel’s OpenVINO computer vision toolkit is yet another example of using edge computing to help developers streamline deep learning inference and deploy high-performance computer vision solutions across a wide range of use cases. Baidu, Inc. released the Kunlun AI chip, built to handle AI models both for edge computing on devices and in the cloud via data centers.
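For a sense of what inference at the edge looks like with OpenVINO, here is a minimal sketch using the OpenVINO Runtime Python API as it exists today (the API has evolved since the toolkit’s 2018 release); the model path and input data are placeholders:

```python
# Sketch: run a pre-converted IR model with the OpenVINO Runtime Python API.
# "model.xml" and the dummy input are placeholders for a real model and image.

import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")               # placeholder IR model path
compiled_model = core.compile_model(model, "CPU")  # "CPU", "GPU", or other edge targets

input_layer = compiled_model.input(0)
output_layer = compiled_model.output(0)

# Placeholder input: a random tensor matching the model's expected input shape.
dummy_input = np.random.rand(*input_layer.shape).astype(np.float32)

results = compiled_model([dummy_input])[output_layer]
print("top prediction index:", int(np.argmax(results)))
```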


Edge computing - Trick or Treat?


However, edge computing does come with disadvantages, like the steep cost of deploying and managing an edge network, security concerns, and the operational overhead of running many distributed sites.
The final verdict: edge computing is definitely a treat when complemented by embedded AI, enhancing networks to make analysis more efficient and to improve security for business systems.

Intelligent Edge Analytics: 7 ways machine learning is driving edge computing adoption in 2018

Ubuntu 18.10 ‘Cosmic Cuttlefish’ releases with a focus on AI development, multi-cloud and edge deployments, and much more!