Edge data centers using K3s and basic edge computing concepts

With the evolution of the cloud, companies and organizations are starting to migrate their processing tasks to edge computing devices, aiming to reduce costs and get more value from the infrastructure they pay for. As part of this book’s introductory content, we need to learn the basic concepts related to edge computing and understand why we use K3s for it. So, let’s get started with the basic concepts.

The edge and edge computing

According to Qualcomm and Cisco, the edge can be defined as “anywhere where data is processed before it crosses the Wide Area Network (WAN)”. That is the edge, but what is edge computing? A post by Eric Hamilton from Cloudwards.net defines edge computing as “the processing and analyzing of data along a network edge, closest to the point of its collection, so that data becomes actionable.” In other words, edge computing means processing data near its source and distributing the computation across different places, using devices that are close to where the data is generated.

To add more context, let’s explore the next figure:

Figure 1.1 – Components of edge layers

This figure shows how data is processed in different contexts, which are the following:

  • Cloud layer: In this layer, you can find the cloud providers, such as AWS, Azure, GCP, and more.
  • Near edge: In this layer, you can find telecommunications infrastructure and devices, such as 5G networks, radio virtual devices, and similar devices.
  • Far edge: In this layer, you will find edge clusters, such as K3s clusters, or devices that exchange data between the cloud and the edge layers; this layer can itself be subdivided into the tiny edge layer.
  • Tiny edge: In this layer, you will find sensors and end-user devices that exchange data with processing devices and edge clusters on the far edge.

Important Note

Remember that edge computing refers to data that is processed on edge devices before the result goes to its destination, which could be on a public or private cloud.

Other important concepts to consider for building edge clusters are the following:

  • Fog computing: An architecture of cloud services that distributes the system across near edge and far edge devices; these devices can be geographically dispersed.
  • Multi-Access Edge Computing (MEC): This distributes computing at the edge of larger networks, with low latency and high bandwidth; it evolved from mobile edge computing, meaning the processing uses telecom networks and mobile devices.
  • Cloudlets: This is a small-scale cloud data center, which could be used for resource-intensive use cases, such as data analytics, Machine Learning (ML), and so on.

Benefits of edge computing

With this short explanation behind us, let’s move on to the main benefits of edge computing, some of which include the following:

  • Reducing latency: Edge computing runs heavy processing on edge devices close to where the data is generated, reducing the time it takes to deliver results.
  • Reducing bandwidth: Edge computing keeps part of the data on the edge devices, reducing the traffic that crosses the network and therefore the bandwidth used.
  • Reducing costs: Lower latency and bandwidth usage translate into lower operational costs, which is one of the most important benefits of edge computing.
  • Improving security: Edge computing uses data aggregation and data encryption algorithms to improve the security of data access.

Let’s now discuss containers, Docker, and containerd.

Containers, Docker, and containerd for edge computing

In the last few years, container adoption has increased because of the success of Docker, which has been the most popular container engine during that time. Container technology gives businesses a way to design applications using a microservices architecture; in this way, companies speed up their development and their strategies for scaling applications. So, to begin with a basic concept: a container is a small runtime environment that packages your application with all the dependencies it needs to run. This concept is not new, but Docker, a container engine, popularized it. In simple words, Docker uses small operating system images that contain the necessary dependencies to run your software. This can be called operating system virtualization. Under the hood, it uses the cgroups kernel feature of Linux to limit CPU, memory, network, I/O, and so on for your processes. Other operating systems, such as Windows or FreeBSD, use similar features to isolate processes and create this type of virtualization. The following figure represents these concepts:

Figure 1.2 – Containerized applications inside the OS

This figure shows that a container doesn’t depend on special features such as a hypervisor, which is commonly seen in the hardware virtualization used by VMware, Hyper-V, and Xen; instead, the application runs as a binary inside the container and reuses the host kernel. You could say that running a container is almost like running a binary program inside a directory, but with some resource limits added, using cgroups in the case of Linux containers.
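As a minimal sketch of these resource limits (assuming Docker is installed on a Linux host; the container name limited-nginx is just an example), the following commands start a container that cgroups constrain to half a CPU core and 128 MiB of memory:

    # Start an nginx container limited to 0.5 CPU cores and 128 MiB of RAM;
    # on Linux, Docker enforces these limits through cgroups
    docker run -d --name limited-nginx --cpus="0.5" --memory="128m" nginx:alpine

    # Show the limits recorded for the container (CPU in nano-CPUs, memory in bytes)
    docker inspect limited-nginx --format '{{.HostConfig.NanoCpus}} {{.HostConfig.Memory}}'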

Docker implements all these abstractions. It is a popular container toolchain that adds versioning features similar to Git, which was one of the main reasons it became so popular: it offers easy portability and versioning at the operating system level. At the moment, containerd is the container runtime used by both Docker and Kubernetes to create containers. In general, containerd lets you create containers without extra features; it is very optimized. With the explosion of edge computing, containerd has become an important piece of software for running containers in low-resource environments.
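To get a feel for how containerd works on its own, here is a minimal sketch using its bundled ctr client (the container ID hello-edge is only an example, and the commands assume containerd is already running on the host):

    # Pull a small image directly through containerd
    sudo ctr images pull docker.io/library/alpine:latest

    # Run it as a short-lived container, print a message, and clean up
    sudo ctr run --rm docker.io/library/alpine:latest hello-edge echo "hello from containerd"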

In general, with all these technologies you can do the following:

  • Standardize how to package your software.
  • Bring portability to your software.
  • Maintain your software in an easier way.
  • Run applications in low-resource environments.

So, Docker and containerd must be taken into consideration as important pieces of software for building edge computing systems in low-resource environments.
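As a hedged illustration of that packaging and portability (myapp and its Dockerfile are hypothetical, and cross-building for ARM assumes docker buildx with emulation is set up), you could build an image for a 64-bit ARM edge device and move it there as a plain file:

    # Build the image for a 64-bit ARM edge device from the current directory
    docker buildx build --platform linux/arm64 -t myapp:1.0 --load .

    # Export the image to a tar file so it can be copied to the device...
    docker save myapp:1.0 -o myapp-1.0.tar

    # ...and load it on the edge device, with no registry required
    docker load -i myapp-1.0.tar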

Distributed systems, edge computing, and Kubernetes

In the last decade, distributed systems evolved from multi-node clusters running monolithic applications to multi-node clusters running microservices architectures. One of the first options for building microservices is to use containers, but once the system needs to scale, an orchestrator becomes necessary. This is where Kubernetes comes into play.

As an example, let’s imagine an orchestra with lots of musicians playing the piano, trumpets, and so on. If the orchestra were disorganized, what would you need to organize all the musicians? The answer is an orchestra director, or an orchestrator. This is where Kubernetes appears: each musician is a container that needs to communicate with or listen to the other musicians and, of course, follow the instructions of the orchestra director, or orchestrator. In this way, all the musicians can play their instruments at the right time and sound beautiful.

This is what Kubernetes does; it is an orchestrator of containers, but at the same time it is a platform with all the necessary prebuilt pieces to build your own distributed system, ready to scale and designed with best practices that can help you to implement agile development and a DevOps culture. Depending on your use case, sometimes it’s better to use something small such as Docker or containerd, but for complex or demanding scenarios, it’s better to use Kubernetes.
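As a small, hedged taste of that orchestration (assuming kubectl is already pointed at a cluster, and using nginx purely as an example image), you describe a desired state and Kubernetes keeps it:

    # Ask Kubernetes to run three replicas of an nginx web server
    kubectl create deployment web --image=nginx --replicas=3

    # See where the orchestrator scheduled the containers across the nodes
    kubectl get pods -o wide

    # Scaling is a one-line request; Kubernetes reconciles the rest
    kubectl scale deployment web --replicas=5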

Edge clusters using K3s – a lightweight Kubernetes

Now, the big question is how to start building edge computing systems. Let’s get started with K3s. K3s is a certified Kubernetes distribution created by Rancher Labs. By default, K3s leaves out extra features that are not essential to running Kubernetes, but they can be added later. K3s uses containerd as its container engine, which gives it the ability to run in low-resource environments on ARM devices. Of course, you can also run K3s on x86_64 devices in production environments. However, for the purposes of this book, we will use K3s as our main piece of software to build edge computing systems using ARM devices.

Talking about clusters at the edge, K3s offers the same power as Kubernetes in a small, optimized package, plus some features designed especially for edge computing systems. K3s is very easy to use compared with other Kubernetes distributions. It’s a lightweight Kubernetes that can be used for edge computing, sandbox environments, or whatever you want, depending on your use case.
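To show how little it takes to get started, here is the standard single-node installation from the K3s project (run on a Linux host; exact output and defaults may vary by version):

    # Install K3s as a single-node server using the official installation script
    curl -sfL https://get.k3s.io | sh -

    # K3s bundles kubectl, so you can check the node right away
    sudo k3s kubectl get nodes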

Edge devices using ARM processors and micro data centers

Now, it’s time to talk about edge devices and ARM processors, so let’s begin with edge devices. Edge devices are designed to process and analyze information near the data source; this is where the edge computing mindset comes from. When it comes to low-energy devices, x86 (Intel) processors consume more energy and run hotter than ARM processors, which means more power and more cooling; in other words, you will pay more money to run x86_64 processors. ARM processors, on the other hand, have less computational power but consume less energy. That’s the reason for the success of ARM processors in smartphones: they offer a better trade-off between processing power and energy consumption compared to Intel processors.

Because of that, companies are interested in designing micro data centers that use ARM processors in their servers. For the same reason, companies are starting to migrate their workloads to devices with ARM processors. One example is AWS Graviton2, an ARM-based processor that powers Amazon EC2 cloud instances.
