IoT Edge Computing with MicroK8s


Scaling the application deployment

You scale a Deployment by changing the number of replicas it specifies. When a Deployment is scaled out, new pods are created and scheduled onto nodes with available resources, bringing the number of pods up to the new target state. Kubernetes also supports autoscaling pods, as well as scaling to zero, which terminates all pods in a given Deployment.
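
For example, a Deployment can be scaled imperatively with kubectl (the deployment name my-app here is a placeholder, not one from the book):

kubectl scale deployment my-app --replicas=4     # scale out to four pods
kubectl scale deployment my-app --replicas=0     # scale to zero, terminating all pods
kubectl autoscale deployment my-app --min=1 --max=5 --cpu-percent=80   # autoscale on CPU usage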

Running many instances of an application requires a way to distribute traffic among them. A Service object provides a built-in load balancer that distributes network traffic across all pods of an exposed Deployment. The Service uses Endpoints to continuously monitor the running pods, ensuring that traffic is only directed to pods that are available.
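
As a minimal illustration of such a Service (the name, label, and ports are assumptions for this sketch, not values from the book), the following manifest exposes a Deployment whose pods carry the label app: my-app and load-balances traffic across them:

apiVersion: v1
kind: Service
metadata:
  name: my-app-service
spec:
  type: NodePort        # expose the Service on a port of every node
  selector:
    app: my-app         # route traffic to pods carrying this label
  ports:
    - port: 80          # port the Service listens on
      targetPort: 8080  # port the application container listens on

Creating the Service with kubectl apply -f service.yaml gives the pods a single stable address; the Service's Endpoints track which pods are ready to receive traffic.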

In the following example, we'll use a new YAML file to increase the number of pods in the Deployment. The replicas field is set to 4 in this file, indicating that the Deployment should run four pods:

apiVersion...
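
The full manifest is not shown here, but a Deployment file with replicas set to 4 would typically look like the following sketch (the name my-app and the nginx image are illustrative assumptions, not the book's actual file):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 4                 # target state: four pods
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: nginx:1.25   # placeholder container image
          ports:
            - containerPort: 80

Applying it with kubectl apply -f deployment.yaml and then running kubectl get pods should show four pods for the Deployment.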