Summary
In this chapter, we covered the hot topic of serverless computing. We explained the two meanings of serverless – eliminating the need to manage servers, and deploying and running functions as a service (FaaS). We explored serverless infrastructure in the cloud in depth, especially in the context of Kubernetes, and compared the built-in cluster autoscaler as a Kubernetes-native serverless solution with the offerings of cloud providers such as AWS EKS+Fargate, Azure AKS+ACI, and Google Cloud Run.

We then switched gears and dove into the exciting and promising Knative project, with its scale-to-zero capabilities and advanced deployment options. From there, we moved to the wild world of FaaS on Kubernetes. We mentioned the plethora of available solutions and examined some of the prominent ones in detail with hands-on experiments: Fission, Kubeless, and riff. The bottom line is that both flavors of serverless computing bring real benefits in terms of operations and cost management.
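As a quick reminder of what Knative's scale-to-zero model looks like in practice, here is a minimal Knative Service sketch. It assumes Knative Serving is installed in the cluster; the service name, sample image, and scale bounds are illustrative assumptions, not listings from this chapter.

# Minimal Knative Service (sketch). The name, image, and scale annotations
# below are illustrative only.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello            # hypothetical service name
spec:
  template:
    metadata:
      annotations:
        # Allow the revision to scale down to zero pods when idle,
        # and cap it at five pods under load.
        autoscaling.knative.dev/min-scale: "0"
        autoscaling.knative.dev/max-scale: "5"
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-go   # public sample image
          env:
            - name: TARGET
              value: "serverless on Kubernetes"

Applying a manifest like this with kubectl apply -f creates a route and a revision; when no requests arrive for a while, Knative scales the underlying deployment down to zero pods, and the first subsequent request spins a pod back up.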