Building Serverless Applications with Python

You're reading from Building Serverless Applications with Python: Develop fast, scalable, and cost-effective web applications that are always available

Product type: Paperback
Published in: Apr 2018
ISBN-13: 9781787288676
Length: 272 pages
Edition: 1st Edition
Author: Jalem Raj Rohit
Table of Contents (11 chapters)

Preface
1. The Serverless Paradigm
2. Building a Serverless Application in AWS
3. Setting Up Serverless Architectures
4. Deploying Serverless APIs
5. Logging and Monitoring
6. Scaling Up Serverless Architectures
7. Security in AWS Lambda
8. Deploying a Lambda Function with SAM
9. Introduction to Microsoft Azure Functions
10. Other Books You May Enjoy

Understanding serverless architectures

The concept of serverless architectures, or serverless engineering, revolves entirely around the idea of functions as a service. The most technical and accurate definition of serverless computing is as follows:

"Serverless computing, also known as function as a service (FAAS), is a cloud computing and code execution model in which the cloud provider fully manages starting and stopping of a function's container platform as a service (PaaS)."

Now, let's go into the details of each part of that definition to better understand the paradigm of serverless computing. We shall start with the term function as a service: it means that every serverless model has a function that is executed on the cloud. These functions are nothing but blocks of code that are executed depending on the trigger associated with the function. In the AWS Lambda environment, for example, a function can be triggered by a wide range of AWS services, including API Gateway requests, S3 object events, DynamoDB streams, SNS notifications, and scheduled CloudWatch Events.
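A minimal sketch of such a function, assuming the standard AWS Lambda Python handler signature and an illustrative event payload, might look like this:

```python
import json

def lambda_handler(event, context):
    """A minimal AWS Lambda function.

    `event` carries the payload from whatever trigger invoked the function
    (an API Gateway request, an S3 notification, and so on), while `context`
    exposes runtime information such as the remaining execution time.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "Hello, {}!".format(name)}),
    }
```

The cloud provider calls this handler once per trigger; the code itself never starts, stops, or manages any server.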

Now let's understand what manages the starting and stopping of a function. Whenever a function is fired via one of the available triggers, the cloud provider launches a container in which the function executes. Once the function has executed successfully (that is, it has returned something), or once it has run out of time, the container is stashed away or destroyed. The stashing happens so that the container can be reused in the event of high demand, whenever there is very little time between two triggers. Now, we come to the next part of the definition, the function's container. This means that the functions are launched and executed in containers. This is the standard definition of a container from Docker, the company that made the concept of containers very popular:
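Container reuse is observable from inside the function itself: anything initialized outside the handler survives across invocations that land on the same stashed container. The following is a small illustrative sketch (the module-level counter is purely hypothetical) that again assumes the standard Lambda Python handler:

```python
import time

# Runs once per container, when it is first launched (a "cold start").
CONTAINER_STARTED_AT = time.time()
invocation_count = 0

def lambda_handler(event, context):
    """Reports how many times this particular container has been reused."""
    global invocation_count
    invocation_count += 1
    return {
        "container_age_seconds": round(time.time() - CONTAINER_STARTED_AT, 2),
        "invocations_on_this_container": invocation_count,
    }
```

On a cold start, the counter resets to one; on a warm, reused container it keeps climbing, which is why expensive setup work (such as creating clients or loading models) is usually done outside the handler.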

"A container image is a lightweight, stand-alone, executable package of a piece of software that includes everything needed to run it: code, runtime, system tools, system libraries, settings."

This helps in packaging the function's code, its runtime dependencies, and so on into a single deployment package for seamless execution. The deployment package contains the main code file for the function, along with all the non-standard libraries that the function needs in order to execute. The process of creating a deployment package looks very similar to that of creating a virtual environment in Python.
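As a rough sketch of what that packaging step can look like (assuming a handler.py in the current directory, and using requests purely as a stand-in for a non-standard dependency), the whole process can be scripted in a few lines of Python:

```python
import pathlib
import shutil
import subprocess
import sys
import zipfile

build_dir = pathlib.Path("build")
build_dir.mkdir(exist_ok=True)

# Install the non-standard libraries into the package directory
# ("requests" is only a placeholder for whatever the function actually needs).
subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "requests", "--target", str(build_dir)]
)

# Add the main code file for the function alongside its dependencies.
shutil.copy("handler.py", build_dir / "handler.py")

# Zip everything up into a single deployment package.
with zipfile.ZipFile("deployment_package.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for path in build_dir.rglob("*"):
        zf.write(path, path.relative_to(build_dir))
```

The resulting deployment_package.zip is what gets uploaded to the cloud provider, exactly as a virtual environment would bundle the same dependencies locally.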

So, we can clearly make out that there are no servers running around the clock in the case of serverless infrastructures. This has clear benefits: there is no need for a dedicated Ops team member to monitor the server boxes, so that person, if any, can focus on better things, such as software research. Not having servers running throughout the entire day also saves a lot of money and resources for the company and/or the individual. This benefit is especially visible among machine learning and data engineering teams that use GPU instances for their regular workloads: having serverless GPU instances available on demand saves a lot of money, without the developers or the Ops team needing to maintain the machines around the clock.
