Serverless Programming Cookbook

Getting Started with Serverless Computing on AWS

This chapter will cover the following topics:

  • Getting started with the AWS platform
  • Your first AWS Lambda
  • Your first Lambda with AWS CLI
  • Your first Lambda with Amazon CloudFormation
  • Using AWS SDK, Amazon CloudFormation, and AWS CLI with Lambda
  • Dev practices: dependency injection and unit testing
  • Your first Lambda with Serverless framework

Introduction

Cloud computing introduced a pay-per-use model and abstracted physical servers with virtual machines and managed services. Cloud computing execution models include Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS), and Serverless computing (or Function as a Service (FaaS)).

IaaS provides services that form the basic building blocks of cloud computing, such as virtual machines, storage, networking, and so on. PaaS provides platforms on which we can develop applications, such as execution runtimes, databases, web servers, and so on. SaaS provides complete software that we can use for various needs, such as Gmail's email service.

Serverless computing allows us to run functions (code) without worrying about servers, and we pay only for the time our code executes. Despite the name, servers are still present; however, the provider does all the server management, including starting and stopping them to serve requests, patching, and more. Serverless computing sits roughly in between PaaS and SaaS.

This book focuses on the AWS cloud (except in the last chapter), but most concepts apply to any cloud provider. Within the AWS recipes, we will specify the AWS CLI commands for most use cases. In addition, we will use Java for all use cases where AWS Lambda is generally used, such as working with the DynamoDB database, Kinesis streams, SQS and SNS, and building the backend for an Alexa skill. For services that are generally integrated into the UI, such as Cognito, we will discuss JavaScript SDK code. For one-time activities, such as account creation, domain registration, and monitoring, we will also discuss AWS Management Console steps.

Getting started with the AWS platform

Amazon provides a Free Tier to get started with AWS on production-quality servers. The Free Tier gives you free access to many services and features with decent limits.

Free Tier policies may change at any time. To avoid accidental costs, check the Free Tier policies regularly at https://aws.amazon.com/free.

Getting ready

To work with AWS Free Tier, you need a decent computer, a reasonable internet connection, a working credit card, and basic knowledge of computers and the internet.

How to do it...

Let's get started on the AWS platform by creating a Free Tier account. We will then do some basic IAM settings as suggested by AWS. Finally, we will also create a billing alarm to keep track of any unexpected costs. If you already have a working account with basic setup done, you may skip this part of the recipe:

  1. Go to https://aws.amazon.com and create a new Free Tier account (if you do not already have one) as follows:
    1. Provide login credentials.
    2. Provide personal information such as address, phone number, and other required details, if you have selected Personal account, or Corporate information if you have selected company account.
    3. Provide credit card details.
    4. Proceed with telephonic verification.
    5. Select Basic plan for Free Tier account with community support (or select a paid plan if you want to).

After logging in for the first time, it is recommended that you complete the basic Identity and Access Management (IAM) security settings listed under the Security Status heading. If you have previously logged in, the options might not be displayed; if so, go to the IAM service manually from the Services dropdown.

  2. Click on Activate Multi-Factor Authentication (MFA) on your root account and do as follows:
    1. Click Manage.
    2. Select A Virtual MFA Device.
    3. Click Continue on the message for installing an MFA-compatible application (assuming you have installed Google Authenticator along with barcode scanner, or any similar applications).
    4. Scan the barcode shown on screen using Google Authenticator, and enter two consecutive codes for confirmation.
  3. Click on Create individual IAM users and do as follows:
    1. Enter Username.
    2. Select Access Type (Programmatic access and AWS Management Console access).
    3. Download the credentials .csv file to a secure area in your local machine. You will not be able to download it later, but you can regenerate it.
  4. Click on Use groups to assign permissions and assign some random permissions.
  5. Click on Apply an IAM password policy to set up a basic password policy.
It is a good practice to assign permissions through groups even if there is only one user.

The IAM dashboard should now show all security status items in green.

  6. Create a billing alarm to keep a check on accidental costs:
    1. Go to My Billing Dashboard (by clicking the drop-down arrow near your name).
    2. Under Alerts and Notifications, click on Enable Now to Monitor your estimated charges.
    3. After going to Preferences, select Receive Billing Alerts and click on Manage Billing Alerts link within the contents, which will take you to CloudWatch.
    4. Click on Billing and create an alarm.
You may also use the budgets feature to keep track of your costs. Read more at https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/budgets-managing-costs.html.

If you followed all previous steps successfully, you are ready to get started with further recipes in this book.

How it works...

Most of the steps in this recipe are self-explanatory and similar to registering for any other paid online service. The following are the important AWS services and concepts that were introduced in this recipe.

AWS Identity and Access Management (IAM)

IAM enables secure access to AWS resources. IAM supports standard security concepts such as users, groups, roles, and permissions. A user is an individual who wants to use AWS services. Users can be added to groups. Users and groups are assigned permissions. Roles are used by a service (for example, Amazon EC2) to access other services.

Amazon CloudWatch

Amazon CloudWatch is a service that helps in monitoring your applications, responding to changes (such as performance changes and billing alarms), optimizing resource utilization, and providing you a unified view of the health of services in your account. We will see more use cases of Amazon CloudWatch in later recipes.

Multi-Factor Authentication (MFA)

Multi-Factor Authentication provides an additional level of authentication. In addition to your password, it requires you to authenticate using a token generated by a virtual or physical authenticator. It is a good practice to set up MFA even for personal accounts, as the root password is the same one used for Amazon's e-commerce portal and Prime Video.

There's more...

The following are some of the common AWS services used in building Serverless applications on AWS:

  • AWS Lambda lets you write code without configuring any server.
  • Amazon API Gateway lets you create REST APIs without coding.
  • Amazon Simple Storage Service (S3) is an object store that helps you store and retrieve data. S3 can also be used for hosting single-page applications (SPAs) such as an Angular or React application.
  • Amazon DynamoDB is a scalable NoSQL database.
  • Amazon CloudFront is a Content Delivery Network (CDN) service.
  • Amazon CloudWatch is a service to monitor your applications and respond to changes.
  • AWS CloudFormation templates written in JSON or YAML can be used to provision and model our infrastructure.
  • AWS Identity and Access Management (IAM) provides access control for AWS resources.
  • Amazon Cognito helps you build access control for your application with features such as user sign-up, sign-in, and more.
  • Other services can be used alongside these services for advanced use cases, such as natural language processing (for example, Alexa Skills Kit and Lex), analytics (Amazon Kinesis Streams), machine learning (Amazon Machine Learning), and so on.

Apart from using the AWS Management Console from a browser, we can also interact with AWS services from the AWS CLI (command line) and the AWS SDKs (programmatic access). Except for the first few recipes, we will mostly focus on using AWS CloudFormation with the AWS CLI for modeling and provisioning our infrastructure.

See also

Your first AWS Lambda

AWS Lambda is the core service in AWS for building serverless applications. You can run code without provisioning servers, and you pay only for the time your code runs, unlike EC2, where you pay for the time the server is up. Lambda also takes care of high availability. You can invoke Lambdas from other AWS services, the management console, or the AWS CLI.

In this recipe, we will create a Lambda in Java and deploy it using the AWS Management Console. In the next recipe, we will explore the AWS CLI to deploy a Lambda. In later recipes and chapters, we will see how we can automate most of the deployment tasks using AWS CloudFormation templates, similar to how most enterprise projects do it.

Getting ready

To follow the example in this recipe, you need a working AWS account. You should also set up Java Development Kit (JDK) and Maven in your local machine. I am currently using Java 8 and Maven 3.5.4.

Example projects in this book use a Maven parent project, serverless-cookbook-parent-aws-java. The versions of libraries used within each Lambda project (for example, aws.sdk.version) are defined in the parent project's POM file.

If you want to extend any recipe for your particular use case without the parent project, you can easily get rid of it by moving the required properties and dependencies into the individual projects.

It is a good idea to create a folder within your operating system to manage the code files for this book. I will use a folder with the name serverless. You need to make sure that you can execute the following commands from this folder:

javac -version 
mvn -version

You can set up the parent project inside our parent folder (serverless in my case) by executing the following commands from the command line:

  1. Clone our book's Github repository:
git clone https://github.com/PacktPublishing/Serverless-Programming-Cookbook.git
  2. Go inside the repository folder, then go inside our project-specific parent project, and run mvn clean install:
cd Serverless-Programming-Cookbook
cd serverless-cookbook-parent-aws-java
mvn clean install
The code repository for this book already has working code for all the recipes, where applicable. You may also create another folder within the parent folder (serverless is the parent folder in my case) to practice the examples within this book, and look into the code repository files only when in doubt.

Code repository usage guidelines

Each chapter has a directory of its own (for example, Chapter 01). Inside the chapter's directory, there are sub-directories for each recipe, with names corresponding to the recipe's title. For example, the directory for this chapter's recipe titled Your first AWS Lambda is your-first-lambda.

Inside the recipe's directory, there is a directory called resources for storing all resources, including the AWS CLI commands. Long AWS CLI commands are split into multiple lines for better readability using the \ symbol. If you are using a Windows machine, you can use the ^ symbol instead of the \ symbol in the code files, or make the command a single line without the \ symbol, as shown in the example below.
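For example, a command split across lines on Linux or Mac is equivalent to the same command written on a single line (the bucket name here is just a placeholder):

# Multi-line form (Linux/Mac)
aws s3 mb \
    s3://my-bucket \
    --profile admin

# Single-line form (works everywhere)
aws s3 mb s3://my-bucket --profile admin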

The recipe's directory also contains a sub-directory for each Lambda project. You need to run mvn clean package from within this directory to generate the Lambda JAR, which is created in the target directory. Every Lambda project inherits from the common parent project serverless-cookbook-parent-aws-java, which therefore needs to be built before any Lambda project, following the steps outlined in the previous section.

Code examples within the book follow the AWS documentation style and are tested primarily on the Mac operating system. They should also work on most Unix-based operating systems, such as Linux. For alternative solutions, you may refer to the code files repository; see the heading Alternative Solutions in the repository's readme file for more details.
Various user-specific parameter values, such as IDs, AWS account numbers, and generated JAR file names, given within the examples have to be replaced with valid values based on the previous steps executed and your account-specific details. Copying and executing the commands without verifying and replacing such parameter values can result in errors.

How to do it...

We will create our first Lambda with Java as a Maven project. The javadoc comments and package-info.java files required for checkstyle checks from the parent are not shown here. We are also making use of the Maven shade plugin from the parent for generating the JAR files. You may refer to the code files for each recipe for the complete code:

  1. Create the Java project based on Maven.

Create a Java project based on Maven with our common parent, declared as shown next in the POM file:

You may use an IDE such as Intellij IDEA or Eclipse for working with the examples in this book.
<groupId>tech.heartin.books.serverless-cookbook</groupId>
<artifactId>helloworld-lambda</artifactId>
<version>0.0.1-SNAPSHOT</version>

<parent>
    <groupId>tech.heartin.books.serverlesscookbook</groupId>
    <artifactId>serverless-cookbook-parent-aws-java</artifactId>
    <version>0.0.1-SNAPSHOT</version>
</parent>
  2. Also, declare the only dependency we need for our hello world lambda project in the POM file:
<dependencies>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-lambda-java-core</artifactId>
        <version>${aws.lambda.java.core.version}</version>
    </dependency>
</dependencies>

The dependency versions (for example, aws.lambda.java.core.version) are defined in the POM file for the parent project serverless-cookbook-parent-aws-java.

  3. Create the Lambda handler class and package it as a JAR.

Create a class, HelloWorldLambdaHandler, that implements the interface, RequestHandler:

package tech.heartin.books.serverlesscookbook;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public final class HelloWorldLambdaHandler implements RequestHandler<String, String> {
    public String handleRequest(final String s, final Context context) {
        context.getLogger().log("input: " + s + "\n");
        String greeting = "Hello " + s;
        return greeting;
    }
}

To package the Lambda as a JAR file, from the project root folder, run the following:

mvn clean package

Two JARs will be created: one with only class files (starting with original-) and an Uber JAR with dependencies (starting with serverless-). You can easily differentiate between the two by looking at their sizes. For this recipe, we will use the JAR file that starts with original- and contains only the class files.

  4. Deploy the Lambda handler to AWS:
    1. Log in to the AWS console, and go to Lambda dashboard by clicking on Services and searching or selecting Lambda. Currently, it is under the compute category.
    2. Create a Lambda function as follows:
      1. Click on Create Function.
      2. Select Author From Scratch, which is the default.
      3. Give a name, such as myHelloWorldLambda.
      4. Select Java 8 as the runtime.
      5. Under Role, select Create new role from one or more templates.
      6. Give a role name, such as myHelloWorldLambda.
      7. Leave the field for specifying Policy templates blank.
      8. Click on Create Function. You should see a success message after a while.
  5. Upload the Lambda JAR:

Go to the Function code section and do the following:

    1. Select Code entry type as Upload a .zip or .jar file.
    2. Select Java 8 as the runtime.
    3. Specify the fully qualified class name with handler method name as the following: tech.heartin.books.serverlesscookbook.HelloWorldLambdaHandler::handleRequest.
    4. Click on Upload under Function package and select the JAR file. You can select the JAR whose name starts with original-.
    5. Click on Save to save with defaults for other fields.
  6. We can test the uploaded JAR:
    1. Select Configure test events from the Select a test event dropdown next to the Test button.
    2. Select Create new test event.
    3. Give a name for the event: MyHelloWorldTest.
    4. Within the JSON request content area, just specify your name, such as Heartin.
    5. Click on Create. If successful, it will take you to the myHelloWorldLambda function page.
    6. From the myHelloWorldLambda function page, select the test event, MyHelloWorldTest, next to the Test button, and click the Test button.
    7. You should see the message Hello Heartin after expanding the details of execution result.
  7. We can also check the logs printed using context.getLogger().log():
    1. Under the Log output section, you can see the log you printed.
    2. You can also see the log in the CloudWatch service. There should be a Click here link to view the CloudWatch log group. Click on the link, wait or refresh for a stream that matches your invocation time, and click on the stream link to see the log statement within CloudWatch.

How it works...

The following are more details about the role Lambda plays, its functionality, and the concepts that were introduced in this recipe.

About the parent POM

Example projects in this book use the Maven parent project serverless-cookbook-parent-aws-java that defines the dependency versions for our examples. The actual dependencies are defined within each example project to help you understand the dependencies needed for each use case. All dependency definitions are shown within comments in the parent POM for quick reference.

Our parent project serverless-cookbook-parent-aws-java is also dependent on two open source projects: simple-starter-parent-java for the common Java dependencies, and simple-starter-build-tools for the common build file, such as the code style plugin definitions.

Lambda roles

In this recipe, we selected the Create new role from template(s) and did not select any policy. The basic permissions required (logging to CloudWatch) are added by default. We can also choose an existing role or create a custom role.

Lambda runtimes

AWS Lambda supports various runtimes, such as C# (.NET Core 1.0), C# (.NET Core 2.0), C# (.NET Core 2.1), Go 1.x, Java 8, Node.js 4.3, Node.js 6.10, Node.js 8.10, Python 2.7, and Python 3.6. Inline code editing is only allowed for Node.js and Python.

Extra dependencies

Our parent project, serverless-cookbook-parent-aws-java, has a few more dependencies of its own, as mentioned earlier. You can let Maven download them automatically (these projects are already available in Maven Central), or set them up manually on your local machine (to examine or modify them) by executing the following commands from the command line.

  1. Go inside the parent folder (serverless in my case) and clone the simple-starter-build-tools project:
git clone https://github.com/heartin/simple-starter-build-tools.git
  2. Go inside the project folder and run mvn clean install, as follows:
cd simple-starter-build-tools
mvn clean install
  3. Go back to the parent folder (serverless in my case) and clone the simple-starter-parent-java project:
git clone https://github.com/heartin/simple-starter-parent-java.git
  4. Go inside the project folder and run mvn clean install:
cd simple-starter-parent-java
mvn clean install
For more details on the preceding project dependencies, refer to the respective Readme files.

There's more...

The following are more details about other ways to create Lambda functions and to deploy code to them:

Other ways to create Lambda functions from the management console

Apart from the Author from scratch option, we can create Lambdas using Blueprints and Serverless Application Repository. Blueprints allow you to choose a preconfigured template as a starting point. Currently, blueprints are available only for Node.js and Python. Serverless Application Repository allows you to find and deploy Serverless apps developed by developers, companies and partners on AWS.

Other ways to deploy code in a Lambda function

In this recipe, we developed our code outside AWS and uploaded it to our AWS Lambda function as a JAR file. You can also upload the file to Amazon S3 by selecting Code entry type as Upload a file from Amazon S3, and providing the S3 link. For some languages such as Node.js and Python, you can also write the code inline within the Lambda function.

Passing JSON to and from Lambda handler

In this recipe, we passed simple Strings to and from our Lambda handler. We can instead pass a JSON and get back a JSON. To do this, we need to create two POJOs that represent our input and output, and specify them as generic types within our Handler declaration. We will see this approach in the next recipe.

See also

Your first Lambda with AWS CLI

The AWS Command Line Interface (CLI) is a command line tool provided by AWS to manage AWS services. You can save your credentials and config into profiles, and then specify a profile while executing a command. The more you get familiar with the CLI commands, the faster you can work with AWS services, making you more productive.

In this recipe, we will deploy an AWS Lambda using the AWS CLI, with an updated hello world example. In the last recipe, we sent and received simple text. In this recipe, we will demonstrate the use of POJOs to send JSON to and receive JSON from the Lambda handler.

In most of the later recipes within this book, I will include AWS CLI commands along with either Management Console or CloudFormation steps to provide an overview of various API usages in a programming-language-independent way. You can follow these API usages along with the SDK documentation of any particular programming language to implement them in that language. The CLI commands also help us better understand the CloudFormation templates.

Getting ready

Following are the prerequisites for this recipe:

  1. Follow the Getting ready section of the recipe Your first AWS Lambda to install and configure JDK, Maven, and the parent project, serverless-cookbook-parent-aws-java, and follow the notes given in that section for code usage guidelines
  2. Configure AWS CLI as given later in this section
  3. Create an S3 bucket

Configuring AWS CLI

We can use pip or pip3 to install AWS CLI.

In a Windows machine, you can also install AWS CLI using the MSI installer following the steps at https://docs.aws.amazon.com/cli/latest/userguide/awscli-install-windows.html#install-msi-on-windows.

Install the AWS CLI as follows:

pip install awscli --upgrade --user

pip is a Python package management tool that can be installed along with Python. You may replace pip with pip3 if you have installed pip3. The --upgrade option upgrades any installed requirements, and the --user option installs the program to a sub-directory of your user directory to avoid modifying libraries used by the operating system.

The IDs or keys shown within the examples in this book should be replaced with your own wherever applicable. Simply copy-pasting the commands will not work in such cases.

We can configure our AWS credentials on our local machine by running aws configure. This will set up a default AWS profile. You can have more named profiles if you want.

It is recommended that you create the default profile with the credentials of a user with basic permissions. You can then create additional profiles for other use cases. We will create a named profile called admin later within this section for a user with admin permissions.

Run the following command to configure the AWS CLI for the default profile. If the aws command is not recognized, you will need to add it to the path.

aws configure

Provide your AWS Access Key ID, AWS Secret Access Key, Default region name, and Default output format:

The AWS Access Key ID and AWS Secret Access Key are generated by AWS when you create a user with programmatic access. We created a user and generated these credentials in the recipe Getting started with the AWS platform. You can also regenerate them later if you forget or lose them, by following these steps:

  1. Log in to AWS.
  2. Go to IAM service.
  3. Click on Users from the sidebar. This will show you the user summary page.
  4. From within the user summary page, click on Security Credentials tab.
  5. Click on Create access key to create a new key. You may make the old key inactive or delete it.

The AWS Access Key ID and AWS Secret Access Key you enter are stored in the file ~/.aws/credentials, and the region name and output format are stored in the file ~/.aws/config.

If you are using a Windows machine please refer to the sub heading Note for Windows users at the end of this section.

Verify the configuration as follows:

cat ~/.aws/credentials 

Next, run cat ~/.aws/config:

AWS documentation recommends creating a named profile for your admin user (for instance, a user with administrator access policy) and then using it with AWS CLI. You can add an additional profile in ~/.aws/credentials, as shown here:
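A minimal sketch of the credentials file with an additional admin profile (the key values here are placeholders for your own keys):

[default]
aws_access_key_id = <your access key id>
aws_secret_access_key = <your secret access key>

[admin]
aws_access_key_id = <admin user access key id>
aws_secret_access_key = <admin user secret access key>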

You can add an additional profile by editing the file ~/.aws/config, as shown here:
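A corresponding sketch of the config file (the region and output format shown are only examples; use your own values):

[default]
region = us-east-1
output = json

[profile admin]
region = us-east-1
output = json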

Creating S3 bucket

We will be using Amazon Simple Storage Service (S3) to upload our JAR files. Therefore, it would be good to do some reading on basic S3 concepts, such as S3 buckets and S3 keys.

You can create a bucket using the following command:

aws s3 mb s3://<bucket name> --profile admin

Replace <bucket name> with your bucket name. Remember that S3 bucket names have to be unique across AWS.

Note for Windows users

If you are using a Windows machine, the .aws folder should be present inside your user profile folder and can be located by running dir %UserProfile%\.aws. You may also use the notepad command to edit files in Notepad instead of viewing them with the cat command. Remember to save the file in Notepad if you edit it.

CLI commands that feature in this book should work on the terminals of a UNIX-style operating system, such as Linux or Mac, without any (or many) changes. Minor modifications may be needed to execute them on other platforms. For example, multi-line commands specified using \ have to use ^ for the Windows command prompt, and ` for PowerShell.

How to do it...

We will create our Lambda, similar to the one in the Your first AWS Lambda recipe, but using POJOs for input and output. We will not go deep into concepts discussed previously. If in doubt, please refer to the Your first AWS Lambda recipe.

  1. Create the Maven project with only the core dependency, aws-lambda-java-core:
<groupId>tech.heartin.books.serverless-cookbook</groupId>
<artifactId>lambda-handler-with-pojos</artifactId>
<version>0.0.1-SNAPSHOT</version>

<parent>
    <groupId>tech.heartin.books.serverlesscookbook</groupId>
    <artifactId>serverless-cookbook-parent-aws-java</artifactId>
    <version>0.0.1-SNAPSHOT</version>
</parent>

<dependencies>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-lambda-java-core</artifactId>
        <version>${aws.lambda.java.core.version}</version>
    </dependency>
</dependencies>
  2. Create POJO for input:
import lombok.Data;

@Data
public class HandlerRequest {
    private String name;
}
  3. Create POJO for output:
import lombok.AllArgsConstructor;
import lombok.Data;

@Data
@AllArgsConstructor
public class HandlerResponse {
    private String message;
}
I have used Project Lombok within the POJOs to autogenerate setters, getters, and the all-arguments constructor. The Lombok dependencies are defined in the parent project, simple-starter-parent-java.
  4. Create a Lambda handler with input and output POJOs:
public final class MyLambdaHandler implements RequestHandler<HandlerRequest, HandlerResponse> {
    public HandlerResponse handleRequest(final HandlerRequest request,
            final Context context) {
        context.getLogger().log("Hello " + request.getName());
        return new HandlerResponse("Hello " + request.getName());
    }
}
  5. Package the JAR.

We can generate JARs by running mvn clean package. Two JARs are created: one with only class files (starting with original-) and an Uber JAR with dependencies (starting with serverless-). In this recipe, we will use the original JAR.

  6. Upload the JAR file to your S3 bucket using AWS CLI:
aws s3 cp target/original-serverless-cookbook-lambda-handler-with-pojos-0.0.1-SNAPSHOT.jar s3://serverless-cookbook/lambda-handler-with-pojos-0.0.1-SNAPSHOT.jar --profile admin
Replace the bucket name serverless-cookbook with your bucket's name. We saw the steps to create a bucket in the Getting ready section. Also, --profile admin is the profile we created in the Getting ready section.
  7. Create a policy with the aws iam create-policy command:
aws iam create-policy \
--policy-name lambda_iam_policy_test \
--policy-document file://basic-lambda-permissions.txt \
--profile admin

Replace <account_id> within the policy file with your account ID. You can get your account number by going to the My Account page after clicking on your name at the top right of the AWS Management Console. The policy file is also available in the resources folder of the recipe. If successful, you should get a response with the ARN of the policy created.

You may create a more restrictive policy after checking the basic Lambda permissions template at https://docs.aws.amazon.com/lambda/latest/dg/policy-templates.html.
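The exact contents of basic-lambda-permissions.txt are available in the repository's resources folder; a minimal sketch granting the basic CloudWatch logging permissions might look like the following (the account ID is a placeholder):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:<account_id>:*"
    }
  ]
}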
  8. Create a role using the aws iam create-role command:
aws iam create-role \
--role-name lambda_iam_role_test \
--assume-role-policy-document file://iam-role-trust-relationship.txt \
--profile admin

The policy file is available in the resources folder of the recipe. If successful, you should get a response with the ARN of the role created.

Trust relationship policies allow the Lambda service to assume this role, whereas the standard policy document is attached to a role to allow or deny access to resources.
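The actual iam-role-trust-relationship.txt is in the repository's resources folder; a sketch of such a trust policy, allowing the Lambda service to assume the role, looks like this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}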
  9. Attach the policy to the role:
aws iam attach-role-policy \
--role-name lambda_iam_role_test \
--policy-arn arn:aws:iam::<account_id>:policy/lambda_iam_policy_test \
--profile admin

Replace <account_id> with your account number.

  10. Create a Lambda function providing the role and the S3 location:
aws lambda create-function \
--function-name demo-lambda-with-cli \
--runtime java8 \
--role arn:aws:iam::<account_id>:role/lambda_iam_role_test \
--handler tech.heartin.books.serverlesscookbook.MyLambdaHandler::handleRequest \
--code S3Bucket=serverless-cookbook,S3Key=lambda-handler-with-pojos-0.0.1-SNAPSHOT.jar \
--timeout 15 \
--memory-size 512 \
--profile admin

Replace <account_id> with your account number. The code option can accept the shorthand form as used here, or a JSON.

  11. Invoke our Lambda from the CLI:
aws lambda invoke \
--invocation-type RequestResponse \
--function-name demo-lambda-with-cli \
--log-type Tail \
--payload '{"name":"Heartin"}' \
--profile admin \
outputfile.txt

On certain platforms, you might have to add escaping for the payload specified on the command line. This is not required if the payload is specified as a file, as shown here:

--payload file://input.txt \

The output can be viewed in the outputfile.txt file.
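Given the HandlerResponse POJO and the payload above, the contents of outputfile.txt should look similar to the following:

{"message":"Hello Heartin"}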

  12. Note the following commands for cleaning up the role, policy, and Lambda.

To delete the Lambda, perform the following:

aws lambda delete-function \
--function-name demo-lambda-with-cli \
--profile admin

To detach policy from the role, perform the following:

aws iam detach-role-policy \
--role-name lambda_iam_role_test \
--policy-arn arn:aws:iam::<account_id>:policy/lambda_iam_policy_test \
--profile admin

Replace <account_id> with your account number.

To delete the role, perform the following:

aws iam delete-role \
--role-name lambda_iam_role_test \
--profile admin

To delete the policy, perform the following:

aws iam delete-policy \
--policy-arn arn:aws:iam::<account_id>:policy/lambda_iam_policy_test \
--profile admin

Replace <account_id> with your account number.

How it works...

The following are the important details and concepts that were introduced in this recipe:

Creating a role and attaching a policy

You need to create a role with a trust policy that allows our Lambda to assume the role. You also need to attach a policy that has CloudWatch permissions for logging.

Lambda memory-size and timeout

When creating a function from the CLI, the default value of timeout is 3 seconds and the default value of memory-size is 128 MB, which may not be sufficient for Lambdas with Uber JARs; you may get a timeout exception or a Process exited before completing request error. Hence, I have set a higher timeout and memory-size. Other parameters are mostly self-explanatory.

S3 Bucket and Key

Amazon S3 is an object store. Objects (files) are stored as simple key-value pairs within containers called buckets. Bucket names have to be unique across AWS. There is no folder hierarchy within buckets like in traditional file systems; however, we can simulate a folder structure with hierarchical key names. For example, the key folder1/folder2/file.txt simulates a folder-like structure. Read more about simulating folders in S3 at https://docs.aws.amazon.com/AmazonS3/latest/user-guide/using-folders.html.
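For instance, copying a local file to such a hierarchical key with the AWS CLI (the bucket and file names here are placeholders) creates the simulated folders automatically:

# The folder1/folder2/ prefix becomes part of the object key.
aws s3 cp file.txt s3://my-bucket/folder1/folder2/file.txt --profile admin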

Cleaning up

You need to do a cleanup in the following order:

  1. Delete Lambda that uses the role
  2. Detach policy from role
  3. Delete role and policy
We cannot delete a role without detaching all policies. We can, however, delete a role without deleting the Lambda. If you try to invoke the Lambda before attaching another role, it will give you an error such as the following: The role defined for the function cannot be assumed by Lambda.

There's more...

Once you get familiar with the AWS CLI commands, it is much faster and easier to work with the AWS CLI than to navigate through the pages of the AWS Management Console. This recipe covers only a very basic use case. Please follow the links in the See also section and try out more examples with AWS CLI and Lambda.

See also

Your first Lambda with Amazon CloudFormation

Amazon CloudFormation lets you provision and model your AWS service infrastructure declaratively. Instead of using interactive tools such as management console or CLI directly, you can declare the configuration with expected order, dependencies, input, and output in a template, and CloudFormation will provision it for you.

The practice of writing code to manage infrastructure is referred to as Infrastructure as Code (IaC) and is followed by most enterprise companies. You can maintain the provisioning code in a code repository and follow practices such as code reviews, like any other code. It also lets you reuse the provisioning code.

In this recipe, we will use CloudFormation to provision the infrastructure for the Lambda we created in the Your first Lambda with AWS CLI recipe.

Getting ready

You need to read and follow the Getting ready sections of the recipes Your first AWS Lambda and Your first Lambda with AWS CLI before proceeding.

Set up the project and S3 bucket

In this recipe, we are reusing the Lambda we created in the Your first Lambda with AWS CLI recipe. Generate a JAR by running mvn clean package inside that project, and upload the JAR to S3:

aws s3 cp target/original-serverless-cookbook-lambda-handler-with-pojos-0.0.1-SNAPSHOT.jar s3://serverless-cookbook/lambda-handler-with-pojos-0.0.1-SNAPSHOT.jar --profile admin

Replace the bucket name serverless-cookbook with your bucket name. Refer to the Getting ready section of the recipe Your first Lambda with AWS CLI to create the S3 bucket.

Understanding YAML and JSON

CloudFormation templates are written in JSON or YAML. Both support data in key-value pairs, objects, arrays, and so on. YAML also supports additional features such as multi-line strings, comments, and so on. I will be using YAML for the examples. Since YAML support was introduced later for CloudFormation, you will also see a lot of JSON templates on the web, so it is good to have a decent understanding of both YAML and JSON. If you are familiar with only one of them, you may use one of the JSON to YAML or YAML to JSON converters available online.

How to do it...

  1. Create the CloudFormation template.

The Resources component specifies the AWS resources used. We need two resources for our use case: a role, and a Lambda function with that role. The following is the basic structure of our CloudFormation template:

---
AWSTemplateFormatVersion: '2010-09-09'
Description: Building Lambda with AWS CloudFormation
Resources:
  IamRoleLambdaExecution:
    Type: AWS::IAM::Role
    Properties:
      # Properties for the role are shown later.
  LambdaFunctionWithCF:
    Type: AWS::Lambda::Function
    Properties:
      # Properties for the Lambda are shown later.
    DependsOn:
      - IamRoleLambdaExecution

I have also defined AWSTemplateFormatVersion and Description as a general practice, but they are optional. Note that the properties for IamRoleLambdaExecution and LambdaFunctionWithCF are not shown here; you may refer to the further steps or use the template from the code files.

The role needs a trust relationship policy that allows Lambda to assume that role, and we need to attach a policy to the role that provides CloudWatch logging permissions. The AssumeRolePolicyDocument property specifies the trust relationship policy for the role:

AssumeRolePolicyDocument:
  Version: '2012-10-17'
  Statement:
    - Effect: Allow
      Principal:
        Service:
          - lambda.amazonaws.com
      Action:
        - sts:AssumeRole

The policy is specified inline within the Policies property of the role:

Policies:
  - PolicyName: 'lambda-with-cf-policy'
    PolicyDocument:
      Version: '2012-10-17'
      Statement:
        - Effect: Allow
          Action:
            - logs:CreateLogGroup
            - logs:CreateLogStream
            - logs:PutLogEvents
          Resource: arn:aws:logs:*:*:*

We will also define two more properties for the role, namely Path and RoleName:

Path: "/"
RoleName: "lambda-with-cf-role"

Our Lambda function will have the following basic configuration:

LambdaFunctionWithCF:
  Type: AWS::Lambda::Function
  Properties:
    Code:
      S3Bucket: 'serverless-cookbook'
      S3Key: lambda-handler-with-pojos-0.0.1-SNAPSHOT.jar
    FunctionName: first-lambda-with-cloud-formation
    Handler: tech.heartin.books.serverlesscookbook.MyLambdaHandler::handleRequest
    MemorySize: 512
    Role:
      Fn::GetAtt:
        - IamRoleLambdaExecution
        - Arn
    Runtime: java8
    Timeout: 15
  DependsOn:
    - IamRoleLambdaExecution

We specify the role as a dependency for the Lambda function, and use Fn::GetAtt to retrieve the role's ARN dynamically instead of hardcoding it. Most of the other properties are self-explanatory.

A CloudFormation stack is a collection of AWS resources that you need to manage as a single unit. All the resources in a stack are defined by a CloudFormation template. When you delete the stack, all of its related resources are also deleted.

We can create a CloudFormation stack in different ways, including the following:

    1. Going through the Create Stack option within the CloudFormation service inside AWS Management Console
    2. Uploading directly from the Template Designer within the CloudFormation service inside AWS Management Console
    3. AWS CLI

In this recipe, I will use Designer, but in all other recipes I will be using AWS CLI. AWS CLI is the best way to deploy CloudFormation templates. Designer is also a good tool to visualize and validate your scripts.

  2. Create the CloudFormation stack from the Designer:
    1. Log in to AWS and go to CloudFormation service.
    2. Click on the Design template button to go to Designer. Within designer, you may do the following:
    3. Choose template language as YAML in the editor. (If you are using a JSON template, use JSON instead.)
    4. Select the Template tab in the editor.
    5. Copy and paste your template into the template editor window.
    6. Click on refresh on the Designer to see the template in the Design view.
    7. If any changes are required, you can either make changes within the Template tab or use the Components tab.
    8. If everything looks good, click on the upload button on the top left of the designer to launch the Stack creation wizard with the current template.
    9. Follow the wizard with defaults, and select the checkbox for I acknowledge that AWS CloudFormation might create IAM resources with custom names. Finally, click on Create Stack.
    10. Invoke our Lambda with AWS CLI as follows and verify:
aws lambda invoke \
--invocation-type RequestResponse \
--function-name first-lambda-with-cloud-formation \
--log-type Tail \
--payload '{"name":"Heartin"}' \
--profile admin \
outputfile.txt

The output can be viewed in the outputfile.txt file.

Cleaning up roles, policy, and Lambda

To clean up resources created by CloudFormation, you just need to delete the stack. This is the default setting. Since we have used AWS management console for stack creation, we will use it for deletion as well.

You can delete a CloudFormation stack from the management console as follows: go to CloudFormation service, select the stack, click on Actions, and click Delete Stack.

How it works...

In this recipe, we used the following CloudFormation template components: Resources, AWSTemplateFormatVersion, and Description. Resources are the AWS resources used in the template. AWSTemplateFormatVersion is the version of the CloudFormation template format that the template conforms to.

The only mandatory section in a CloudFormation template is Resources. However, it is a good practice to always define a version and a description for a template.

We used two resources: a role (IamRoleLambdaExecution) and a Lambda function (LambdaFunctionWithCF) that depends on that role. Resource names can be anything. Type specifies the type of the resource. We used two types, namely AWS::IAM::Role and AWS::Lambda::Function.

The properties of the AWS::IAM::Role resource type that we used are as follows:

  • AssumeRolePolicyDocument specifies the trust relationship policy for the role
  • Policies specifies the policies inline

The properties of the AWS::Lambda::Function resource type that we used are as follows:

  • Code property specifies the S3 bucket and the key. You can also specify a reference to an S3 Bucket resource type so that a new bucket is created dynamically and its name is used here.
  • FunctionName specifies the name of the Lambda function.
  • Handler specifies the fully qualified name of the handler class with the handler method.
  • MemorySize specifies the memory in MB. The number of CPU cores is decided by AWS based on the memory.
  • Role specifies the role.
  • Runtime specifies the runtime (for instance, java8).
  • Timeout specifies the timeout.

To get the role Arn, we used the GetAtt function passing the logical name of the Role and the property name Arn:
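The relevant snippet from the template above is as follows:

Role:
  Fn::GetAtt:
    - IamRoleLambdaExecution
    - Arn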

Fn::GetAtt is an intrinsic function that returns the value of an attribute from a resource in the template.

We used CloudFormation designer in the recipe to see our template in design view, and then uploaded the template into a stack from the designer. You can also use the Designer to design CloudFormation templates from scratch.

There's more...

The following are more details about CloudFormation template components and related features; you can check the documentation to study them further if interested:

CloudFormation Template Components

CloudFormation templates are composed of the following primary components:

  • AWSTemplateFormatVersion is the version of the CloudFormation template format that the template conforms to
  • Description is a text that describes the template
  • Resource components are the AWS resources used in the template
  • Parameter components are the input (dynamic) to your template
  • Mapping components are variables (static) for your template
  • Output components describe the values that are returned
  • Condition components control resource provisioning
  • Metadata provides additional information about the template
  • Transform specifies the version of the AWS Serverless Application Model (AWS SAM) for Serverless applications
Resources is the only mandatory section of a CloudFormation template.

We will talk about the components in the recipe in which they are introduced. Read more about template components at https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/template-anatomy.html.

Resource component

The following are some of the important features of Resource component:

  • Resource component of the template specifies the AWS resources used in the template
  • Resources can reference each other using the Ref element
  • Resource names can be anything
  • Type specifies the type of the resource
  • Each type has its own set of properties that you can refer to from the documentation, given under the properties element
  • DependsOn specifies the other resources that the current resource is dependent on

Intrinsic functions

Intrinsic functions are built-in functions provided by AWS to use within a template for dynamically adding values. Common intrinsic functions used within CloudFormation templates are as follows: Fn::Base64, Fn::Cidr, Fn::FindInMap, Fn::GetAtt, Fn::GetAZs, Fn::ImportValue, Fn::Join, Fn::Select, Fn::Split, Fn::Sub, and Ref.

CloudFormation also supports the following conditional functions: Fn::And, Fn::Equals, Fn::If, Fn::Not, and Fn::Or.

We can specify the functions in the standard form mentioned here or, if you are using YAML, in the short-hand form (for instance, !Base64, !Cidr, !Ref, and so on). We used the standard syntax in this recipe for reference, but will use the short-hand syntax in later recipes.
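For example, the Role property of our Lambda function could equivalently be written with the short-hand form of GetAtt:

Role: !GetAtt IamRoleLambdaExecution.Arn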

We will discuss the functions introduced in each chapter. You can read more about all intrinsic functions at https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/intrinsic-function-reference.html.

CloudFormation Designer

The following are some of the important features of the CloudFormation Designer:

  • Create templates from scratch visually, validate, and upload
  • Copy existing templates, see them visually, validate, and upload
  • Drag and drop the resources you need
  • Define relationships between resources
  • Right-click on the service and click on the appropriate context menu option to go directly to the CloudFormation documentation for that service
  • Edit logical name and other properties in the auto-generated template
  • Copy and paste an existing template and see it in design view
  • Directly upload the script to S3 and launch Create stack wizard in a single click

Additional benefits of CloudFormation

Apart from automating the provisioning of resources through code and enabling reuse, CloudFormation also has other important uses, including the following:

  • Lets you estimate costs based on the templates
  • Enables tracking costs effectively
  • Helps in saving costs by automated deletion of resources when not needed
  • Diagrams generated based on templates can help in understanding the system better, and can be used in design discussions

CloudFormation alternatives

Important alternatives to using CloudFormation include Ansible, Terraform, Chef, AWS OpsWorks, and AWS Elastic Beanstalk.

See also

Using AWS SDK, Amazon CloudFormation, and AWS CLI with Lambda

The AWS SDK allows you to write code that interacts with AWS services. In this recipe, we will use the AWS Java SDK for IAM to perform some basic IAM operations from a Lambda programmatically. We will use it along with AWS CloudFormation and the AWS CLI, which is a general practice followed in most real-world projects.

The aim of this recipe is to understand the use of AWS Java SDK inside Lambda. Therefore, we will not go deep into the details of the IAM operations discussed in the recipe. The IAM operations details are available at https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/examples-iam-users.html.

Getting ready

You need an active AWS account, and you need to read and follow the Getting ready sections of the recipes Your first AWS Lambda and Your first Lambda with AWS CLI to set up Java, Maven, the parent project serverless-cookbook-parent-aws-java, and the AWS CLI, and to follow the other code usage guidelines.

How to do it...

We will create a Java Maven project and set the parent as serverless-cookbook-parent-aws-java.

  1. Create a Java Maven project and set the parent in the POM file:
<parent>
    <groupId>tech.heartin.books.serverlesscookbook</groupId>
    <artifactId>serverless-cookbook-parent-aws-java</artifactId>
    <version>0.0.1-SNAPSHOT</version>
</parent>
  2. Specify dependencies in the POM file:
<dependencies>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-lambda-java-core</artifactId>
        <version>${aws.lambda.java.core.version}</version>
    </dependency>

    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-iam</artifactId>
        <version>${aws.sdk.version}</version>
    </dependency>
</dependencies>
Do not directly define the whole AWS Java SDK (aws-java-sdk) dependency for a Lambda handler. Instead, only declare the dependencies you need (such as aws-java-sdk-iam). I tried adding aws-java-sdk to our Lambda and generated the Uber JAR. It was around 93 MB. AWS console did not allow me to upload the file manually into the Lambda function as the limit was 50MB. So, I uploaded it to S3. However, it failed again while extracting the JAR as the size of the extracted contents exceeded the allowed size of 262144000 bytes.

Creating the POJOs for the request and response.

  3. Create a request POJO for accepting requests:
import lombok.Data;

@Data
public class IAMOperationRequest {
    private String operation;
    private String userName;
}
  4. Create a POJO for sending back the response from the handler:
import lombok.AllArgsConstructor;
import lombok.Data;

@AllArgsConstructor
@Data
public class IAMOperationResponse {
    private String message;
    private String errorMessage;
}
For our POJOs, we use Project Lombok (@Data) to auto-generate getters, setters, and so on. The Lombok dependency is added in the parent project simple-starter-parent-java. If you are using an IDE for development, you will have to install a plugin for your IDE to recognize the Lombok annotations.

Creating a service class to implement the IAM operations using the AWS SDK:

  5. Import the required classes:
import com.amazonaws.services.identitymanagement.AmazonIdentityManagement;
import com.amazonaws.services.identitymanagement.AmazonIdentityManagementClientBuilder;
import com.amazonaws.services.identitymanagement.model.CreateUserRequest;
import com.amazonaws.services.identitymanagement.model.CreateUserResult;
import com.amazonaws.services.identitymanagement.model.DeleteConflictException;
import com.amazonaws.services.identitymanagement.model.DeleteUserRequest;
import com.amazonaws.services.identitymanagement.model.ListUsersRequest;
import com.amazonaws.services.identitymanagement.model.ListUsersResult;
import com.amazonaws.services.identitymanagement.model.User;
  6. Create and initialize a client object of AmazonIdentityManagement type:
private final AmazonIdentityManagement iamClient;

public IAMService() {
    iamClient = AmazonIdentityManagementClientBuilder.defaultClient();
}
  7. Write code for creating a user in a method:
CreateUserRequest request = new CreateUserRequest().withUserName(userName);
CreateUserResult response = iamClient.createUser(request);
// get user details from response.
  8. Write code for checking if a user is present in another method:
boolean done = false;
ListUsersRequest request = new ListUsersRequest();
while (!done) {
    ListUsersResult response = iamClient.listUsers(request);

    for (User user : response.getUsers()) {
        if (user.getUserName().equals(userName)) {
            // return success message
        }
    }
    request.setMarker(response.getMarker());
    if (!response.getIsTruncated()) {
        done = true;
    }
}
// return error message
  9. Write code for deleting a user in another method:
DeleteUserRequest request = new DeleteUserRequest()
        .withUserName(userName);
try {
    iamClient.deleteUser(request);
} catch (DeleteConflictException e) {
    // Handle exception
}
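Putting the fragments together, a complete createUser method in the service class might look like the following sketch; the exact messages and error handling in the repository's code may differ:

public IAMOperationResponse createUser(final String userName) {
    // Create the IAM user and wrap the result in our response POJO.
    // This is a sketch; the repository code may structure it differently.
    CreateUserRequest request = new CreateUserRequest().withUserName(userName);
    CreateUserResult response = iamClient.createUser(request);
    return new IAMOperationResponse(
            "Created user " + response.getUser().getUserName(), null);
}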

Let us now see how to create a handler.

  10. Create a handler class with input and output POJOs:
public final class HelloWorldLambdaHandler implements RequestHandler<IAMOperationRequest, IAMOperationResponse> {
  11. Implement the handleRequest method with a switch statement to invoke an appropriate service method:
public IAMOperationResponse handleRequest(final IAMOperationRequest request, final Context context) {
    context.getLogger().log("Requested operation = " + request.getOperation()
            + ". User name = " + request.getUserName());

    switch (request.getOperation()) {
        case "CREATE" :
            return this.service.createUser(request.getUserName());
        case "CHECK" :
            return this.service.checkUser(request.getUserName());
        case "DELETE" :
            return this.service.deleteUser(request.getUserName());
        default:
            return new IAMOperationResponse(null,
                    "Invalid operation " + request.getOperation()
                    + ". Allowed: CREATE, CHECK, DELETE.");
    }
}
  12. Package the dependencies into an Uber JAR using mvn clean package.

Two JARs will be created: one with only class files (starting with original-) and an Uber JAR with all dependencies (starting with serverless-). We will use the Uber JAR in this recipe.

  13. Upload the JAR to S3:
aws s3 cp target/serverless-cookbook-iam-operations-0.0.1-SNAPSHOT.jar s3://serverless-cookbook/iam-operations-0.0.1-SNAPSHOT.jar --profile admin
  14. Create a CloudFormation template for our Lambda function.

You need to create a role with a trust policy that allows our Lambda to assume the role. You also need to create a policy with CloudWatch logging and IAM permissions.

We need to add permissions for IAM operations in our policies:

- Effect: Allow
  Action:
    - iam:CreateUser
    - iam:DeleteUser
    - iam:ListUsers
  Resource:
    - Fn::Sub: arn:aws:iam::${AWS::AccountId}:user/*

We have used a pseudo-parameter, AWS::AccountId, within a Sub intrinsic function to dynamically populate the account ID. I have also improved the CloudWatch logging permission policy from the previous recipe using pseudo-parameters:

- Effect: Allow
  Action:
    - logs:CreateLogStream
  Resource:
    - Fn::Sub: arn:${AWS::Partition}:logs:${AWS::Region}:${AWS::AccountId}:log-group:/aws/lambda/aws-sdk-iam-with-cf-cli:*
- Effect: Allow
  Action:
    - logs:PutLogEvents
  Resource:
    - Fn::Sub: arn:${AWS::Partition}:logs:${AWS::Region}:${AWS::AccountId}:log-group:/aws/lambda/aws-sdk-iam-with-cf-cli:*:*

You should be able to complete this recipe by referring to the previous recipe, Your first Lambda with Amazon CloudFormation.

The completed template file is available in the resources folder as cf-template-iam-operations.yml.
  15. Upload the CloudFormation template to S3:
aws s3 cp ../resources/cf-template-iam-operations.yml s3://serverless-cookbook/cf-template-iam-operations.yml --profile admin
  16. Create a CloudFormation stack using the CloudFormation template from the AWS CLI:
aws cloudformation create-stack --stack-name myteststack --template-url https://s3.amazonaws.com/serverless-cookbook/cf-template-iam-operations.yml --capabilities CAPABILITY_NAMED_IAM --profile admin

This immediately responds with a StackId. Note that we used the parameter --capabilities CAPABILITY_NAMED_IAM. This is a security-related precaution; you are explicitly telling CloudFormation that you know what you are doing.

You can check the status of stack creation using the describe-stacks command:

aws cloudformation describe-stacks --stack-name <StackId> --profile admin

StackStatus: CREATE_COMPLETE means stack creation was successful.

  1. Verify the deployment with AWS CLI Lambda invoke:
aws lambda invoke --invocation-type RequestResponse --function-name aws-sdk-iam-with-cf-cli --log-type Tail --payload '{"operation":"CREATE", "userName":"abcd"}' --profile admin outputfile.txt

You can replace CREATE in the payload with CHECK to check whether the user was created, or with DELETE to delete the user.

  1. Delete the CloudFormation stack:
aws cloudformation delete-stack --stack-name <StackId> --profile admin

How it works...

AWS SDKs are used to interact with AWS services programmatically. There are SDKs available for programming languages such as Java, .NET, Node.js, PHP, Python, Ruby, Go, and C++, as well as a JavaScript SDK for the browser.

We uploaded our CloudFormation template to S3 and provided its location using --template-url. You can also specify the template contents inline with the --template-body option, either directly or from a file using the file:// prefix.

We created the role for our Lambda manually. If you are using the Management Console, you can create custom Lambda roles from the Lambda create-function page, or directly from IAM.

We used one new intrinsic function in our CloudFormation template: Fn::Sub, which substitutes variables in an input string with values that you specify. We used it to substitute the AWS account ID and a few other values rather than hard-coding them.

We also used the following pseudo-parameters: AWS::AccountId, AWS::Partition, and AWS::Region, which represent the current account ID, partition, and Region respectively. For most Regions, the partition is aws. For resources in other partitions, the partition is aws-partitionname (for instance, aws-cn for China and aws-us-gov for the AWS GovCloud (US) Regions). Using pseudo-parameters lets us avoid worrying about the actual partition name.

There's more...

We used only basic IAM operations in this recipe. You can check the documentation and implement more complex operations from within Lambda code if interested.

We will use CloudFormation and AWS CLI for most of our recipes. However, you may also follow these steps from the Management Console; doing things visually can help you remember the concepts for longer.

Pseudo-parameters

Pseudo-parameters are predefined parameters provided by AWS CloudFormation. You can use them within a Ref or a Sub function to dynamically populate values. The pseudo-parameters available within a CloudFormation template are AWS::AccountId, AWS::NotificationARNs, AWS::NoValue, AWS::Partition, AWS::Region, AWS::StackId, AWS::StackName, and AWS::URLSuffix.

Read more about pseudo-parameters at https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/pseudo-parameter-reference.html.

See also

Dev Practices – dependency injection and unit testing

In this recipe, we will apply some common development practices for creating Lambdas, such as using a lightweight framework for dependency injection and writing unit tests for our code.

For dependency injection, we will use Guice, which is one of the dependency injection (IoC) frameworks suggested by AWS at https://docs.aws.amazon.com/lambda/latest/dg/best-practices.html. For unit testing, we will use JUnit and Mockito libraries.

Getting ready

You need an active AWS account. Read and follow the Getting ready sections of the recipes Your first AWS Lambda and Your first Lambda with AWS CLI to set up Java, Maven, the parent project serverless-cookbook-parent-aws-java, and the AWS CLI, and to review the other code usage guidelines.

This recipe also assumes that you are familiar with general software development concepts and practices such as dependency injection, unit testing, and coding to interfaces. Familiarity with libraries such as JUnit and Mockito is good to have.

Code refactoring

We will be improving the code we created in the Using AWS SDK, Amazon CloudFormation, and AWS CLI with Lambda recipe. Before doing dependency injection, you need to refactor the code to follow the principle of programming to interfaces.

Refactor the service class into an interface and its implementation. We will also add Lombok's @AllArgsConstructor annotation to the implementation to generate an all-arguments constructor, which will be used during unit testing to inject the mock object.

  1. We will first create an interface IAMService:
/**
* Interface for IAM operations.
*/
public interface IAMService {
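    // The operation methods, inferred from the handler's switch statement and the
    // unit tests later in this recipe (a sketch; the book's exact declarations may differ):
    IAMOperationResponse createUser(String userName);

    IAMOperationResponse checkUser(String userName);

    IAMOperationResponse deleteUser(String userName);
}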

We will define the corresponding implementation as IAMServiceImpl:

/**
* Implementation of {@link IAMService}.
*/
@AllArgsConstructor
public class IAMServiceImpl implements IAMService {
  1. Extract the methods as well, and then replace usages of the implementation type with the interface:
private IAMService service;

public MyLambdaHandler() {
    service = new IAMServiceImpl();
}
Most IDEs provide refactoring support to extract an interface from an implementation. They will also help you replace usages of your implementation with the interface wherever possible.

How to do it...

Let us do dependency injection with Guice, which is a lightweight framework suggested by AWS.

  1. Add Maven dependency for Guice:
<dependency>
    <groupId>com.google.inject</groupId>
    <artifactId>guice</artifactId>
    <version>4.2.0</version>
</dependency>
  1. Create the Guice configuration class to bind interfaces to their implementations:
public class ApplicationModule extends AbstractModule {
    protected final void configure() {
        bind(IAMService.class).to(IAMServiceImpl.class);
    }
}
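
For a quick sanity check of the binding (illustrative only; the handler in the next step relies on member injection instead), you could ask the injector for an instance directly:

// Hypothetical usage sketch, not part of the recipe's code:
Injector injector = Guice.createInjector(new ApplicationModule());
IAMService service = injector.getInstance(IAMService.class); // resolved to IAMServiceImpl via the binding above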
  1. Configure the handler class for using Guice:
public final class MyLambdaHandler implements RequestHandler<IAMOperationRequest, IAMOperationResponse> {

    private static final Injector INJECTOR =
            Guice.createInjector(new ApplicationModule());

    private IAMService service;

    public MyLambdaHandler() {
        INJECTOR.injectMembers(this);
        Objects.requireNonNull(service);
    }

    @Inject
    public void setService(final IAMService service) {
        this.service = service;
    }

We created a static Injector instance and initialized it with our Guice configuration class. In the default constructor, we ask the injector to inject this instance's members, and Objects.requireNonNull verifies that the implementation was injected successfully. The setter is annotated with the @Inject annotation so that Guice injects the dependency through it.

Let us write unit tests for our code.

  1. Add Maven dependencies for JUnit and Mockito:
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.mockito</groupId>
    <artifactId>mockito-core</artifactId>
    <version>2.21.0</version>
    <scope>test</scope>
</dependency>
  1. Create a simple test class for the handler that checks whether the service implementation is injected:
package tech.heartin.books.serverlesscookbook;

import org.junit.Test;

public class MyLambdaHandlerTest {

    @Test
    public void testDependencies() throws Exception {
        // Constructing the handler triggers Guice injection; the constructor's
        // Objects.requireNonNull call fails the test if injection did not happen.
        MyLambdaHandler testHandler = new MyLambdaHandler();
    }
}
  1. Create a test class for the service class that uses Mockito to mock AWS calls:
@RunWith(MockitoJUnitRunner.class)
public class IAMServiceImplTest {

    @Mock
    private AmazonIdentityManagement iamClient;

    private IAMService service;

    @Before
    public void setUp() {
        service = new IAMServiceImpl(iamClient);
        Objects.requireNonNull(service);
    }

    // Actual tests not shown here
}
  1. Add the test method for create user:
@Test
public void testCreateUser() {
    IAMOperationResponse expectedResponse = new IAMOperationResponse(
            "Created user test_user", null);

    when(iamClient.createUser(any()))
            .thenReturn(new CreateUserResult()
                    .withUser(new User().withUserName("test_user")));

    IAMOperationResponse actualResponse = service.createUser("test_user");

    Assert.assertEquals(expectedResponse, actualResponse);
}
  1. Add the test method to check user:
@Test
public void testCheckUser() {
    IAMOperationResponse expectedResponse = new IAMOperationResponse(
            "User test_user exist", null);

    when(iamClient.listUsers(any()))
            .thenReturn(getListUsersResult());

    IAMOperationResponse actualResponse = service.checkUser("test_user");

    Assert.assertEquals(expectedResponse, actualResponse);
}

private ListUsersResult getListUsersResult() {
    ListUsersResult result = new ListUsersResult();
    result.getUsers().add(new User().withUserName("test_user"));
    return result;
}
  1. Add the test method to delete user:
@Test
public void testDeleteUser() {
    IAMOperationResponse expectedResponse = new IAMOperationResponse(
            "Deleted user test_user", null);

    when(iamClient.deleteUser(any()))
            .thenReturn(new DeleteUserResult());

    IAMOperationResponse actualResponse = service.deleteUser("test_user");

    Assert.assertEquals(expectedResponse, actualResponse);
}
  1. To package, deploy, and verify, follow the Using AWS SDK, Amazon CloudFormation and AWS CLI with Lambda recipe: package the uber JAR, deploy it, and verify by invoking the Lambda.
In real-world projects, you may follow the Test-Driven Development (TDD) practice and write tests before the actual code.

How it works...

We added a lightweight dependency injection framework, Guice, and modified the code to use it. We also used JUnit and Mockito to unit test the code. Going deep into the workings of Guice, JUnit, or Mockito is outside the scope of this book, but you may ask questions on the open source repository for the project (given in the introduction to Chapter 1, Getting Started with Serverless Computing on AWS).

There's more...

You may also use Dagger instead of Guice for dependency injection; Dagger is another lightweight framework recommended by AWS. You can technically use Spring for dependency injection, but it is not recommended for Lambda because of its larger footprint.

You may use TestNG instead of JUnit for unit testing. TestNG provides additional features such as DataProviders, which let you supply an array of input combinations and their expected values for a single test method; with plain JUnit, you would typically have to write a test method per input combination. You may also use Hamcrest to create more flexible assertions in tests.
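
For illustration, a hypothetical TestNG data-provider test for the three IAM operations might look like the following sketch (the class, provider, and expected values are made up for the example and are not part of the recipe's code):

import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class IAMOperationsDataProviderTest {

    // Each row is one input combination: the operation and the expected message prefix.
    @DataProvider(name = "operations")
    public Object[][] operations() {
        return new Object[][] {
                {"CREATE", "Created user"},
                {"CHECK", "User"},
                {"DELETE", "Deleted user"},
        };
    }

    // A single test method runs once per row supplied by the data provider.
    @Test(dataProvider = "operations")
    public void testOperationIsSupported(final String operation, final String expectedPrefix) {
        Assert.assertTrue(operation.matches("CREATE|CHECK|DELETE"));
        Assert.assertNotNull(expectedPrefix);
    }
}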

See also

  • You may refer to other books at PacktPub to become familiar with the dependency injection and testing frameworks.

Your first Lambda with Serverless framework

Serverless is an open source command-line utility and framework for building and deploying serverless applications. It supports multiple cloud providers, such as Amazon Web Services, Microsoft Azure, IBM OpenWhisk, Google Cloud Platform, Kubeless, Spotinst, Webtasks, and Fn.

In this recipe, we will use the Serverless framework to develop, deploy, invoke, check logs, and finally remove a simple hello world Lambda function on the AWS cloud platform.

Getting ready

Two dependencies are needed for the Serverless framework: Node.js and the AWS CLI. To install the AWS CLI, you may refer to the Your first Lambda with AWS CLI recipe. You can install Node.js using a package manager, as described at https://nodejs.org/en/download/package-manager.

You need to create an IAM user for Serverless in AWS. A common practice is to name the user serverless-admin and give it administrator permissions. Creating users with administrator access is not a good security practice, but it is currently the easiest way to work with Serverless; be careful about how you store and use these credentials.

How to do it...

Let us create a simple Lambda using the Serverless framework:

  1. Install Serverless in your machine using npm:
npm install -g serverless
  1. Configure Serverless with user credentials:
serverless config credentials --provider aws --key <access key> --secret <secret access key> --profile serverless-admin

You should get a success message stating that the keys were stored under the serverless-admin profile.

The sls command is shorthand for the serverless command.
  1. Create a Lambda function based on Java and Maven:
sls create --template aws-java-maven --path hello-world-java-maven

It creates a hello-world-java-maven folder, with pom.xml and serverless.yml files, and the src folder. You may open this Maven project in the IDE of your choice.

As you can see, Serverless has created a bit more than a simple hello world: it takes care of most of the things we did manually earlier, including creating a role, setting the memory, and setting the timeout.

Add a user profile and, optionally, a region under the provider section of serverless.yml. The region can be omitted if you are using the default region.

Build the JAR file with:

mvn clean package
  1. Deploy the JAR file to AWS:
sls deploy -v

You can log in to the AWS Management Console and verify the new Lambda function. From the deployment log statements, you can see that the Serverless framework internally makes use of CloudFormation; you can verify this from the CloudFormation section of the Management Console as well.

  1. Invoke the function from sls:
sls invoke -f hello -l

The -f option specifies the function name, and -l specifies that logs should be printed to the terminal. The function being invoked, hello, is defined in the serverless.yml file. You can see the output and logs in the terminal.

  1. Check the logs from the CLI:
sls logs -f hello -t

The -f option specifies the function name and -t tails the logs. You can now run the invoke command from another terminal and see the logs being printed.

  1. Now, clean up everything:
sls remove
  1. Log in to AWS Management console and verify that everything is cleaned up.

How it works...

The Serverless framework internally makes use of AWS CloudFormation for provisioning AWS resources. You can log in to the Management Console, go to the CloudFormation service, select the stack named hello-world-java-maven-dev, and click on the Template tab to view the complete CloudFormation template.

You can further click on the View/Edit template in Designer option to see a visual, designer view of the CloudFormation template that the Serverless framework created for our example.

There's more...

The Serverless framework is part of the Serverless Platform from serverless.com. The other two components of the platform are the Serverless Dashboard and the Event Gateway. The Serverless framework also integrates well with other processes and tools, such as CI and CD.

See also


Key benefits

  • Build serverless applications with AWS Lambda, AWS CloudFormation and AWS CloudWatch
  • Perform data analytics and natural language processing (NLP) on the AWS serverless platform
  • Explore various design patterns and best practices involved in serverless computing

Description

Managing physical servers will be a thing of the past once you’re able to harness the power of serverless computing. If you’re already prepped with the basics of serverless computing, Serverless Programming Cookbook will help you take the next step ahead. This recipe-based guide provides solutions to problems you might face while building serverless applications. You'll begin by setting up Amazon Web Services (AWS), the primary cloud provider used for most recipes. The next set of recipes will cover various components to build a Serverless application including REST APIs, database, user management, authentication, web hosting, domain registration, DNS management, CDN, messaging, notifications and monitoring. The book also introduces you to the latest technology trends such as Data Streams, Machine Learning and NLP. You will also see patterns and practices for using various services in a real world application. Finally, to broaden your understanding of Serverless computing, you'll also cover getting started guides for other cloud providers such as Azure, Google Cloud Platform and IBM cloud. By the end of this book, you’ll have acquired the skills you need to build serverless applications efficiently using various cloud offerings.

Who is this book for?

For developers looking for practical solutions to common problems while building a serverless application, this book provides helpful recipes. To get started with this intermediate-level book, knowledge of basic programming is a must.

What you will learn

  • Understand serverless computing in AWS and explore services from other clouds
  • Develop full-stack apps with API Gateway, Cognito, Lambda and DynamoDB
  • Web hosting with S3, CloudFront, Route 53 and AWS Certificate Manager
  • SQS and SNS for effective communication between microservices
  • Monitoring and troubleshooting with CloudWatch logs and metrics
  • Explore Kinesis Streams, Amazon ML models and Alexa Skills Kit

Product Details

Publication date : Jan 31, 2019
Length : 490 pages
Edition : 1st
Language : English
ISBN-13 : 9781788621533
Vendor : Amazon

Table of Contents

11 Chapters

  1. Getting Started with Serverless Computing on AWS
  2. Building Serverless REST APIs with API Gateway
  3. Data Storage with Amazon DynamoDB
  4. Application Security with Amazon Cognito
  5. Web Hosting with S3, Route53, and CloudFront
  6. Messaging and Notifications with SQS and SNS
  7. Redshift, Amazon ML, and Alexa Skills
  8. Monitoring and Alerting with Amazon CloudWatch
  9. Serverless Programming Practices and Patterns
  10. Other Cloud Providers
  11. Other Books You May Enjoy
