Azure Serverless Computing Cookbook: Build applications hosted on serverless architecture using Azure Functions

Praveen Kumar Sreeram
4.5 (2 Ratings)
Paperback | Aug 2017 | 332 pages | 1st Edition
eBook: $35.98 (list price $39.99)
Paperback: $48.99
Subscription: free trial, then renews at $19.99 per month


Azure Serverless Computing Cookbook

Accelerate Your Cloud Application Development Using Azure Function Triggers and Bindings

In this chapter, we will cover the following recipes:

  • Building a backend Web API using HTTP triggers
  • Persisting employee details using Azure Storage table output bindings
  • Saving the profile images to Queues using Queue output bindings
  • Storing the image in Azure Blob storage
  • Cropping an image using ImageResizer trigger

Introduction

Every software application needs backend components that handle the business logic and store data in some kind of storage, such as a database or filesystem. Each of these backend components can be built with different technologies. Azure's serverless offering, Azure Functions, also lets us develop these backend APIs.

Azure Functions provides many out-of-the-box templates that solve most of the common problems, such as connecting to storage, building Web APIs, and resizing images. In this chapter, we will learn how to use these built-in templates. Along with the concepts related to Azure serverless computing, we will also implement a solution to a basic domain problem: creating the components any organization needs to manage internal employee information.

The following is a simple diagram that helps you understand what we are going to achieve in this chapter:

Building a backend Web API using HTTP triggers

We will use Azure's serverless architecture to build a Web API using HTTP triggers. These HTTP triggers can be consumed by any frontend application capable of making HTTP calls.

Getting ready

Let's start our journey of understanding Azure serverless computing using Azure Functions by creating a basic backend Web API that responds to HTTP requests:

We will be using C# as the programming language throughout the book.

How to do it…

  1. Navigate to the Function App listing page and choose the function app to which you would like to add a new function.
  2. Create a new function by clicking on the + icon as shown in the following screenshot:
  3. If this is a brand new function app, clicking on the + icon in the preceding step shows the Get started quickly with a premade function page. Click on the create your own custom function link to navigate to the page that lists all the built-in templates for creating Azure Functions.
  4. In the Choose a template below or go to the quickstart section, choose HTTPTrigger-CSharp as shown in the following screenshot to create a new HTTP trigger function:
  5. Provide a meaningful name. For this example, I have used RegisterUser as the name of the Azure Function.
  6. In the Authorization level drop-down, choose the Anonymous option as shown in the following screenshot. We will learn more about all the authorization levels in Chapter 9, Implement Best Practices for Azure Functions:
  7. Once you have provided the name and chosen the Authorization level, click on the Create button to create the HTTP trigger function.
  8. As soon as the function is created, all the required code and configuration files are generated automatically and the run.csx file opens for editing. Remove the default code and replace it with the following code:
        using System.Net;

        public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
        {
            string firstname = null, lastname = null;

            // Read the request body and pick up the firstname and lastname fields
            dynamic data = await req.Content.ReadAsAsync<object>();
            firstname = firstname ?? data?.firstname;
            lastname = data?.lastname;

            return (lastname + firstname) == null
                ? req.CreateResponse(HttpStatusCode.BadRequest,
                    "Please pass a name on the query string or in the request body")
                : req.CreateResponse(HttpStatusCode.OK,
                    "Hello " + firstname + " " + lastname);
        }
  9. Save the changes by clicking on the Save button available just above the code editor.
  10. Let's test the RegisterUser function using the Test console. Click on the tab named Test as shown in the following screenshot to open the Test console:
  11. Enter values for firstname and lastname in the Request body section as shown in the following screenshot:

Please make sure you select POST in the HTTP method drop-down.

  12. Once you have reviewed the input parameters, click on the Run button available at the bottom of the Test console as shown in the following screenshot:
  13. If the request payload is passed correctly with all the required parameters, you will see Status 200 OK, and the output in the Output window will be as shown in the preceding screenshot.

How it works…

We have created our first basic Azure Function using an HTTP trigger and made a few modifications to the default code. The code accepts the firstname and lastname parameters and returns a Hello {firstname} {lastname} message as the response. We have also learnt how to test the HTTP trigger function right from the Azure Management portal.
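Because we chose the Anonymous authorization level, any client capable of issuing an HTTP POST can call the function. The following is a minimal sketch of such a client (the function URL is a placeholder, not a real endpoint; replace it with the URL of your own function app):

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class Program
    {
        static async Task Main()
        {
            // Placeholder URL: replace with your own function app's RegisterUser endpoint
            var functionUrl = "https://<your-function-app>.azurewebsites.net/api/RegisterUser";

            using (var client = new HttpClient())
            {
                // The same JSON body that we submitted from the Test console
                var payload = new StringContent(
                    "{ \"firstname\": \"Bill\", \"lastname\": \"Gates\" }",
                    Encoding.UTF8,
                    "application/json");

                var response = await client.PostAsync(functionUrl, payload);
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }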

For the sake of simplicity, I didn't perform any validation of the input parameters. Please make sure that you validate all input parameters in applications running in your production environment.
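As a minimal sketch of what such validation might look like (these checks are not part of the recipe's code; they are only an example), the function could reject empty values right after reading the request body:

    // Hypothetical checks, placed right after firstname and lastname are read in Run
    if (string.IsNullOrWhiteSpace(firstname) || string.IsNullOrWhiteSpace(lastname))
    {
        return req.CreateResponse(HttpStatusCode.BadRequest,
            "Both firstname and lastname are required.");
    }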

See also

  • The Enabling authorization for function apps recipe in Chapter 9, Implement Best Practices for Azure Functions

Persisting employee details using Azure Storage table output bindings

In the previous recipe, you learnt how to create an HTTP trigger and accept input parameters. Let's now work on something more interesting: storing the input data in a persistent medium. Azure Functions lets us store data in many ways. For this example, we will store the data in Azure Table storage.

Getting ready

In this recipe, you will learn how easy it is to integrate an HTTP trigger with the Azure Table storage service using output bindings. The HTTP trigger function receives data from multiple sources and stores the user profile data in a storage table named tblUserProfile.

How to do it...

  1. Navigate to the Integrate tab of the RegisterUser HTTP trigger function.
  2. Click on the New Output button, select Azure Table Storage, and then click on the Select button:
  3. Once you click on the Select button, you will be prompted to configure the following settings of the Azure Table storage output binding:
    • Table parameter name: This is the name of the parameter that you will use in the Run method of the Azure Function. For this example, provide objUserProfileTable as the value.
    • Table name: A new table will be created in Azure Table storage to persist the data. If the table doesn't exist, Azure automatically creates one for you! For this example, provide tblUserProfile as the table name.
    • Storage account connection: If you don't see a Storage account connection string, click on new (shown in the following screenshot) to create a new one or to choose an existing storage account.
    • The Azure Table storage output binding should look as shown in the following screenshot:
  4. Click on Save to save the changes.
  5. Navigate to the code editor by clicking on the function name and paste the following code:
        #r "Microsoft.WindowsAzure.Storage"
using System.Net;
using Microsoft.WindowsAzure.Storage.Table;

public static async Task<HttpResponseMessage>
Run(HttpRequestMessage req,TraceWriter
log,CloudTable objUserProfileTable)
{
dynamic data = await
req.Content.ReadAsAsync<object>();
string firstname= data.firstname;
string lastname=data.lastname;

UserProfile objUserProfile = new UserProfile(firstname,
lastname);
TableOperation objTblOperationInsert =
TableOperation.Insert(objUserProfile);
objUserProfileTable.Execute(objTblOperationInsert);

return req.CreateResponse(HttpStatusCode.OK,
"Thank you for Registering..");
}

public class UserProfile : TableEntity
{
public UserProfile(string lastName, string firstName)
{
this.PartitionKey = "p1";
this.RowKey = Guid.NewGuid().ToString();;
this.FirstName = firstName;
this.LastName = lastName;
}
public UserProfile() { }
public string FirstName { get; set; }
public string LastName { get; set; }
}
  6. Let's execute the function by clicking on the Run button of the Test tab, passing the firstname and lastname parameters in the Request body as shown in the following screenshot:
  7. If everything went well, you should get a Status 200 OK message in the Output box as shown in the preceding screenshot. Let's navigate to Azure Storage Explorer and view the table storage to see whether the table named tblUserProfile was created successfully:

How it works...

Azure Functions allows us to easily integrate with other Azure services just by adding an output binding to a trigger. For this example, we have integrated the HTTP trigger with an Azure Storage table binding and configured the storage account by providing the storage connection string and the Azure Table storage table in which we would like to create a record for each HTTP request received by the HTTP trigger.

We have also added an additional parameter named objUserProfileTable, of type CloudTable, to the Run method for handling the table storage. We can perform all the operations on Azure Table storage using objUserProfileTable.
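Behind the scenes, the output binding you configure in the Integrate tab is saved to the function's function.json file. The following is a sketch of what just the table output binding entry could look like (the connection value shown here is an assumption; the portal inserts the name of the app setting it generated for your storage account):

    {
        "type": "table",
        "direction": "out",
        "name": "objUserProfileTable",
        "tableName": "tblUserProfile",
        "connection": "AzureWebJobsStorage"
    }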

For the sake of explanation, the input parameters are not validated in the code sample. However, in your production environment, it's important that you validate them before storing them in any kind of persistent medium.

We have also created a UserProfile object, filled it with the values received in the request, and then passed it to a table operation. You can learn more about handling operations on the Azure Table storage service at https://docs.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-tables.

Understanding more about Storage Connection

When you create a new storage connection (refer to the third step of the How to do it... section of this recipe), a new App setting is created as shown in the following screenshot:

You can navigate to the App settings by clicking on Application settings in the Platform features tab as shown in the following screenshot:

What is Azure Table storage service?

Partition key and row key

The primary key of an Azure Table storage table has two parts:

  • Partition key: Azure Table storage records are classified and organized into partitions. Every record in a partition has the same partition key (p1 in our example).
  • Row key: A unique value must be assigned to each row.

A clustered index is created on the partition key and row key values to improve query performance.
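To illustrate how the two keys are used together, the following is a small sketch (not part of the recipe) that retrieves one of the stored UserProfile entities by its partition key and row key, using the same Microsoft.WindowsAzure.Storage.Table types as the recipe:

    // Hypothetical lookup: fetch a single entity by its PartitionKey and RowKey
    string rowKey = "<row key of the record to fetch>";   // placeholder value
    TableOperation retrieve = TableOperation.Retrieve<UserProfile>("p1", rowKey);
    TableResult result = objUserProfileTable.Execute(retrieve);

    UserProfile profile = result.Result as UserProfile;
    if (profile != null)
    {
        log.Info($"Found user: {profile.FirstName} {profile.LastName}");
    }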

There's more...

The following is the very first line of code in this recipe:

#r "Microsoft.WindowsAzure.Storage"

The preceding line of code instructs the function runtime to include a reference to the specified assembly in the current context.
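The same directive can be used for other assemblies that the runtime makes available but does not reference by default. For instance (these particular references are only an illustration; apart from the storage assembly, they are not required by this recipe), a run.csx file might start like this:

    // Illustration only: referencing additional assemblies in a run.csx file.
    // #r directives must appear before any using statements.
    #r "Microsoft.WindowsAzure.Storage"
    #r "Newtonsoft.Json"

    using Microsoft.WindowsAzure.Storage.Table;
    using Newtonsoft.Json;

Note that referencing Newtonsoft.Json this way uses the version that ships with the runtime; the next recipe instead pins a specific version through a project.json file.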

Saving the profile images to Queues using Queue output bindings

In the previous recipe, you learnt how to receive two string parameters, firstname and lastname, in the Request body and store them in Azure Table storage. In this recipe, you will learn how to receive the URL of a profile image and save it to a queue so that it can later be processed and stored in the Blob container of an Azure Storage account.

We could have processed the downloaded user profile image in the Persisting employee details using Azure Storage table output bindings recipe. However, keeping in mind the size of the profile pictures, processing images on the fly within the HTTP request might hinder the performance of the function. For that reason, we will just grab the URL of the profile picture and store it in a Queue, and later we can process the image and store it in a Blob.

Getting ready

We will be updating the code of the RegisterUser function that we have used in the previous recipes.

How to do it…

  1. Navigate to the Integrate tab of the RegisterUser HTTP trigger function.
  2. Click on the New Output button, select Azure Queue Storage, and then click on the Select button.
  3. Provide the following parameters in the Azure Queue Storage output settings:
    • Queue name: Set the value of the Queue name to userprofileimagesqueue
    • Storage account connection: Make sure that you select the right storage account in the Storage account connection field
    • Message parameter name: Set the name of the parameter to objUserProfileQueueItem; it will be used in the Run method
  4. Click on Save to create the new output binding.
  5. In this recipe, we will look at another approach to grabbing the request parameters: we will use the Newtonsoft.Json library to parse the JSON data. Let's navigate to the View files tab as shown in the following screenshot:
  6. As shown in the preceding screenshot, click on Add to add a new file. Make sure that you name it project.json as shown in the preceding screenshot.
  7. Once the file is created, add the following code to the project.json file. The following code adds a reference to the Newtonsoft.Json library:
        {
            "frameworks": {
                "net46": {
                    "dependencies": {
                        "Newtonsoft.Json": "10.0.2"
                    }
                }
            }
        }
  8. Navigate back to the code editor by clicking on the function name (RegisterUser in this example) and paste the following code:
        #r "Microsoft.WindowsAzure.Storage"
using System.Net;
using Microsoft.WindowsAzure.Storage.Table;
using Newtonsoft.Json;
public static void Run(HttpRequestMessage req,
TraceWriter log,
CloudTable
objUserProfileTable,
out string
objUserProfileQueueItem
)
{
var inputs = req.Content.ReadAsStringAsync().Result;
dynamic inputJson = JsonConvert.DeserializeObject<dynamic>
(inputs);

string firstname= inputJson.firstname;
string lastname=inputJson.lastname;
string profilePicUrl = inputJson.ProfilePicUrl;

objUserProfileQueueItem = profilePicUrl;
UserProfile objUserProfile = new UserProfile(firstname,
lastname, profilePicUrl);
TableOperation objTblOperationInsert =
TableOperation.Insert(objUserProfile);

objUserProfileTable.Execute(objTblOperationInsert);
}

public class UserProfile : TableEntity
{
public UserProfile(string lastname, string firstname,
string profilePicUrl)
{
this.PartitionKey = "p1";
this.RowKey = Guid.NewGuid().ToString();
this.FirstName = firstname;
this.LastName = lastname;
this.ProfilePicUrl = profilePicUrl;
}
public UserProfile() { }
public string FirstName { get; set; }
public string LastName { get; set; }
public string ProfilePicUrl {get; set;}
}
  9. Click on Save to save the code changes in the code editor of the run.csx file.
  10. Let's test the code by adding another parameter, ProfilePicUrl, to the Request body shown as follows, then click on the Run button in the Test tab of the Azure Function code editor window. The image used in the following JSON may no longer exist when you are reading this book, so please make sure that you provide a valid image URL.
        {
            "firstname": "Bill",
            "lastname": "Gates",
            "ProfilePicUrl": "https://upload.wikimedia.org/wikipedia/commons/1/19/Bill_Gates_June_2015.jpg"
        }
  11. If everything goes fine, you will see the Status: 200 OK message, and the image URL that you passed as an input parameter in the Request body will be created as a queue message in the Azure Storage Queue service. Let's navigate to Azure Storage Explorer and view the Queue named userprofileimagesqueue, which is the Queue name that we provided in step 3. The following is a screenshot of the Queue message that was created:

How it works…

In this recipe, we have added a Queue message output binding and made the following changes to the code:

  • Added a reference to the Newtonsoft.Json NuGet package in the project.json file
  • Added a new out string parameter named objUserProfileQueueItem, which binds the URL of the profile picture as the content of a queue message
  • Made the Run method synchronous by removing async, because async methods don't allow out parameters (see the sketch following this list for an asynchronous alternative)
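If you would rather keep the method asynchronous, one alternative is to bind the queue output to an IAsyncCollector<string> instead of an out parameter. The following is only a sketch of how the signature and the queue write could change (it is not the approach used in this recipe); the table insert logic would stay the same:

    // Sketch: an async-friendly alternative to the out parameter.
    // The parameter name must still match the Message parameter name configured in the binding.
    public static async Task Run(HttpRequestMessage req,
                                 TraceWriter log,
                                 CloudTable objUserProfileTable,
                                 IAsyncCollector<string> objUserProfileQueueItem)
    {
        dynamic inputJson = JsonConvert.DeserializeObject<dynamic>(
            await req.Content.ReadAsStringAsync());

        // AddAsync enqueues the message instead of assigning an out parameter
        await objUserProfileQueueItem.AddAsync((string)inputJson.ProfilePicUrl);

        // ... the table insert shown earlier remains unchanged
    }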

There's more…

The project.json file contains the references to all the external NuGet packages that we may use in the Azure Function.

At the time of writing, the Azure Functions runtime only supports .NET Framework 4.6.

See also

  • The Persisting employee details using Azure Storage table output bindings recipe

Storing the image in Azure Blob storage

Let's learn how to invoke an Azure Function when a new queue item is added to the Azure Storage Queue service. Each message in the Queue is the URL of the profile picture of a user, which will be processed by the Azure Function and stored as a Blob in the Azure Blob storage service.

Getting ready

In the previous recipe, we learnt how to create Queue output bindings. In this recipe, you will grab the URL from the Queue, download the image it points to into a byte array, and then write it to a Blob.

This recipe is a continuation of the previous recipes. Please make sure that you complete them first.

How to do it...

  1. Create a new Azure Function by choosing QueueTrigger-CSharp from the templates.
  2. Provide the following details after choosing the template:
    • Name your function: Provide a meaningful name such as CreateProfilePictures.
    • Queue name: The name of the Queue that should be monitored by the Azure Function. Our previous recipe created a new item in the userprofileimagesqueue Queue for each valid request coming to the HTTP trigger (named RegisterUser). For each new message in this Queue storage, the CreateProfilePictures trigger is executed automatically.
    • Storage account connection: The connection of the storage account where the Queue is located.
  3. Review all the details and click on Create to create the new function.
  4. Navigate to the Integrate tab, click on New Output, choose Azure Blob Storage, and then click on the Select button.
  5. In the Azure Blob Storage output section, provide the following:
    • Blob parameter name: Set it to outputBlob
    • Path: Set it to userprofileimagecontainer/{rand-guid}
    • Storage account connection: Choose the storage account where you would like to save the Blobs:
  6. Once you have provided all the preceding details, click on the Save button to save all the changes.
  7. Replace the default code of the run.csx file with the following code:
        using System;

        public static void Run(Stream outputBlob, string myQueueItem, TraceWriter log)
        {
            byte[] imageData = null;

            // Download the image that the queue message points to
            using (var wc = new System.Net.WebClient())
            {
                imageData = wc.DownloadData(myQueueItem);
            }

            // Write the downloaded bytes to the output Blob
            outputBlob.Write(imageData, 0, imageData.Length);
        }
  8. Click on the Save button to save the changes.
  9. Let's go back to the RegisterUser function and test it by providing the firstname, lastname, and ProfilePicUrl fields as we did in the Saving the profile images to Queues using Queue output bindings recipe.
  10. Now, navigate to Azure Storage Explorer and look at the Blob container userprofileimagecontainer. You will find a new Blob as shown in the following screenshot:
  11. You can view the image in any tool (such as MS Paint or Internet Explorer).

How it works...

We have created a Queue trigger that gets executed whenever a new message arrives in the Queue. Once it finds a new Queue message, it reads the message which, as we know, is the URL of a profile picture. The function makes a web client request, downloads the image data in the form of a byte array, and then writes the data into the Blob that is configured as the output Blob.
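The recipe uses WebClient for simplicity. As a sketch of an alternative (an assumption on my part, not part of the book's code), the same work could be done asynchronously with HttpClient, streaming the download straight into the output Blob:

    // Sketch only: an asynchronous variant of the Queue trigger using HttpClient
    using System;
    using System.Net.Http;

    private static HttpClient httpClient = new HttpClient();

    public static async Task Run(Stream outputBlob, string myQueueItem, TraceWriter log)
    {
        // myQueueItem contains the profile picture URL queued by RegisterUser
        using (var imageStream = await httpClient.GetStreamAsync(myQueueItem))
        {
            await imageStream.CopyToAsync(outputBlob);
        }
    }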

There's more...

The {rand-guid} expression in the Path generates a new GUID every time the trigger fires, and that GUID is used to name the Blob that gets created.

It is mandatory to specify the Blob container name in the Path parameter while configuring the Blob storage output binding. Azure Functions creates the container automatically if it doesn't exist.
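For reference, the bindings behind this function could look roughly like the following sketch of function.json (the connection values are assumptions; the portal inserts the names of the app settings it generated for your storage account):

    {
        "bindings": [
            {
                "type": "queueTrigger",
                "direction": "in",
                "name": "myQueueItem",
                "queueName": "userprofileimagesqueue",
                "connection": "AzureWebJobsStorage"
            },
            {
                "type": "blob",
                "direction": "out",
                "name": "outputBlob",
                "path": "userprofileimagecontainer/{rand-guid}",
                "connection": "AzureWebJobsStorage"
            }
        ]
    }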

You can use Queue messages only when you would like to store messages of up to 64 KB in size. If you would like to store messages larger than 64 KB, you need to use Azure Service Bus.

See also

  • The Building a backend Web API using HTTP triggers recipe
  • The Persisting employee details using Azure Storage table output bindings recipe
  • The Saving the profile images to Queues using Queue output bindings recipe

Cropping an image using ImageResizer trigger

In recent times, with the evolution of smartphones with high-end cameras, it has become easy to capture high-quality pictures of huge sizes. While it's good to have high-quality pictures to refresh our memories, as an application developer or administrator it becomes painful to manage storage when your website is popular and you expect most of your users to register with a high-quality profile picture. So, it makes sense to use a library that can reduce the size of these high-quality images and crop them without losing the aspect ratio, so that the quality of the image is not noticeably reduced.

In this recipe, we will learn how to crop an image and reduce its size without losing quality, using one of the built-in Azure Function templates named ImageResizer.

Getting ready

In this recipe, you will learn how to use a library named ImageResizer. We will use the library to resize the image to the required dimensions. For the sake of simplicity, we will resize the image to the following sizes:

  • Medium: 200 x 200 pixels
  • Small: 100 x 100 pixels

How to do it...

  1. Create a new Azure Function by choosing Samples in the Scenario drop-down as shown in the following screenshot:
  2. Select the ImageResizer-CSharp template as shown in the preceding screenshot.
  3. Once you have selected the template, the portal prompts you to provide the following parameters:
    • Name your Function: Provide a meaningful name. For this example, I have provided CropProfilePictures.
    • Azure Blob Storage trigger (image):
      • Path: Provide the path of the container (in our case, userprofileimagecontainer) that contains all the Blobs created by the Queue trigger CreateProfilePictures in the previous recipe
      • Storage account connection: Select the connection string of the storage account where the container and Blobs are stored
    • Azure Blob Storage output (imageMedium):
      • Path: Provide the name of the container where the resized medium (200 x 200) images are to be stored; in this case, userprofileimagecontainer-md.
      • Storage account connection: Select the connection string of the storage account where the Blobs are stored.
    • Azure Blob Storage output (imageSmall):
      • Path: Provide the name of the container where the resized small (100 x 100) images are to be stored; in this case, userprofileimagecontainer-sm.
      • Storage account connection: Select the connection string of the storage account where the Blobs are stored.
  4. Review all the details and click on Create as shown in the following screenshot:
  5. Fortunately, the ImageResizer Azure Function template provides most of the code needed to resize the image; I just made a few minor tweaks. Replace the default code with the following code, which should be self-explanatory:
        using ImageResizer;

        public static void Run(Stream image, Stream imageSmall, Stream imageMedium)
        {
            var imageBuilder = ImageResizer.ImageBuilder.Current;

            // Produce the small (100 x 100) version
            var size = imageDimensionsTable[ImageSize.Small];
            imageBuilder.Build(image, imageSmall,
                new ResizeSettings(size.Item1, size.Item2, FitMode.Max, null), false);

            // Rewind the input stream and produce the medium (200 x 200) version
            image.Position = 0;
            size = imageDimensionsTable[ImageSize.Medium];
            imageBuilder.Build(image, imageMedium,
                new ResizeSettings(size.Item1, size.Item2, FitMode.Max, null), false);
        }

        public enum ImageSize
        {
            Small, Medium
        }

        private static Dictionary<ImageSize, Tuple<int, int>> imageDimensionsTable =
            new Dictionary<ImageSize, Tuple<int, int>>()
        {
            { ImageSize.Small, Tuple.Create(100, 100) },
            { ImageSize.Medium, Tuple.Create(200, 200) }
        };
  6. Let's run a test on the RegisterUser function by submitting a sample request with firstname, lastname, and a ProfilePicUrl. I have used the same inputs that we used in the previous recipes.
  7. In Azure Storage Explorer, I can see two new Blob containers, userprofileimagecontainer-md and userprofileimagecontainer-sm, as shown in the following screenshot:
  8. I can even view the corresponding cropped versions in each of those containers. The following are the three versions of the image that we used as input:

How it works...

We have created a new function using one of the samples named ImageResizer that the Azure Function templates provide.

The ImageResizer template takes its input from the userprofileimagecontainer Blob container, where the original Blobs reside. Whenever a new Blob is created in userprofileimagecontainer, the function automatically creates two resized versions, one in the userprofileimagecontainer-md container and one in the userprofileimagecontainer-sm container.
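The bindings generated for CropProfilePictures might look roughly like the following sketch (the {name} expression and the connection values are assumptions based on portal defaults; check your own function.json for the exact values):

    {
        "bindings": [
            {
                "type": "blobTrigger",
                "direction": "in",
                "name": "image",
                "path": "userprofileimagecontainer/{name}",
                "connection": "AzureWebJobsStorage"
            },
            {
                "type": "blob",
                "direction": "out",
                "name": "imageSmall",
                "path": "userprofileimagecontainer-sm/{name}",
                "connection": "AzureWebJobsStorage"
            },
            {
                "type": "blob",
                "direction": "out",
                "name": "imageMedium",
                "path": "userprofileimagecontainer-md/{name}",
                "connection": "AzureWebJobsStorage"
            }
        ]
    }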

The following is a simple diagram that shows how the execution of the functions is triggered in a chain:

See also

  • The Building a backend Web API using HTTP triggers recipe
  • The Persisting employee details using Azure Storage table output bindings recipe
  • The Saving the profile images to Queues using Queue output bindings recipe
  • The Storing the image in Azure Blob storage recipe

Key benefits

  • Enhance Azure Functions with continuous deployment using Visual Studio Team Services
  • Learn to deploy and manage cost-effective and highly available serverless applications using Azure Functions
  • This recipe-based guide will teach you to build a robust serverless environment

Description

Microsoft provides a solution to easily run small segments of code in the cloud with Azure Functions. Azure Functions provides solutions for processing data, integrating systems, and building simple APIs and microservices. The book starts with intermediate-level recipes on serverless computing along with some use cases on the benefits and key features of Azure Functions. Then, we'll deep dive into the core aspects of Azure Functions, such as the services it provides, how you can develop and write Azure Functions, and how to monitor and troubleshoot them. Moving on, you'll get practical recipes on integrating DevOps with Azure Functions, and providing continuous integration and continuous deployment with Visual Studio Team Services. It also provides hands-on steps and tutorials based on real-world serverless use cases to guide you through configuring and setting up your serverless environments with ease. Finally, you'll see how to manage Azure Functions, providing enterprise-level security and compliance to your serverless code architecture. By the end of this book, you will have all the skills required to work with serverless code architecture, providing continuous delivery to your users.

Who is this book for?

If you are a cloud administrator, architect, or developer who wants to build scalable systems and deploy serverless applications with Azure Functions, then this book is for you. Prior knowledge of and hands-on experience with the core services of Microsoft Azure is required.

What you will learn

  • Develop the different event-based handlers supported by the serverless architecture of the Microsoft Cloud Platform, Azure
  • Integrate Azure Functions with other Azure services to develop enterprise-level applications
  • Get to know the best practices for organizing and refactoring the code within Azure Functions
  • Test, troubleshoot, and monitor Azure Functions to deliver high-quality, reliable, and robust cloud-centric applications
  • Automate mundane tasks at various levels, from development to deployment and maintenance
  • Learn how to develop stateful serverless applications and self-healing jobs using Durable Functions

Product Details

Publication date : Aug 17, 2017
Length : 332 pages
Edition : 1st
Language : English
ISBN-13 : 9781788390828
Vendor : Microsoft




Table of Contents

10 Chapters
Accelerate Your Cloud Application Development Using Azure Function Triggers and Bindings
Working with Notifications Using SendGrid and Twilio Services
Seamless Integration of Azure Functions with Other Azure Services
Understanding the Integrated Developer Experience of Visual Studio Tools for Azure Functions
Exploring Testing Tools for the Validation of Azure Functions
Monitoring and Troubleshooting Azure Serverless Services
Code Reusability and Refactoring the Code in Azure Functions
Developing Reliable and Durable Serverless Applications Using Durable Functions
Implement Best Practices for Azure Functions
Implement Continuous Integration and Deployment of Azure Functions Using Visual Studio Team Services

Customer reviews

Rating distribution
4.5
(2 Ratings)
5 star 50%
4 star 50%
3 star 0%
2 star 0%
1 star 0%
David BB Aug 22, 2017
5 stars
We are all challenged with the opportunity to migrate more and more applications and databases to the cloud, but there remain many questions which must be vetted before large scale moves can be successfully completed. This book provides much helpful information in planning, preparation and practical migration. ​
Amazon Verified review
mogulman Oct 18, 2018
4 stars
Overall it contains a lot of good information but is very dated even though it is a little over a year old. I'm currently creating a business process app that is very involved. I'm using SQL Server. The internet is littered with bits of information and it is often conflicting. It is nice to see some consolidation. He covers a lot of development using the portal using .csx files. Anyone doing serious development will use Visual Studio. He covers some development in Visual Studio. If he does an update I'd like to see the book focus on Visual Studio development. I'd also like to see development using .NET Core 2.0. I'd like to see using Microsoft Graph, creating small text files and attach them to emails, code sharing in Visual Studio development....
Amazon Verified review
