Amazon S3 Cookbook

Amazon S3 Cookbook: Over 30 hands-on recipes that will get you up and running with Amazon Simple Storage Service (S3) efficiently



Amazon S3 Cookbook

Chapter 1. Managing Common Operations with AWS SDKs

We will cover the basic operations of the AWS SDKs to understand what they can do with Amazon S3, using the official AWS SDK sample application code to create S3 buckets and to upload, list, get, and download objects into and from a bucket.

In this chapter, we will cover:

  • Learning AWS SDK for Java and basic S3 operations with sample code
  • Learning AWS SDK for Node.js and basic S3 operations with sample code
  • Learning AWS SDK for Python and basic S3 operations with sample code
  • Learning AWS SDK for Ruby and basic S3 operations with sample code
  • Learning AWS SDK for PHP and basic S3 operations with sample code

Introduction

Amazon Simple Storage Service (Amazon S3) is a cloud object storage service provided by Amazon Web Services. As Amazon S3 does not have a minimum fee, we just pay for what we store. We can store and retrieve any amount of data, known as objects, in S3 buckets in different geographical regions through the API or one of the several SDKs. The AWS SDKs provide programmatic access to S3, for example, uploading multiple objects, versioning objects, configuring object access lists, and so on.

Amazon Web Services provides the following SDKs at http://aws.amazon.com/developers/getting-started/:

  • AWS SDK for Android
  • AWS SDK for JavaScript (Browser)
  • AWS SDK for iOS
  • AWS SDK for Java
  • AWS SDK for .NET
  • AWS SDK for Node.js
  • AWS SDK for PHP
  • AWS SDK for Python
  • AWS SDK for Ruby

Learning AWS SDK for Java and basic S3 operations with sample code

This section tells you how to configure an IAM user to access S3 and how to install the AWS SDK for Java. It also covers how to create S3 buckets and put and get objects using the sample code, and explains how the sample code works.

Getting ready

AWS SDK for Java is a Java API for AWS and contains the AWS Java library, code samples, and Eclipse IDE support. You can easily build scalable applications that work with Amazon S3, Amazon Glacier, and more.

To get started with AWS SDK for Java, it is necessary to install the following on your development machine:

  • J2SE Development Kit 6.0 or later
  • Apache Maven

How to do it…

First, we set up an IAM user, create a user policy, and attach the policy to the IAM user in the IAM management console in order to securely allow the IAM user to access the S3 bucket. We define access control for the IAM user by configuring the IAM policy. Then, we install the AWS SDK for Java by using Apache Maven and Git.

Tip

For more information about AWS Identity and Access Management (IAM), refer to http://aws.amazon.com/iam/.

There are two ways to install the AWS SDK for Java: one is to get the source code from GitHub, and the other is to use Apache Maven. We use the latter because the official site recommends it and it is much easier.
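
With Maven, installing the SDK amounts to declaring it as a dependency in your project's pom.xml. A minimal sketch of such a dependency follows; the version number is illustrative for the time of writing, so check Maven Central for the current release:

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.10.26</version>
</dependency>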

  1. Sign in to the AWS management console and move to the IAM console at https://console.aws.amazon.com/iam/home.
  2. In the navigation panel, click on Users and then on Create New Users.
  3. Enter the username and select Generate an access key for each user, then click on Create.
  4. Click on Download Credentials and save the credentials. We will use the credentials, composed of an Access Key ID and a Secret Access Key, to access the S3 bucket.
  5. Select the IAM user.
  6. Click on Attach User Policy.
  7. Click on Select Policy Template and then click on Amazon S3 Full Access.
  8. Click on Apply Policy. The policy document that this template attaches is shown after these steps.
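
For reference, the Amazon S3 Full Access template corresponds to a policy document along the following lines, allowing every S3 action on every resource (the exact wording shown in the console may differ slightly):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "*"
    }
  ]
}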

Next, we clone a repository for the S3 Java sample application and run the application by using the Maven command (mvn). First, we set up the AWS credentials to operate on S3, clone the sample application repository from GitHub, and then build and run the sample application using the mvn command:

  1. Create a credentials file and put the access key ID and the secret access key in it. These are the values in the credentials CSV file we downloaded when we created the IAM user in the management console:
    $ vim ~/.aws/credentials
    [default]
    aws_access_key_id = <YOUR_ACCESS_KEY_ID>
    aws_secret_access_key = <YOUR_SECRET_ACCESS_KEY>

    Tip

    Downloading the example code

    You can download the example code files from your account at http://www.packtpub.com for all the Packt Publishing books you have purchased. If you purchased this book elsewhere, you can visit http://www.packtpub.com/support and register to have the files e-mailed directly to you.

  2. Download the sample SDK application:
    $ git clone https://github.com/awslabs/aws-java-sample.git
    $ cd aws-java-sample/
    
  3. Run the sample application:
    $ mvn clean compile exec:java
    

How it works…

The sample code works as follows: initiating the credentials to allow access to Amazon S3, creating and listing a bucket in a region, putting, getting, and listing objects in the bucket, and then finally deleting the objects and then the bucket.

Now, let's run the sample application and observe its output, and then follow the source code step by step.

Then, let's examine the sample code at aws-java-sample/src/main/java/com/amazonaws/samples/S3Sample.java.

The AmazonS3Client constructor instantiates an AWS service client with the default credential provider chain (which includes ~/.aws/credentials). Then, the Region.getRegion method retrieves a region object, and we choose the US West (Oregon) region for the AWS client:

AmazonS3 s3 = new AmazonS3Client();
Region usWest2 = Region.getRegion(Regions.US_WEST_2);
s3.setRegion(usWest2);

Tip

Amazon S3 is available in several regions, and it creates a bucket in the region you specify. For more information about S3 regions, refer to http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region.

The createBucket method creates an S3 bucket with the specified name in the specified region:

s3.createBucket(bucketName);

The listBuckets method lists the buckets in the account; here we print each bucket's name:

for (Bucket bucket : s3.listBuckets()) {
    System.out.println(" - " + bucket.getName());
}

The putObject method uploads an object into the specified bucket; here, the object's data comes from a small file created by the createSampleFile helper, sketched after the call:

s3.putObject(new PutObjectRequest(bucketName, key, createSampleFile()));
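
The createSampleFile helper is not shown in this excerpt; a minimal sketch of what such a helper can look like, assuming it only needs to produce a small temporary text file to upload:

// Requires java.io.File, java.io.FileOutputStream, java.io.IOException,
// java.io.OutputStreamWriter, and java.io.Writer.
private static File createSampleFile() throws IOException {
    // Create a temporary text file with a little sample data to upload.
    File file = File.createTempFile("aws-java-sdk-", ".txt");
    file.deleteOnExit();

    Writer writer = new OutputStreamWriter(new FileOutputStream(file));
    writer.write("abcdefghijklmnopqrstuvwxyz\n");
    writer.write("0123456789012345678901234567890\n");
    writer.close();

    return file;
}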

The getObject method gets the object stored in the specified bucket:

S3Object object = s3.getObject(new GetObjectRequest(bucketName, key));
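
The returned S3Object exposes the object's data as an input stream. One way to print it, sketched here as an illustration rather than quoted from the sample:

// Requires java.io.BufferedReader and java.io.InputStreamReader.
BufferedReader reader = new BufferedReader(
        new InputStreamReader(object.getObjectContent()));
String line;
while ((line = reader.readLine()) != null) {
    System.out.println("    " + line);  // print each line of the object's body
}
reader.close();  // closing also releases the underlying HTTP connection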

The listObjects method returns a list of summary information about the objects in the specified bucket (the request is scoped to the bucket and can also take a key prefix):

ObjectListing objectListing = s3.listObjects(new ListObjectsRequest()
        .withBucketName(bucketName));
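
Each entry in the listing is an S3ObjectSummary; for example, the key and size of every object can be printed like this:

// Requires com.amazonaws.services.s3.model.S3ObjectSummary.
for (S3ObjectSummary objectSummary : objectListing.getObjectSummaries()) {
    System.out.println(" - " + objectSummary.getKey()
            + "  (size = " + objectSummary.getSize() + ")");
}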

The deleteObject method deletes the specified object in the specified bucket.

We clean up the objects before deleting the S3 bucket because a bucket that still contains objects cannot be removed; we need to delete all the objects in the bucket first and then delete the bucket:

s3.deleteObject(bucketName, key);

The deleteBucket method deletes the specified bucket in the region.

The AmazonServiceException class provides error details, for example, the request ID, the HTTP status code, and the AWS error code, for a request that reached the service but was rejected, so the client can examine the failure. The AmazonClientException class, on the other hand, mainly covers errors where the client could not get a response from AWS or could not successfully make the service call, for example, because no network connection was available:

    s3.deleteBucket(bucketName);
} catch (AmazonServiceException ase) {
    System.out.println("Caught an AmazonServiceException, which means your request made it "
            + "to Amazon S3, but was rejected with an error response for some reason.");
    System.out.println("Error Message:    " + ase.getMessage());
    System.out.println("HTTP Status Code: " + ase.getStatusCode());
    System.out.println("AWS Error Code:   " + ase.getErrorCode());
    System.out.println("Error Type:       " + ase.getErrorType());
    System.out.println("Request ID:       " + ase.getRequestId());
} catch (AmazonClientException ace) {
    System.out.println("Caught an AmazonClientException, which means the client encountered "
            + "a serious internal problem while trying to communicate with S3, "
            + "such as not being able to access the network.");
    System.out.println("Error Message: " + ace.getMessage());
}

See also

Learning AWS SDK for Node.js and basic S3 operations with sample code

This section shows you how to install the AWS SDK for Node.js and how to create S3 buckets and put objects using the sample code, and explains how the sample code works.

Getting ready

The AWS SDK for JavaScript is available for browsers and mobile devices, and it also supports Node.js on the backend. Each API call is exposed as a function on the service object.

To get started with AWS SDK for Node.js, it is necessary to install the following on your development machine:

  • Node.js
  • npm (the package manager for Node.js)

How to do it…

Proceed with the following steps to install the packages and run the sample application. The preferred way to install the SDK is to use npm, the package manager for Node.js.

  1. Download the sample SDK application and install its dependencies (the sample declares the aws-sdk package in its package.json, so npm install fetches it):
    $ git clone https://github.com/awslabs/aws-nodejs-sample.git
    $ cd aws-nodejs-sample/
    $ npm install
    
  2. Run the sample application:
    $ node sample.js
    

How it works…

The sample code works as follows: initiating the credentials to allow access to Amazon S3, creating a bucket in a region, and putting an object into the bucket. Unlike the Java sample, this application does not clean up after itself, so make sure that you delete the object and the bucket yourself after running it.
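
One way to clean up afterwards, assuming you have the AWS CLI installed and configured with the same credentials (the sample prints the generated bucket name, shown here as a placeholder):

$ aws s3 rb s3://<bucket-name> --force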

Now, let's run the sample application and observe its output, and then follow the source code step by step.

Now, let's examine the sample code; the path is aws-nodejs-sample/sample.js. The AWS.S3 constructor creates an S3 service client:

var s3 = new AWS.S3();

The createBucket method creates an S3 bucket with the name defined in the bucketName variable; the bucket is created in the US Standard region by default. The putObject method uploads an object, whose key is defined in the keyName variable, into the bucket:

var bucketName = 'node-sdk-sample-' + uuid.v4();
var keyName = 'hello_world.txt';
s3.createBucket({Bucket: bucketName}, function() {
  var params = {Bucket: bucketName, Key: keyName, Body: 'Hello World!'};
  s3.putObject(params, function(err, data) {
    if (err)
      console.log(err)
    else
      console.log("Successfully uploaded data to " + bucketName + "/" + keyName);
  });
});

The whole sample code is as follows:

var AWS = require('aws-sdk');
var uuid = require('node-uuid');
var s3 = new AWS.S3();
var bucketName = 'node-sdk-sample-' + uuid.v4();
var keyName = 'hello_world.txt';
s3.createBucket({Bucket: bucketName}, function() {
  var params = {Bucket: bucketName, Key: keyName, Body: 'Hello World!'};
  s3.putObject(params, function(err, data) { 
    if (err)
      console.log(err)
    else    
      console.log("Successfully uploaded data to " + bucketName + "/" + keyName);
  });
});

See also

Learning AWS SDK for Python and basic S3 operations with sample code

This section shows you how to install the AWS SDK for Python and how to create S3 buckets and put and get objects using the sample code, and explains how the sample code works.

Getting ready

Boto is the Python interface to Amazon Web Services, and all of its features work with Python 2.6 and 2.7. The next major version, which supports Python 3.3, is under way.

To get started with AWS SDK for Python (Boto), it is necessary to install the following on your development machine:

  • Python 2.6 or 2.7
  • Boto (for example, installed via pip)

How to do it…

Proceed with the following steps to install the packages and run the sample application:

  1. Download the sample SDK application:
    $ git clone https://github.com/awslabs/aws-python-sample.git
    $ cd aws-python-sample/
    
  2. Run the sample application:
    $ python s3_sample.py
    

How it works…

The sample code works as follows: initiating the credentials to allow access to Amazon S3, creating a bucket in a region, putting an object into the bucket and getting it back, and then finally deleting the object and the bucket.

Now, let's run the sample application and observe its output, and then follow the source code step by step.

Now, let's examine the sample code; the path is aws-python-sample/s3_sample.py.

The connect_s3 method creates a connection for accessing S3:

s3 = boto.connect_s3()

The create_bucket method creates an S3 bucket with the name defined in the bucket_name variable, in the US Standard region by default:

bucket_name = "python-sdk-sample-%s" % uuid.uuid4()
print "Creating new bucket with name: " + bucket_name
bucket = s3.create_bucket(bucket_name)

Creating a new Key object specifies the key under which new data will be stored in the bucket:

from boto.s3.key import Key 
k = Key(bucket)
k.key = 'python_sample_key.txt'

The delete method deletes the object we stored under that key from the bucket:

k.delete()

The delete_bucket method deletes the bucket:

s3.delete_bucket(bucket_name)

The whole sample code is as follows:

import boto
import uuid
s3 = boto.connect_s3()
bucket_name = "python-sdk-sample-%s" % uuid.uuid4()
print "Creating new bucket with name: " + bucket_name
bucket = s3.create_bucket(bucket_name)
from boto.s3.key import Key
k = Key(bucket)
k.key = 'python_sample_key.txt'
print "Uploading some data to " + bucket_name + " with key: " + k.key
k.set_contents_from_string('Hello World!')
expires_in_seconds = 1800
print "Generating a public URL for the object we just uploaded. This URL will be active for %d seconds" % expires_in_seconds
print
print k.generate_url(expires_in_seconds)
print
raw_input("Press enter to delete both the object and the bucket...")
print "Deleting the object."
k.delete()
print "Deleting the bucket."
s3.delete_bucket(bucket_name)

See also

Learning AWS SDK for Ruby and basic S3 operations with sample code

This section shows you how to install the AWS SDK for Ruby and how to create S3 buckets and put and get objects using the sample code, and explains how the sample code works.

Getting ready

The AWS SDK for Ruby provides a Ruby API for AWS and saves developers from writing complicated code by providing Ruby classes for each service. New users should start with AWS SDK for Ruby V2, as officially recommended.

To get started with AWS SDK for Ruby, it is necessary to install the following on your development machine:

  • Ruby
  • Bundler (used by the sample to install the gems it requires)

How to do it…

Proceed with the following steps to install the packages and run the sample application. We install the stable AWS SDK for Ruby v2 and download the sample code.

  1. Download the sample SDK application:
    $ git clone https://github.com/awslabs/aws-ruby-sample.git
    $ cd aws-ruby-sample/
    
  2. Run the sample application:
    $ bundle install
    $ ruby s3_sample.rb
    

How it works…

The sample code works as follows: initiating the credentials to allow access to Amazon S3, creating a bucket in a region, putting an object into the bucket and generating URLs for it, and then finally deleting the object and the bucket.

Now, let's run the sample application and observe its output, and then follow the source code step by step.

Now, let's examine the sample code; the path is aws-ruby-sample/s3_sample.rb.

The Aws::S3::Resource.new method creates a resource interface for S3 in the given region:

s3 = Aws::S3::Resource.new(region: 'us-west-2')

The bucket.create method creates an S3 bucket with the name defined in the bucket_name variable, in the region the client was configured with (us-west-2 in this sample):

uuid = UUID.new
bucket_name = "ruby-sdk-sample-#{uuid.generate}"
bucket = s3.bucket(bucket_name)
bucket.create

The object.put method uploads a body into the object stored under the ruby_sample_key.txt key in the bucket:

object = bucket.object('ruby_sample_key.txt')
object.put(body: "Hello World!")

The object.public_url method creates a public URL for the object:

puts object.public_url

The object.presigned_url(:get) method creates a presigned URL that grants temporary read access to the object:

puts object.presigned_url(:get)

The bucket.delete! method deletes all the objects in a bucket, and then deletes the bucket:

bucket.delete!

The whole sample code is as follows:

#!/usr/bin/env ruby
require 'rubygems'
require 'bundler/setup'
require 'aws-sdk'
require 'uuid'
s3 = Aws::S3::Resource.new(region: 'us-west-2')
uuid = UUID.new
bucket_name = "ruby-sdk-sample-#{uuid.generate}"
bucket = s3.bucket(bucket_name)
bucket.create
object = bucket.object('ruby_sample_key.txt')
object.put(body: "Hello World!")
puts "Created an object in S3 at:" 
puts object.public_url
puts "\nUse this URL to download the file:"
puts object.presigned_url(:get)
puts "(press any key to delete both the bucket and the object)"
$stdin.getc
puts "Deleting bucket #{bucket_name}"
bucket.delete!

See also

Learning AWS SDK for PHP and basic S3 operations with sample code

This section shows you how to install the AWS SDK for PHP and how to create S3 buckets and put and get objects using the sample code, and explains how the sample code works.

Getting ready

AWS SDK for PHP is a powerful tool for PHP developers to quickly build stable applications.

To get started with AWS SDK for PHP, it is necessary to install the following on your development machine:

  • PHP
  • Composer

It is recommended to use Composer to install AWS SDK for PHP because it is much easier than getting the source code.

How to do it…

Proceed with the following steps to install the packages and run the sample application:

  1. Download the sample SDK application:
    $ git clone https://github.com/awslabs/aws-php-sample.git
    $ cd aws-php-sample/
    
  2. Run the sample application:
    $ php sample.php
    

How it works…

The sample code works as follows: initiating the credentials to allow access to Amazon S3, creating a bucket in a region, putting an object into the bucket and getting it back, and then finally deleting the object and the bucket.

Now, let's run the sample application and observe its output, and then follow the source code step by step.

Now, let's examine the sample code; the path is aws-php-sample/sample.php.

The S3Client::factory method creates an AWS client and is the easiest way to get up and running:

$client = S3Client::factory();

The createBucket method creates an S3 bucket with the specified name; unless a region is configured for the client, the bucket is created in the default US Standard region:

$result = $client->createBucket(array(
    'Bucket' => $bucket
));

The putObject method uploads objects into the bucket:

$key = 'hello_world.txt';
$result = $client->putObject(array(
    'Bucket' => $bucket,
    'Key'    => $key,
    'Body'   => "Hello World!"
));

The getObject method retrieves objects from the bucket:

$result = $client->getObject(array(
    'Bucket' => $bucket,
    'Key'    => $key
));

The deleteObject method removes objects from the bucket:

$result = $client->deleteObject(array(
    'Bucket' => $bucket,
    'Key'    => $key
));

The deleteBucket method deletes the bucket:

$result = $client->deleteBucket(array(
    'Bucket' => $bucket
));

The whole sample code is as follows:

<?php
use Aws\S3\S3Client;
$client = S3Client::factory();
$bucket = uniqid("php-sdk-sample-", true);
echo "Creating bucket named {$bucket}\n";
$result = $client->createBucket(array(
    'Bucket' => $bucket
));
$client->waitUntilBucketExists(array('Bucket' => $bucket));
$key = 'hello_world.txt';
echo "Creating a new object with key {$key}\n";
$result = $client->putObject(array(
    'Bucket' => $bucket,
    'Key'    => $key,
    'Body'   => "Hello World!"
));
echo "Downloading that same object:\n";
$result = $client->getObject(array(
    'Bucket' => $bucket,
    'Key'    => $key
));
echo "\n---BEGIN---\n";
echo $result['Body'];
echo "\n---END---\n\n";
echo "Deleting object with key {$key}\n";
$result = $client->deleteObject(array(
    'Bucket' => $bucket,
    'Key'    => $key
));
echo "Deleting bucket {$bucket}\n";
$result = $client->deleteBucket(array(
    'Bucket' => $bucket
));

See also


Description

Amazon S3 is one of the most famous and trailblazing cloud object storage services: highly scalable, low-latency, and economical. Users only pay for what they use and can store and retrieve any amount of data at any time over the Internet, which attracts Hadoop users who run clusters on EC2. The book starts by showing you how to install several AWS SDKs, such as those for iOS, Java, Node.js, PHP, Python, and Ruby, and shows you how to manage objects. Then, you'll be taught how to use the installed AWS SDKs to develop applications with Amazon S3. Furthermore, you will explore the Amazon S3 pricing model and will learn how to annotate S3 billing with cost allocation tagging. In addition to this, the book covers several practical recipes about how to distribute your content with CloudFront, secure your content with IAM, optimize Amazon S3 performance, and notify S3 events with Lambda. By the end of this book, you will be successfully implementing pro-level practices, techniques, and solutions in Amazon S3.
Product Details

Publication date : Aug 27, 2015
Length : 280 pages
Edition : 1st
Language : English
ISBN-13 : 9781785280702
Vendor : Amazon




Table of Contents

13 Chapters
1. Managing Common Operations with AWS SDKs
2. Hosting a Static Website on Amazon S3 Bucket
3. Calculating Cost with the AWS Simple Monthly Calculator
4. Deploying a Static Website with CloudFormation
5. Distributing Your Contents via CloudFront
6. Securing Resources with Bucket Policies and IAM
7. Sending Authenticated Requests with AWS SDKs
8. Protecting Data Using Server-side and Client-side Encryption
9. Enabling Cross-origin Resource Sharing
10. Managing Object Lifecycle to Lower the Cost
11. S3 Performance Optimization
12. Creating Triggers and Notifying S3 Events to Lambda
Index

Customer reviews

Rating distribution: 3 out of 5 (2 ratings)
5 star: 0%, 4 star: 50%, 3 star: 0%, 2 star: 50%, 1 star: 0%
Andrea, Oct 07, 2015 (rated 4 out of 5)

Amazon Simple Storage Service (Amazon S3) is a cloud object storage service provided by Amazon Web Services. It's a powerful public cloud service, usable with public cloud solutions or as a target for some on-prem servers (like backup software). But it's also becoming a standard interface for object storage, and some on-prem solutions may directly export this service. Compared to other services, this requires a good understanding of the API and the SDK in order to use it the right way. This book helps in those aspects by using practical examples and sample code.

Chapter 1, Managing Common Operations with AWS SDKs, introduces what AWS SDKs can do with Amazon S3 by using the official AWS SDK sample application code to create S3 buckets and upload, list, get, and download objects into and from a bucket.

Chapter 2, Hosting a Static Website on Amazon S3 Bucket, covers a use case of hosting a static website's contents by using a custom domain on Amazon S3 instead of using web servers such as Apache or Nginx on EC2, through a management console (GUI) and AWS CLI (command line).

Chapter 3, Calculating Cost with the AWS Simple Monthly Calculator, talks about calculating the total cost of storing data and delivering objects through S3, based on a couple of scenarios.

Chapter 4, Deploying a Static Website with CloudFormation, comes back to the website use case by deploying a template of a static website with CloudFormation via the S3 console and using AWS CLI.

Chapter 5, Distributing Your Contents via CloudFront, talks about delivering a static website on S3 buckets through CloudFront edge locations (CDN), configuring S3 buckets as an origin store to minimize network latency.

Chapter 6, Securing Resources with Bucket Policies and IAM, covers managing access to resources such as buckets and objects, configuring bucket policies, and IAM users, groups, and policies.

Chapter 7, Sending Authenticated Requests with AWS SDKs, talks about making requests using IAM and federated users' temporary credentials with AWS SDKs to grant permissions to temporarily access Amazon S3 resources.

Chapter 8, Protecting Data Using Server-side and Client-side Encryption, deals with encrypting and decrypting your data using server-side and client-side encryption to securely upload and download your contents.

Chapter 9, Enabling Cross-origin Resource Sharing, shows you how to enable cross-origin resource sharing (CORS) and allow cross-origin access to S3 resources to interact with resources in a different domain for client web applications.

Chapter 10, Managing Object Lifecycle to Lower the Cost, talks about configuring lifecycle policies on S3 buckets to automatically delete objects after a certain time, using Reduced Redundancy Storage (RRS) or archiving objects into Amazon Glacier.

Chapter 11, S3 Performance Optimization, deals with improving the performance of uploading, downloading, and getting and listing objects.

Chapter 12, Creating Triggers and Notifying S3 Events to Lambda, covers sending notifications to let AWS Lambda execute Lambda functions by enabling S3 event notifications.

So, several specific use cases and lots of examples, but maybe it would have been nice to start with a project and implement it step by step using all those tools.

Amazon Verified review

Jascha Casadio, Oct 07, 2015 (rated 2 out of 5)

Amazon S3 is one of the storage solutions offered by Amazon, as well as one of the first services of the ecosystem to be made publicly available. Despite being almost 10 years old, S3 keeps getting enriched with new features and offers scalability, high availability, and low latency at commodity costs. This makes it the obvious storage choice of everyone working with the Amazon Web Services. Still, despite this and the fact that it's been there since 2006, the books dedicated to it are not many. In fact, I don't remember a single title focusing on S3, although, being one of the most basic services among those offered, any book introducing the readers to the ecosystem somehow touches it. On the other hand, it is also true that this lack of titles entirely dedicated to S3 has been compensated by the extensive, and free!, official documentation, which is easy to follow, full of examples and constantly updated. This makes it very hard for any author interested in writing a book about S3, since adding value to what Amazon already provides is not an easy task. Still, not impossible. Amazon S3 Cookbook, a recently released title, is probably the first book entirely centered on S3.

I have reread the whole S3 developers' guide recently. It's good to keep up to date with the latest whistles and bells offered by the service. Then, when I heard that this title was released at the end of August of 2015, I was very excited to get a copy and get through those almost 300 pages. Having the official documentation so fresh in my mind would have allowed me to better evaluate the value of this text. There are indeed things that the developers' guide does not cover. There were questions that I wanted answered. And I wanted this book to provide those answers. Among them: how to organize the files of the many different projects a DevOps manages into S3, through delimiters and prefixes; how to switch from a standard VCS, such as Git, to S3, allowing instances to push, pull, and merge without problems; how to allow external users to access specific resources, limiting their permissions both in space and time. Apart from these questions, there were other generic expectations. Among them, for example, a good coverage of the CLI. There are many bindings specific to different programming languages: Ruby, Java, PHP, Python. What makes everyone happy is the command line. I do honestly expect each book that doesn't explicitly claim to cover a specific language to present the examples through the CLI. Is that asking too much?

Before getting into the content of the book, a first quick (negative) note: the book is called a cookbook, but it is not. Cookbooks are titles that present recipes that get the reader solving a problem through the winning step-by-step approach. Amazon S3 Cookbook does not provide this typical question-answer scenario.

But let's dive into the book itself! The very first thing that caught my eye is something that, unfortunately, is not new to Packt Publishing titles: faulty proofreading. Some of the commands of the examples used to present the service have, indeed, typos. Nothing fancy that the expert eye won't catch. Still, this suggests a quick and cheap proofreading, which lowers the overall value of the book.

Typos apart, the book feels like it is more about CloudFormation than about S3. True, CloudFormation can sit on top of S3 and other services, such as EC2 and Auto Scaling. Still, this book should be about S3, not CloudFormation. Chapters 4 and 5, for example, should probably be part of a different title.

Another thing that I did not like is the waste of pages dedicated to doing stuff through the console rather than the CLI. In chapter 6, for example, the author takes six pages showing how to set up an IAM user that has administrator's permissions through the console. Six pages full of colorful screenshots that take the reader hand by hand through such a delicate and hard process. Very little space is given to doing the same with the command line, which is probably the way DevOps will interact with AWS. DevOps love the command line. And coffee.

Another thing that leaves a bad taste in the mouth is that concepts are not clearly explained. Let's stay on chapter 6. Chapter 6, as stated, is centered on bucket policies and IAM. It presents different interesting scenarios of limiting access to resources, such as granting cross-account bucket permissions. The topics covered are real-world and interesting indeed, but far from being well explained. Let's see why: "First, we create a bucket and attach a bucket policy to the bucket in Account A, and then create an IAM user in Account A and one in Account B. Lastly attach a bucket policy to the bucket owed by Account A". Attaching a bucket policy was the first thing we did, wasn't it?

I must admit that when I finished reading this book I was disappointed. It's not that most of my questions were not answered, but rather that the concepts were not clearly covered. Chapter 7, for example, which is about temporary permissions and thus among those that I was very interested in, got me more puzzled than enlightened. Overall, I don't feel like the book gives more than the official documentation already does.

As usual, you can find more reviews on my personal blog: http://books.lostinmalloc.com. Feel free to pass by and share your thoughts!

Amazon Verified review

FAQs

What is the delivery time and cost of a print book?

Shipping Details

USA:


Economy: Delivery to most addresses in the US within 10-15 business days

Premium: Trackable Delivery to most addresses in the US within 3-8 business days

UK:

Economy: Delivery to most addresses in the U.K. within 7-9 business days.
Shipments are not trackable

Premium: Trackable delivery to most addresses in the U.K. within 3-4 business days!
Add one extra business day for deliveries to Northern Ireland and Scottish Highlands and islands

EU:

Premium: Trackable delivery to most EU destinations within 4-9 business days.

Australia:

Economy: Can deliver to P. O. Boxes and private residences.
Trackable service with delivery to addresses in Australia only.
Delivery time ranges from 7-9 business days for VIC and 8-10 business days for Interstate metro
Delivery time is up to 15 business days for remote areas of WA, NT & QLD.

Premium: Delivery to addresses in Australia only
Trackable delivery to most P. O. Boxes and private residences in Australia within 4-5 days based on the distance to a destination following dispatch.

India:

Premium: Delivery to most Indian addresses within 5-6 business days

Rest of the World:

Premium: Countries in the American continent: Trackable delivery to most countries within 4-7 business days

Asia:

Premium: Delivery to most Asian addresses within 5-9 business days

Disclaimer:
All orders received before 5 PM U.K. time will start printing from the next business day, so the estimated delivery times start from the next day as well. Orders received after 5 PM U.K. time (in our internal systems) on a business day, or at any time on the weekend, will begin printing on the second business day after the order. For example, an order placed at 11 AM today will begin printing tomorrow, whereas an order placed at 9 PM tonight will begin printing the day after tomorrow.


Unfortunately, due to several restrictions, we are unable to ship to the following countries:

  1. Afghanistan
  2. American Samoa
  3. Belarus
  4. Brunei Darussalam
  5. Central African Republic
  6. The Democratic Republic of Congo
  7. Eritrea
  8. Guinea-Bissau
  9. Iran
  10. Lebanon
  11. Libyan Arab Jamahiriya
  12. Somalia
  13. Sudan
  14. Russian Federation
  15. Syrian Arab Republic
  16. Ukraine
  17. Venezuela
What is a customs duty/charge?

Customs duties are charges levied on goods when they cross international borders. They are a tax imposed on imported goods, charged by special authorities and bodies created by local governments, and are meant to protect local industries, economies, and businesses.

Do I have to pay customs charges for the print book order?

Orders shipped to the countries listed under EU27 will not bear customs charges; these are paid by Packt as part of the order.

List of EU27 countries: www.gov.uk/eu-eea:

A customs duty or localized taxes may be applicable on shipments to recipient countries outside of the EU27. These would be charged by the recipient country, should be paid by the customer, and are not included in the shipping charges on the order.

How do I know my customs duty charges?

The amount of duty payable varies greatly depending on the imported goods, the country of origin and several other factors like the total invoice amount or dimensions like weight, and other such criteria applicable in your country.

For example:

  • If you live in Mexico, and the declared value of your ordered items is over $ 50, for you to receive a package, you will have to pay additional import tax of 19% which will be $ 9.50 to the courier service.
  • Whereas if you live in Turkey, and the declared value of your ordered items is over € 22, for you to receive a package, you will have to pay additional import tax of 18% which will be € 3.96 to the courier service.
How can I cancel my order?

Cancellation Policy for Published Printed Books:

You can cancel any order within 1 hour of placing the order. Simply contact customercare@packt.com with your order details or payment transaction id. If your order has already started the shipment process, we will do our best to stop it. However, if it is already on the way to you then when you receive it, you can contact us at customercare@packt.com using the returns and refund process.

Please understand that Packt Publishing cannot provide refunds or cancel any order except for the cases described in our Return Policy (i.e., Packt Publishing agrees to replace your printed book if it arrives damaged or with a material defect); beyond these cases, Packt Publishing will not accept returns.

What is your returns and refunds policy?

Return Policy:

We want you to be happy with your purchase from Packtpub.com. We will not hassle you with returning print books to us. If the print book you receive from us is incorrect, damaged, doesn't work, or is unacceptably late, please contact our Customer Relations Team at customercare@packt.com with the order number and issue details as explained below:

  1. If you ordered (eBook, Video or Print Book) incorrectly or accidentally, please contact Customer Relations Team on customercare@packt.com within one hour of placing the order and we will replace/refund you the item cost.
  2. Sadly, if your eBook or Video file is faulty or a fault occurs during the eBook or Video being made available to you, i.e. during download then you should contact Customer Relations Team within 14 days of purchase on customercare@packt.com who will be able to resolve this issue for you.
  3. You will have a choice of replacement or refund of the problem items (damaged, defective, or incorrect).
  4. Once Customer Care Team confirms that you will be refunded, you should receive the refund within 10 to 12 working days.
  5. If you are only requesting a refund of one book from a multiple order, then we will refund you the appropriate single item.
  6. Where the items were shipped under a free shipping offer, there will be no shipping costs to refund.

On the off chance your printed book arrives damaged or with a material defect, contact our Customer Relations Team at customercare@packt.com within 14 days of receipt of the book with appropriate evidence of the damage, and we will work with you to secure a replacement copy, if necessary. Please note that each printed book you order from us is individually made by Packt's professional book-printing partner on a print-on-demand basis.

What tax is charged?

Currently, no tax is charged on the purchase of any print book (subject to change based on the laws and regulations). A localized VAT fee is charged only to our European and UK customers on eBooks, Video and subscriptions that they buy. GST is charged to Indian customers for eBooks and video purchases.

What payment methods can I use?

You can pay with the following card types:

  1. Visa Debit
  2. Visa Credit
  3. MasterCard
  4. PayPal