Getting started

We will now describe the two environments we will use throughout the book. Cloudera's QuickStart virtual machine will be our reference system, on which we will show all examples; we will additionally demonstrate some examples on Amazon's EMR when there is particular value in running them on the on-demand service.

Although the examples and code provided are aimed at being as general-purpose and portable as possible, our reference setup, when talking about a local cluster, will be Cloudera running atop CentOS Linux.

For the most part, we will show examples that make use of, or are executed from, a terminal prompt. Although Hadoop's graphical interfaces have improved significantly over the years (for example, the excellent HUE and Cloudera Manager), when it comes to development, automation, and programmatic access to the system, the command line is still the most powerful tool for the job.

All examples and source code presented in this book can be downloaded from https://github.com/learninghadoop2/book-examples. In addition, we have a home page for the book where we will publish updates and related material at http://learninghadoop2.com.

Cloudera QuickStart VM

One of the advantages of Hadoop distributions is that they give access to easy-to-install, packaged software. Cloudera takes this one step further and provides a freely downloadable Virtual Machine instance of its latest distribution, known as the CDH QuickStart VM, deployed on top of CentOS Linux.

In the remaining parts of this book, we will use the CDH5.0.0 VM as the reference and baseline system to run examples and source code. Images of the VM are available for VMware (http://www.vmware.com/nl/products/player/), KVM (http://www.linux-kvm.org/page/Main_Page), and VirtualBox (https://www.virtualbox.org/) virtualization systems.

Amazon EMR

Before using Elastic MapReduce, we need to set up an AWS account and register it with the necessary services.

Creating an AWS account

Amazon has integrated its general accounts with AWS, which means that, if you already have an account for any of the Amazon retail websites, it is the only account you will need in order to use AWS services.

Note

Note that AWS services have a cost; you will need an active credit card associated with the account to which charges can be made.

If you require a new Amazon account, go to http://aws.amazon.com, select Create a new AWS account, and follow the prompts. Amazon has added a free tier for some services, so you might find that in the early days of testing and exploration, you are keeping many of your activities within the noncharged tier. The scope of the free tier has been expanding, so make sure you know what you will and won't be charged for.

Signing up for the necessary services

Once you have an Amazon account, you will need to register it for use with the required AWS services, that is, Simple Storage Service (S3), Elastic Compute Cloud (EC2), and Elastic MapReduce. There is no cost to simply sign up to any AWS service; the process just makes the service available to your account.

Go to the S3, EC2, and EMR pages linked from http://aws.amazon.com, click on the Sign up button on each page, and then follow the prompts.

Using Elastic MapReduce

Having created an account with AWS and registered all the required services, we can proceed to configure programmatic access to EMR.

Getting Hadoop up and running

Note

Caution! This costs real money!

Before going any further, it is critical to understand that use of AWS services will incur charges that will appear on the credit card associated with your Amazon account. Most of the charges are quite small and increase with the amount of infrastructure consumed; storing 10 GB of data in S3 costs 10 times more than 1 GB, and running 20 EC2 instances costs 20 times as much as a single one. There are tiered cost models, so the actual costs tend to have smaller marginal increases at higher levels. But you should read carefully through the pricing sections for each service before using any of them. Note also that currently data transfer out of AWS services, such as EC2 and S3, is chargeable, but data transfer between services is not. This means it is often most cost-effective to carefully design your use of AWS to keep data within AWS through as much of the data processing as possible. For information regarding AWS and EMR, consult http://aws.amazon.com/elasticmapreduce/#pricing.

How to use EMR

Amazon provides both web and command-line interfaces to EMR. Both interfaces are just a frontend to the very same system; a cluster created with the command-line interface can be inspected and managed with the web tools and vice versa.

For the most part, we will be using the command-line tools to create and manage clusters programmatically and will fall back on the web interface in cases where it makes sense to do so.

AWS credentials

Before using either programmatic or command-line tools, we need to look at how an account holder authenticates to AWS to make such requests.

Each AWS account has several identifiers, such as the following, that are used when accessing the various services:

  • Account ID: Each AWS account has a numeric ID.
  • Access key: The associated access key is used to identify the account making a request.
  • Secret access key: The partner to the access key is the secret access key. The access key is not a secret and may be exposed in service requests, but the secret access key is what you use to validate yourself as the account owner. Treat it like your credit card.
  • Key pairs: Key pairs are used to log in to EC2 hosts. It is possible either to generate public/private key pairs within EC2 or to import externally generated keys into the system.

User credentials and permissions are managed via a web service called Identity and Access Management (IAM), which you need to sign up for in order to obtain access and secret keys.

If this sounds confusing, it's because it is, at least at first. When using a tool to access an AWS service, there's usually a single, upfront step of adding the right credentials to a configuration file, after which everything just works. However, if you do decide to explore programmatic or command-line tools, it will be worth investing a little time in reading the documentation for each service to understand how its security works. More information on creating an AWS account and obtaining access credentials can be found at http://docs.aws.amazon.com/iam.

The AWS command-line interface

Each AWS service historically had its own set of command-line tools. Recently, though, Amazon has created a single, unified command-line tool that provides access to most services. The AWS CLI can be found at http://aws.amazon.com/cli.

It can be installed from a tarball or via the pip or easy_install package managers.

On the CDH QuickStart VM, we can install awscli using the following command:

$ pip install awscli
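
If the installation succeeded, the following command should print the tool's version string; the exact output depends on the release installed:

$ aws --version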

In order to access the API, we need to configure the software to authenticate to AWS using our access and secret keys.

This is also a good moment to set up an EC2 key pair by following the instructions provided at https://console.aws.amazon.com/ec2/home?region=us-east-1#c=EC2&s=KeyPairs.

Although a key pair is not strictly necessary to run an EMR cluster, it will give us the capability to remotely log in to the master node and gain low-level access to the cluster.
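
As an alternative to the console, once the CLI is configured (as described next), a key pair can also be created from the command line. The following is a minimal sketch; the key name learninghadoop2 is our own choice, and the master node's public DNS name is shown in the EMR console, or in the output of aws emr describe-cluster, once a cluster is running:

$ aws ec2 create-key-pair --key-name learninghadoop2 \
--query 'KeyMaterial' --output text > learninghadoop2.pem
$ chmod 600 learninghadoop2.pem
$ ssh -i learninghadoop2.pem hadoop@<master-public-dns>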

The following command will guide you through a series of configuration steps and store the resulting configuration in the ~/.aws/credentials file:

$ aws configure
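
Depending on the CLI version, the access and secret keys end up in the ~/.aws/credentials file, with the region and default output format stored in ~/.aws/config. The stored format looks something like the following, where the values are placeholders for your own keys:

[default]
aws_access_key_id = <your-access-key>
aws_secret_access_key = <your-secret-access-key>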

Once the CLI is configured, we can query AWS with aws <service> <arguments>. To create and query an S3 bucket, use something like the following commands. Note that S3 bucket names need to be globally unique across all AWS accounts, so most common names, such as s3://mybucket, will not be available:

$ aws s3 mb s3://learninghadoop2
$ aws s3 ls
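
As a quick sanity check of the new bucket, we can copy a local file into it and list its contents; the filename used here is arbitrary:

$ echo "Hello S3" > test.txt
$ aws s3 cp test.txt s3://learninghadoop2/test.txt
$ aws s3 ls s3://learninghadoop2/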

We can provision an EMR cluster with five m1.xlarge nodes using the following command:

$ aws emr create-cluster --name "EMR cluster" \
--ami-version 3.2.0 \
--instance-type m1.xlarge \
--instance-count 5 \
--log-uri s3://learninghadoop2/emr-logs

Here, --ami-version is the ID of an Amazon Machine Image template (http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/AMIs.html), and --log-uri instructs EMR to collect logs and store them in the learninghadoop2 S3 bucket.
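
The create-cluster command prints the ID of the new cluster (of the form j-XXXXXXXXXXXXX), which the other emr subcommands expect. We can then track provisioning with the following commands:

$ aws emr list-clusters --active
$ aws emr describe-cluster --cluster-id <cluster>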

Note

If you did not specify a default region when setting up the AWS CLI, you will have to add one to most EMR commands using the --region argument; for example, --region eu-west-1 selects the EU (Ireland) region. You can find details of all available AWS regions at http://docs.aws.amazon.com/general/latest/gr/rande.html.
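
Rather than repeating the argument on every invocation, a default region can be stored in the CLI configuration; for example, the following sets the EU (Ireland) region used above:

$ aws configure set region eu-west-1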

We can submit workflows by adding steps to a running cluster using the following command:

$ aws emr add-steps --cluster-id <cluster> --steps <steps> 
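
As an illustration of the step syntax, the following submits a custom JAR step; the JAR path and arguments here are hypothetical placeholders rather than code from this book:

$ aws emr add-steps --cluster-id <cluster> \
--steps Type=CUSTOM_JAR,Name=ExampleStep,ActionOnFailure=CONTINUE,Jar=s3://learninghadoop2/jobs/example.jar,Args=[arg1,arg2]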

To terminate the cluster, use the following command:

$ aws emr terminate-clusters --cluster-ids <cluster>

In later chapters, we will show you how to add steps to execute MapReduce jobs and Pig scripts.

More information on using the AWS CLI can be found at http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/emr-manage.html.
