Hadoop Real-World Solutions Cookbook - Second Edition

Over 90 hands-on recipes to help you learn and master the intricacies of Apache Hadoop 2.X, YARN, Hive, Pig, Oozie, Flume, Sqoop, Apache Spark, and Mahout

Product type: Paperback
Published: March 2016
Publisher: Packt
ISBN-13: 9781784395506
Length: 290 pages
Edition: 2nd Edition
Author: Tanmay Deshpande

Table of Contents (12 Chapters)
Preface
1. Getting Started with Hadoop 2.X
2. Exploring HDFS
3. Mastering Map Reduce Programs
4. Data Analysis Using Hive, Pig, and HBase
5. Advanced Data Analysis Using Hive
6. Data Import/Export Using Sqoop and Flume
7. Automation of Hadoop Tasks Using Oozie
8. Machine Learning and Predictive Analytics Using Mahout and R
9. Integration with Apache Spark
10. Hadoop Use Cases
Index

Importing data into Hive tables using Sqoop


So far, we have seen how to import data into HDFS directories. Now it's time to understand how to import data directly into a Hive table.

Getting ready

To perform this recipe, you should have a running Hadoop cluster with the latest version of Sqoop installed on it. Here, I am using Sqoop 1.4.6. We also need a MySQL database to be available on the network. Installing Sqoop is easy: download the Sqoop tarball and add it to the system path. As we are going to import data from MySQL, we also need the MySQL connector. Based on your MySQL version, download the right connector JAR and copy it into the lib directory of the Sqoop installation.
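As a rough sketch of this setup (the archive name, connector version, and paths below are assumptions for illustration, not taken from the book), it usually amounts to unpacking Sqoop, putting it on the path, and dropping the connector JAR into its lib directory:

    # Unpack Sqoop and add it to the system path (names and paths are illustrative)
    tar -xzf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C /usr/local
    export SQOOP_HOME=/usr/local/sqoop-1.4.6.bin__hadoop-2.0.4-alpha
    export PATH=$PATH:$SQOOP_HOME/bin

    # Copy the MySQL JDBC connector JAR into Sqoop's lib directory
    cp mysql-connector-java-5.1.38-bin.jar $SQOOP_HOME/lib/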

How to do it...

Sqoop provides us with the facility to import data directly into a Hive table. This saves us the time of creating a Hive table, specifying a matching schema, loading the data into HDFS, and then creating an external Hive table on top of it. We can do this in a couple of commands:

  1. The following first command...
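The book's actual commands are not included in this excerpt, but a minimal sketch of a direct Hive import looks something like the following (the database, table, and credentials are hypothetical). The --hive-import flag tells Sqoop to create the Hive table and load the imported data in a single step:

    # Import a hypothetical MySQL table "customers" straight into a Hive table
    sqoop import \
      --connect jdbc:mysql://localhost:3306/test \
      --username root \
      -P \
      --table customers \
      --hive-import \
      --hive-table customers \
      -m 1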

The rest of the chapter is locked.