Time for action – importing data from Hadoop into MySQL
Let's demonstrate this by importing data into a MySQL table from an HDFS file.
Create a tab-separated file named newemployees.tsv with the following entries:
Frances   Operations    34000   2012-03-01
Greg      Engineering   60000   2003-11-18
Harry     Intern        22000   2012-05-15
Iris      Executive     80000   2001-04-08
Jan       Support       28500   2009-03-30
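If you would rather build the file from the shell than in an editor, a small sketch like the following does the job (this assumes a bash-style shell; printf is used only to guarantee literal tab characters, since some editors silently convert tabs to spaces, which would break the export later):
$ {
    printf '%s\t%s\t%s\t%s\n' Frances Operations 34000 2012-03-01
    printf '%s\t%s\t%s\t%s\n' Greg Engineering 60000 2003-11-18
    printf '%s\t%s\t%s\t%s\n' Harry Intern 22000 2012-05-15
    printf '%s\t%s\t%s\t%s\n' Iris Executive 80000 2001-04-08
    printf '%s\t%s\t%s\t%s\n' Jan Support 28500 2009-03-30
  } > newemployees.tsv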
Create a new directory on HDFS and copy the file into it:
$ hadoop fs -mkdir edata
$ hadoop fs -put newemployees.tsv edata/newemployees.tsv
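As an optional sanity check (not part of the original steps), you can list the directory and dump the file back to confirm the upload landed intact before running the export:
$ hadoop fs -ls edata
$ hadoop fs -cat edata/newemployees.tsv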
Confirm the current number of records in the employee table:
$ echo "select count(*) from employees" | mysql –u hadoopuser –p hadooptest
You will receive the following response:
Enter password:
count(*)
5
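Sqoop's export mode writes into a table that must already exist, and its columns need to line up with the four tab-separated fields in the file (name, department, salary, start date). If you want to double-check the layout before exporting, an inspection such as the following works; the exact column names and types in your employees table may differ, so treat this only as a verification step:
$ echo "describe employees" | mysql -u hadoopuser -p hadooptest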
Run a Sqoop export:
$ sqoop export --connect jdbc:mysql://10.0.0.100/hadooptest --username hadoopuser -P --table employees --export-dir edata --input-fields-terminated-by '\t'
You will receive the following response:
12/05/27 07:52:22 INFO mapreduce.ExportJobBase...
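Once the job finishes, rerunning the earlier count query is a simple way to confirm the rows arrived; with the original five records plus the five rows in newemployees.tsv, the count should now report 10, assuming no records failed to export:
$ echo "select count(*) from employees" | mysql -u hadoopuser -p hadooptest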