Time for action – exporting data from MySQL to HDFS
We'll use a straightforward example here, where we just pull all the data from a single MySQL table and write it to a directory on HDFS.
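The example assumes an employees table already exists in the hadooptest database and that the hadoopuser account has permission to read it. If you want to follow along without an existing table, a minimal stand-in might look like the following (the column names here are illustrative assumptions, not taken from this example, and are chosen only to match the sample output shown later):

mysql> CREATE TABLE employees (first_name VARCHAR(20), dept VARCHAR(20));
mysql> INSERT INTO employees VALUES ('Alice', 'Engineering'), ('Bob', 'Sales');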
Run Sqoop to pull the data from MySQL onto HDFS (note that Sqoop calls this an import, since the data is flowing into Hadoop):
$ sqoop import --connect jdbc:mysql://10.0.0.100/hadooptest --username hadoopuser \
> --password password --table employees
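Be aware that supplying --password on the command line leaves the password visible in your shell history and in process listings. Sqoop can instead prompt for it interactively with the -P option; a variant of the same command (assuming the rest of the setup is unchanged) would be:

$ sqoop import --connect jdbc:mysql://10.0.0.100/hadooptest --username hadoopuser \
> -P --table employees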
Examine the output directory:
$ hadoop fs -ls employees
You will receive the following response:
Found 6 items
-rw-r--r--   3 hadoop supergroup          0 2012-05-21 04:10 /user/hadoop/employees/_SUCCESS
drwxr-xr-x   - hadoop supergroup          0 2012-05-21 04:10 /user/hadoop/employees/_logs
-rw-r--r--   3 …                            /user/hadoop/employees/part-m-00000
-rw-r--r--   3 …                            /user/hadoop/employees/part-m-00001
-rw-r--r--   3 …                            /user/hadoop/employees/part-m-00002
-rw-r--r--   3 …                            /user/hadoop/employees/part-m-00003
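The _SUCCESS marker and the _logs directory are standard MapReduce job artifacts; the four part-m-* files appear because Sqoop runs the import as a MapReduce job with four map tasks by default, each writing its own slice of the table. If you genuinely want a single output file, you can request a single mapper; a sketch, assuming the same connection details as before:

$ sqoop import --connect jdbc:mysql://10.0.0.100/hadooptest --username hadoopuser \
> --password password --table employees -m 1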
Display one of the result files:
$ hadoop fs -cat /user/hadoop/employees/part-m-00001
You will see the following output:
Bob,Sales...
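By default, Sqoop writes each table row as a line of comma-separated text, which is why the employee record appears in this form. If your data itself contains commas, you can choose a different field delimiter with the --fields-terminated-by option; a sketch reusing the connection details from above:

$ sqoop import --connect jdbc:mysql://10.0.0.100/hadooptest --username hadoopuser \
> --password password --table employees --fields-terminated-by '\t'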