Adding commands that talk to HDFS for deployment in Karaf
Since HDFS is, at its core, a filesystem, let's see how we can access it with the standard tools and the bundle we've been building up so far.
We'll write a command that stores one level of configuration files from our running Karaf container into HDFS, and then a second command that reads those files back.
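To make that concrete, here is a minimal sketch of what such a store command might look like. This is an illustration rather than the book's actual sample code: the command scope and name (hdfs:store-config), the class name, the NameNode URI (hdfs://localhost:9000), and the target path /karaf/etc are all assumptions, and the annotations come from Karaf's modern shell API (org.apache.karaf.shell.api.action); older Karaf releases used the OsgiCommandSupport style instead. The Hadoop calls themselves are the standard FileSystem client API.

```java
import java.io.File;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.karaf.shell.api.action.Action;
import org.apache.karaf.shell.api.action.Command;
import org.apache.karaf.shell.api.action.lifecycle.Service;

// Hypothetical command: copies the files at the top level of etc/ into HDFS.
@Command(scope = "hdfs", name = "store-config", description = "Store Karaf etc/ files in HDFS")
@Service
public class StoreConfigCommand implements Action {

    @Override
    public Object execute() throws Exception {
        // karaf.etc points at the running container's configuration directory
        File etcDir = new File(System.getProperty("karaf.etc", "etc"));
        File[] files = etcDir.listFiles(File::isFile);
        if (files == null) {
            System.err.println("No configuration directory at " + etcDir);
            return null;
        }

        Configuration conf = new Configuration();
        // Assumed NameNode URI; adjust to match your cluster
        try (FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf)) {
            for (File file : files) {
                Path target = new Path("/karaf/etc/" + file.getName());
                fs.copyFromLocalFile(new Path(file.getAbsolutePath()), target);
                System.out.println("Stored " + target);
            }
        }
        return null;
    }
}
```

The companion read command would be the mirror image, using fs.copyToLocalFile (or fs.open) against the same paths.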
We've already learned how to build a Hadoop feature that takes care of the various dependencies needed to talk to HDFS, and we jumped ahead a little to discuss classloading and a few tricks for getting the deployed Hadoop libraries to cooperate (one of those tricks is sketched below). We are now at a point where we can start writing code against Hadoop using the libraries provided.
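The classloading trick worth spelling out is the thread context classloader (TCCL) swap. Hadoop's client discovers FileSystem implementations reflectively through the TCCL, which inside an OSGi container rarely points at the bundle that actually contains the Hadoop classes. A common workaround, shown below as a sketch rather than necessarily the exact fix from the earlier recipe, is to set the TCCL to your own bundle's classloader around the Hadoop calls and restore it afterwards:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public final class HdfsClassLoading {

    // Hadoop resolves FileSystem implementations via the thread context
    // classloader, which in OSGi usually belongs to some other bundle.
    // Swap in our own bundle's loader for the duration of the call.
    public static FileSystem openFileSystem(URI uri) throws Exception {
        ClassLoader original = Thread.currentThread().getContextClassLoader();
        try {
            Thread.currentThread().setContextClassLoader(HdfsClassLoading.class.getClassLoader());
            return FileSystem.get(uri, new Configuration());
        } finally {
            // Always restore the previous loader so other code is unaffected.
            Thread.currentThread().setContextClassLoader(original);
        }
    }
}
```

Restoring the original loader in a finally block matters: Karaf shell threads are reused, so a leaked classloader would affect whatever command runs next.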
Getting ready
The ingredients of this recipe include the Apache Karaf distribution kit, access to a JDK, and Internet connectivity. The sample code for this recipe is available at https://github.com/jgoodyear/ApacheKarafCookbook/tree/master/chapter9/chapter-9-recipe1. Remember, you need both...