Traditional MapReduce programming requires users to write map and reduce functions conforming to the specifications of the defined API. But what if you already have a processing function written and want to delegate the processing to it, while still using the MapReduce model over the Hadoop Distributed File System (HDFS)? The streaming and pipes features of Apache Hadoop make this possible.
Hadoop streaming allows users to code their logic in any programming language, such as C, C++, or Python, and provides a hook for that custom logic to integrate with the traditional MapReduce framework with few or no lines of Java code. The Hadoop streaming API lets users run any script or executable outside the traditional Java platform. This capability is similar to Unix's pipe function (https://en.wikipedia...
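To make the contract concrete, here is a minimal sketch (not from the source) of a streaming-style word-count mapper and reducer in Python. Hadoop streaming hands each task lines on standard input and reads tab-separated key/value lines from standard output; the `__main__` block below simulates the full `map → sort (shuffle) → reduce` pipeline locally, standing in for the cluster run.

```python
from itertools import groupby
from operator import itemgetter


def mapper(lines):
    """Map phase: emit one tab-separated (word, 1) line per word,
    mimicking what a streaming mapper would write to stdout."""
    for line in lines:
        for word in line.strip().split():
            yield f"{word}\t1"


def reducer(sorted_lines):
    """Reduce phase: sum the counts for each word. Input must be
    sorted by key, which Hadoop's shuffle phase guarantees."""
    pairs = (line.split("\t") for line in sorted_lines)
    for word, group in groupby(pairs, key=itemgetter(0)):
        yield f"{word}\t{sum(int(count) for _, count in group)}"


if __name__ == "__main__":
    # Local stand-in for: cat input | mapper.py | sort | reducer.py
    sample = ["the quick brown fox", "the lazy dog"]
    for out in reducer(sorted(mapper(sample))):
        print(out)
```

On a cluster, the same two scripts would be wired together with the streaming jar, along the lines of `hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py -input <in> -output <out>` (jar and script paths here are placeholders for your installation).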