Hadoop architecture
Apache Hadoop is an open source, integrated framework for Big Data processing and management that can be deployed relatively easily on commodity hardware. Hadoop can also be described as an ecosystem of tools and methods for the distributed storage and analysis of massive amounts of structured and unstructured data. In this section, we present an array of tools, frameworks, and applications that form integral parts of the Hadoop ecosystem and serve a variety of data management and processing purposes.
Hadoop Distributed File System
As explained in Chapter 1, The Era of Big Data, the Hadoop Distributed File System (HDFS) derives from the original Google File System, presented in 2003 in a paper titled The Google File System, authored by Ghemawat, Gobioff, and Leung. The architecture and design of the current HDFS (based on the Apache Hadoop 2.7.2 release) are explained thoroughly in the HDFS Architecture Guide, available on the Apache website at http://hadoop...