Logfile analysis
The results of this data-processing job will be displayed in a web application that presents, on an interactive map, the geographic locations that users log in from. The web application will also allow filtering the data by the device used.
Our job is to analyze 10 million rows of logs and generate such a report as a JSON file that can drive the web application. Because of the nature of the web application, that file should not exceed a few hundred kilobytes. The challenge is how to manage the data so that this report can be constructed efficiently.
It is all about the data, and we will be using Scalding to start exploring. Around 10 million rows of data exist in tab-separated files in a Hadoop cluster, in the location hdfs:///log-files/YYYY/MM/DD.
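To begin exploring, a first Scalding job can simply read the raw log lines and write out a small sample for inspection. The sketch below is illustrative only: the path keeps the YYYY/MM/DD placeholder as given above, and the job name and the --output argument are hypothetical.

import com.twitter.scalding._

// Exploratory job: read the raw log lines and keep a handful to inspect.
// The input path keeps the YYYY/MM/DD placeholder; the --output argument
// is a hypothetical destination for the sample.
class LogExplorationJob(args: Args) extends Job(args) {

  TextLine("hdfs:///log-files/YYYY/MM/DD")
    .read
    .limit(10)                    // keep only a few lines to look at
    .write(Tsv(args("output")))
}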
The TSV files contain nine columns of data. We discover that the activity column contains values such as login, readArticle, and streamVideo, and we are interested only in the login events. Also, if we...
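As a rough sketch of that filtering step, the following job reads the nine-column TSV files and keeps only the login events. Only the activity column is known from the data; the other field names in the schema, the job name, and the --output argument are assumptions made for illustration.

import com.twitter.scalding._

// Hypothetical schema for the nine TSV columns; only 'activity is known
// from the data exploration above, the remaining names are placeholders.
class LoginEventsJob(args: Args) extends Job(args) {

  val logSchema = ('datetime, 'user, 'activity, 'device,
                   'location, 'f6, 'f7, 'f8, 'f9)

  Tsv("hdfs:///log-files/YYYY/MM/DD", logSchema)
    .read
    .filter('activity) { activity: String => activity == "login" } // keep only login events
    .write(Tsv(args("output")))    // hypothetical --output argument
}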