Programming MapReduce with Scalding

You're reading from Programming MapReduce with Scalding: A practical guide to designing, testing, and implementing complex MapReduce applications in Scala

Product type: Paperback
Published: June 2014
ISBN-13: 9781783287017
Length: 148 pages
Edition: 1st Edition
Author: Antonios Chalkiopoulos
Table of Contents (11 chapters)

Preface
1. Introduction to MapReduce
2. Get Ready for Scalding
3. Scalding by Example
4. Intermediate Examples
5. Scalding Design Patterns
6. Testing and TDD
7. Running Scalding in Production
8. Using External Data Stores
9. Matrix Calculations and Machine Learning
Index

The external operations pattern


To achieve modularity and fulfill the single responsibility principle, we can structure our data processing job in an organized way. An object, a trait, and a job can share the responsibilities as follows:

  • In a package object, we can store information about the schema of the data

  • In a trait, we can store all the external operations

  • In a Scalding job, we can manage arguments, define taps, and use the external operations to construct data processing pipelines

A particular dataset will usually be processed by multiple jobs, each extracting different value from the data. Thus, we can create an object called LogsSchemas to store input and output schemas, and also to document the HDFS locations where the data resides. This object acts as a registry of all the variations of our datasets, and we can reuse it in any of our Scalding jobs, as shown in the following code:

package object LogsSchemas {
  // that is, hdfs:///logs/raw/YYYY/MM/DD/
  val LOG_SCHEMA = List...
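To make the overall shape of the pattern concrete, the following is a minimal, dependency-free sketch of how the three pieces fit together. It is an illustration, not the book's actual code: the names ExternalOperations, LogProcessingJob, activityPerUser, and the schema fields are hypothetical, and plain Scala collections stand in for Scalding pipes so the snippet runs without the Scalding library.

```scala
// Schema registry: field names and dataset locations live in one object,
// so every job shares a single definition (illustrative values).
object LogsSchemas {
  val LOG_SCHEMA = List("datetime", "user", "activity", "data")
  val LOG_PATH   = "hdfs:///logs/raw/" // e.g. hdfs:///logs/raw/YYYY/MM/DD/
}

// External operations: reusable transformations kept out of the job class.
// In real Scalding this method would take and return a Pipe; here a List of
// row maps stands in for a pipe.
trait ExternalOperations {
  // Count the number of rows per user (a stand-in for groupBy { _.size })
  def activityPerUser(rows: List[Map[String, String]]): Map[String, Int] =
    rows.groupBy(_("user")).map { case (user, rs) => user -> rs.size }
}

// The job itself only wires arguments, taps, and external operations together
class LogProcessingJob extends ExternalOperations {
  def run(input: List[Map[String, String]]): Map[String, Int] =
    activityPerUser(input)
}
```

Because the transformation lives in the trait rather than the job, the same activityPerUser logic can be mixed into any other job, and it can be unit-tested without constructing taps at all.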