Real-Time Big Data Analytics

You're reading from Real-Time Big Data Analytics: Design, process, and analyze large sets of complex data in real time

Product type: Paperback
Published: Feb 2016
Publisher: Packt
ISBN-13: 9781784391409
Length: 326 pages
Edition: 1st Edition
Author: Shilpi Saxena
Table of Contents (12 chapters)

Preface
1. Introducing the Big Data Technology Landscape and Analytics Platform
2. Getting Acquainted with Storm
3. Processing Data with Storm
4. Introduction to Trident and Optimizing Storm Performance
5. Getting Acquainted with Kinesis
6. Getting Acquainted with Spark
7. Programming with RDDs
8. SQL Query Engine for Spark – Spark SQL
9. Analysis of Streaming Data Using Spark Streaming
10. Introducing Lambda Architecture
Index

Working with Parquet


In this section, we will discuss the various operations provided by Spark SQL for working with the Parquet data format, with appropriate examples.

Parquet is one of the most popular columnar storage formats for structured data. Parquet leverages the record shredding and assembly algorithm (http://tinyurl.com/p8kaawg) described in the Dremel paper (http://research.google.com/pubs/pub36632.html). Parquet supports efficient compression and encoding schemes, which perform better than a simple flattening of structured tables. Refer to https://parquet.apache.org/ for more information on the Parquet data format.

The DataFrame API of Spark SQL provides convenient operations for writing and reading data in the Parquet format. We can register Parquet data as temporary tables within Spark SQL and then perform all the other operations provided by the DataFrame API for data manipulation or analysis.

Let's see an example of writing and reading the Parquet data format, and then we will...
