Learning Apache Apex
Real-time streaming applications with Apex

Product type: Paperback
Published: November 2017
Publisher: Packt Publishing
ISBN-13: 9781788296403
Length: 290 pages
Edition: 1st Edition
Authors (5):
Munagala V. Ramanath
David Yan
Ananth Gundabattula
Thomas Weise
Kenneth Knowles
Table of Contents (11 chapters)

Preface
1. Introduction to Apex
2. Getting Started with Application Development
3. The Apex Library
4. Scalability, Low Latency, and Performance
5. Fault Tolerance and Reliability
6. Example Project – Real-Time Aggregation and Visualization
7. Example Project – Real-Time Ride Service Data Processing
8. Example Project – ETL Using SQL
9. Introduction to Apache Beam
10. The Future of Stream Processing

Running the application on GCP Dataproc


This section provides a tutorial on running the Apex application on a real Hadoop cluster in the cloud. Dataproc (https://cloud.google.com/dataproc/) is one of several managed Hadoop options; Amazon EMR is another, and the instructions here can easily be adapted to EMR as well.

The general instructions for working on a cluster were already covered in Chapter 2, Getting Started with Application Development, where a Docker container was used. This section focuses on what is different when adding Apex to an existing multi-node cluster.

To start, we head over to the GCP console (https://console.cloud.google.com/dataproc/clusters) to create a new cluster.

For better illustration we will use the UI, but these steps can also be fully automated using the REST API or the command line (a command-line sketch follows the steps below):

  1. The first step is to decide what size of cluster and what type of machines we want. For this example, 3 worker nodes of a small machine type will suffice (for...
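As noted above, the same cluster creation can be scripted from the command line instead of the UI. The following is a minimal sketch using the gcloud CLI; the cluster name, region, and machine types are illustrative placeholders rather than values prescribed by the book:

    # Sketch only: create a small Dataproc cluster with 3 worker nodes.
    # Cluster name, region, and machine types below are illustrative placeholders.
    gcloud dataproc clusters create apex-tutorial \
        --region us-central1 \
        --num-workers 3 \
        --master-machine-type n1-standard-2 \
        --worker-machine-type n1-standard-2

Once such a command (or the equivalent UI steps) completes, the cluster appears in the Dataproc clusters list in the GCP console.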