The Machine Learning Solutions Architect Handbook: Create machine learning platforms to run solutions in an enterprise setting

Product type: Paperback
Published: Jan 2022
Publisher: Packt
ISBN-13: 9781801072168
Length: 442 pages
Edition: 1st Edition
Author: David Ping
Table of Contents (17 chapters)

Preface
Section 1: Solving Business Challenges with Machine Learning Solution Architecture
Chapter 1: Machine Learning and Machine Learning Solutions Architecture
Chapter 2: Business Use Cases for Machine Learning
Section 2: The Science, Tools, and Infrastructure Platform for Machine Learning
Chapter 3: Machine Learning Algorithms
Chapter 4: Data Management for Machine Learning
Chapter 5: Open Source Machine Learning Libraries
Chapter 6: Kubernetes Container Orchestration Infrastructure Management
Section 3: Technical Architecture Design and Regulatory Considerations for Enterprise ML Platforms
Chapter 7: Open Source Machine Learning Platforms
Chapter 8: Building a Data Science Environment Using AWS ML Services
Chapter 9: Building an Enterprise ML Architecture with AWS ML Services
Chapter 10: Advanced ML Engineering
Chapter 11: ML Governance, Bias, Explainability, and Privacy
Chapter 12: Building ML Solutions with AWS AI Services
Other Books You May Enjoy

Understanding the Apache Spark ML machine learning library

Apache Spark is a distributed framework for large-scale data processing. It allows Spark-based applications to load and process data in memory across a cluster of machines, which speeds up processing time.
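As a minimal PySpark sketch of this in-memory, distributed processing model (assuming PySpark is installed and run in local mode; the file name and column name here are hypothetical):

```python
from pyspark.sql import SparkSession

# Start a local SparkSession; local[*] uses all available cores.
spark = (
    SparkSession.builder
    .appName("InMemoryProcessingSketch")
    .master("local[*]")
    .getOrCreate()
)

# Load a (hypothetical) CSV file into a distributed DataFrame.
df = spark.read.csv("sales.csv", header=True, inferSchema=True)

# Cache the DataFrame in memory so repeated computations
# avoid re-reading the data from disk.
df.cache()

# A simple aggregation executed in parallel across the partitions.
df.groupBy("region").count().show()

spark.stop()
```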

A Spark cluster consists of a master node and worker nodes for running different Spark applications. Each application that runs in a Spark cluster has a driver program and its own set of processes, which are coordinated by the SparkSession object in the driver program. The SparkSession object in the driver program connects to a cluster manager (for example, Mesos, YARN, Kubernetes, or Spark's standalone cluster manager), which is responsible for allocating resources in the cluster for the Spark application. Specifically, the cluster manager acquires resources on worker nodes called executors to run computations and store data for the Spark application. Executors are configured with resources...
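The following sketch shows how an application's driver can be pointed at a cluster manager and how executor resources are requested through configuration (assumptions: a standalone Spark master reachable at spark://master-host:7077; the executor counts and sizes are illustrative only):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("ClusterResourceSketch")
    # The master URL tells the driver which cluster manager to connect to,
    # e.g. a standalone master, "yarn", or "k8s://https://<api-server>".
    .master("spark://master-host:7077")
    # The cluster manager uses these settings when acquiring executors
    # on the worker nodes for this application.
    .config("spark.executor.instances", "4")
    .config("spark.executor.cores", "2")
    .config("spark.executor.memory", "4g")
    .getOrCreate()
)

# Work on distributed datasets is split into tasks that the driver
# schedules onto the acquired executors.
print(spark.sparkContext.defaultParallelism)

spark.stop()
```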
