Modern Data Architectures with Python


DBX

DBX is a central tool for CI workloads when working with Databricks. You can use it to create a project template and to deploy and launch your workflows. Because DBX uses the Databricks APIs, it is able to work with Databricks workflows. A workflow is a grouping of dbt tasks, notebooks, or jobs that are meant to run together.

These are some of the most important files:

  • .dbx/project.json: Organized by environments; used to manage configuration across your project.
  • project_folder: Used to store your Python code that isn’t included in notebooks or tests.
  • conf/deployment.yml: A YAML-based configuration file that lets you define the details of your Databricks workflows. At the moment, you can define tasks for dbt, notebooks, and jobs (a sketch of this file follows the list).
  • notebooks: Used to hold Databricks notebooks.
  • tests: Should be used for integration and unit tests, with each in its own subfolder structure.
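
The chapter's own example file is not reproduced in this excerpt. As a rough sketch, assuming the dbx workflow format that mirrors the Databricks Jobs API, a conf/deployment.yml can look something like the following; every name, path, and cluster setting here is a placeholder rather than the book's example:

    # conf/deployment.yml -- a minimal sketch; all names, paths, and cluster
    # settings below are placeholders
    environments:
      default:
        workflows:
          - name: "sample-etl-workflow"
            job_clusters:
              - job_cluster_key: "default"
                new_cluster:
                  spark_version: "11.3.x-scala2.12"
                  node_type_id: "i3.xlarge"
                  num_workers: 1
            tasks:
              - task_key: "etl"
                job_cluster_key: "default"
                notebook_task:
                  notebook_path: "notebooks/sample_notebook"

Each environment name in this file is expected to line up with an environment entry in .dbx/project.json, and each workflow lists the clusters and tasks that dbx creates when it deploys.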

Important commands

To create your shell project (not required but useful), run the following command...
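
The specific command the chapter goes on to show falls in the locked portion and is not reproduced here. As a hedged sketch based on the dbx CLI itself, scaffolding a project and then deploying and launching its workflows generally looks like this (the workflow name is a placeholder):

    # Generate a shell project from dbx's built-in template
    # (interactive prompts fill in the project name and settings)
    dbx init

    # Deploy the workflows defined in conf/deployment.yml to the chosen environment
    dbx deploy

    # Launch a deployed workflow by name (placeholder workflow name)
    dbx launch sample-etl-workflow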
