ETL with Azure Cookbook

You're reading from ETL with Azure Cookbook: Practical recipes for building modern ETL solutions to load and transform data from any source

Product type: Paperback
Published in: Sep 2020
Publisher: Packt
ISBN-13: 9781800203310
Length: 446 pages
Edition: 1st Edition
Authors (3): Christian Cote, Matija Lah, Madina Saitakhmetova
Table of Contents (12 chapters)

Preface
Chapter 1: Getting Started with Azure and SSIS 2019
Chapter 2: Introducing ETL
Chapter 3: Creating and Using SQL Server 2019 Big Data Clusters
Chapter 4: Azure Data Integration
Chapter 5: Extending SSIS with Custom Tasks and Transformations
Chapter 6: Azure Data Factory
Chapter 7: Azure Databricks
Chapter 8: SSIS Migration Strategies
Chapter 9: Profiling Data in Azure
Chapter 10: Manage SSIS and Azure Data Factory with Biml
Other Books You May Enjoy

Writing in Azure SQL Server

So far, we have read data from the internet and stored it in a Delta Lake table. Delta tables are fine as a consumption layer, but they come with some caveats:

  • We need a running cluster to query the data, which can be costly at times.
  • Queries take longer to return than they would against a regular database, because the cluster uses a distributed process (driver to workers); this overhead is especially noticeable for small data volumes.
  • There is no native row-level security or dynamic data masking, as we have in SQL Server.
  • There are no schemas, only databases and tables. This might be an issue for certain applications.

Our ETLInAzure.StateIncome table holds only 52 rows. We have used Databricks to import the data from the internet and store it in Delta Lake. To use it in our application database, AdventureWorksLT, we will copy the transformed data back to SQL Server.
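The copy itself can be done straight from a Databricks notebook using Spark's generic JDBC writer. The following PySpark snippet is a minimal sketch, not the book's exact recipe: the server name, secret scope and keys, and the target table SalesLT.StateIncome are hypothetical placeholders, and it assumes a Databricks notebook where spark and dbutils are already available.

# A minimal sketch of copying the Delta table to Azure SQL over JDBC.
# <your-server>, the "etl-scope" secret scope, and SalesLT.StateIncome
# are hypothetical placeholders; adjust them to your environment.

jdbc_url = (
    "jdbc:sqlserver://<your-server>.database.windows.net:1433;"
    "database=AdventureWorksLT;"
    "encrypt=true;trustServerCertificate=false;loginTimeout=30"
)

# Read the transformed data from the Delta table we created earlier.
df = spark.table("ETLInAzure.StateIncome")

# Write it to Azure SQL; "overwrite" recreates the target table on each run.
(df.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "SalesLT.StateIncome")
    .option("user", dbutils.secrets.get("etl-scope", "sql-user"))
    .option("password", dbutils.secrets.get("etl-scope", "sql-password"))
    .mode("overwrite")
    .save())

With only 52 rows, the default single-partition JDBC write is more than adequate; for larger tables you would tune writer options such as batchsize, or consider the dedicated Apache Spark connector for SQL Server and Azure SQL.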

In this recipe, we will save our ETLInAzure.StateIncome table created in the previous...
