Writing to Azure SQL Server
So far, we have read data from the internet and stored it in a Delta Lake table. Delta tables are fine as a consumption layer, but they have some caveats:
- We need a cluster running to query the data, which can be costly.
- Queries take longer to return than from a regular database because the cluster processes them in a distributed fashion, driver to workers, and that overhead is especially noticeable for small volumes.
- There's no native row-level security or dynamic data masking, features we have in SQL Server.
- There are no schemas, only databases and tables. This might be an issue for certain applications; for example, a three-part SQL Server name such as AdventureWorksLT.SalesLT.Customer has no direct equivalent, since Databricks uses two-part database.table names like ETLInAzure.StateIncome.
Our ETLInAzure.StateIncome table holds only 52 rows. We have used Databricks to import the data from the internet and store it in Delta Lake. To use it in our application database, AdventureWorksLT, we will copy the transformed data back to SQL Server.
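As a rough sketch of what that copy can look like from a Databricks notebook, the following PySpark cell reads the Delta table and writes it to Azure SQL over JDBC. The server name, secret scope, and destination table (SalesLT.StateIncome) are placeholders for this sketch, not names taken from the recipe:

```python
# In a Databricks notebook, `spark` and `dbutils` are provided by the runtime.

# Read the Delta table we populated earlier.
state_income_df = spark.table("ETLInAzure.StateIncome")

# JDBC URL for the Azure SQL database; <yourserver> is a placeholder
# for your own logical server name.
jdbc_url = (
    "jdbc:sqlserver://<yourserver>.database.windows.net:1433;"
    "database=AdventureWorksLT;encrypt=true"
)

# Credentials pulled from a secret scope; the scope and key names
# ("etl", "sql-user", "sql-password") are assumptions for this sketch.
connection_properties = {
    "user": dbutils.secrets.get(scope="etl", key="sql-user"),
    "password": dbutils.secrets.get(scope="etl", key="sql-password"),
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# Write the 52 transformed rows to SQL Server, replacing the target table.
(
    state_income_df.write
    .mode("overwrite")
    .jdbc(jdbc_url, "SalesLT.StateIncome", properties=connection_properties)
)
```

With only 52 rows, a plain JDBC write like this is perfectly adequate; for larger volumes you would typically batch the write or use Microsoft's dedicated Apache Spark connector for SQL Server and Azure SQL instead.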
In this recipe, we will save our ETLInAzure.StateIncome table created in the previous...