Migrating data from Google BigQuery to Azure Synapse
In this recipe, we will import a public dataset, github_repos.files, from Google BigQuery into Azure Synapse (formerly Azure SQL Data Warehouse). We will create a SQL pool, create a table to store the imported data, and configure a pipeline to migrate the data from the public dataset hosted on Google BigQuery.
Getting ready
To complete this recipe, you will need a Google Cloud project with the BigQuery API enabled. Refer to the Getting ready section in the previous recipe for instructions on how to set these up and obtain values for the Project ID, Client ID, Client Secret, and Refresh Token fields.
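Before configuring the pipeline, it is worth confirming that your project can actually read the public dataset. A minimal check, run as standard SQL in the BigQuery console (the table and column names below assume the bigquery-public-data.github_repos.files schema):

-- Preview a few rows to confirm the project can read the public dataset.
-- Assumes the bigquery-public-data.github_repos.files table and columns.
SELECT repo_name, ref, path
FROM `bigquery-public-data.github_repos.files`
LIMIT 10;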
You will also need an instance of an Azure Synapse SQL pool to import the data into. Refer to the chapter on Azure Synapse for instructions on how to create and configure a SQL pool. Have the login credentials for this SQL pool to hand.
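Once connected with those credentials (for example, from SQL Server Management Studio or Azure Data Studio), a quick sanity check is to query the version string; on a dedicated SQL pool it typically reports Microsoft Azure SQL Data Warehouse:

-- Confirm you are connected to the dedicated SQL pool.
-- The version string typically contains "Microsoft Azure SQL Data Warehouse".
SELECT @@VERSION AS pool_version;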
You will also need to create a table in your database to store the data we import. Download the script to create...
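The downloaded script is the authoritative definition, but as a rough sketch, a staging table for this dataset might look like the following T-SQL (the column names and types assume the github_repos.files schema; the table name is illustrative):

-- Illustrative sketch only; use the downloaded script for the real definition.
-- Columns mirror the assumed schema of github_repos.files.
CREATE TABLE dbo.GithubFiles
(
    repo_name      NVARCHAR(400)  NULL,
    ref            NVARCHAR(100)  NULL,
    path           NVARCHAR(2000) NULL,
    mode           INT            NULL,
    id             NVARCHAR(64)   NULL,
    symlink_target NVARCHAR(2000) NULL
)
WITH
(
    DISTRIBUTION = ROUND_ROBIN, -- simple default for staging loads
    HEAP                        -- skip columnstore overhead for an initial load
);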