Copying data from Google BigQuery to Azure Data Lake Store
In this recipe, we will use Azure Data Factory to import a subset of the public fdic_banks.locations
dataset from the Google BigQuery service (a cloud data warehouse) into Azure Data Lake Storage. We will write the data to the destination storage in Parquet format, a compact columnar format that is convenient for downstream processing.
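Before configuring the pipeline, it can be useful to preview the source table and decide which subset to copy. The following is a minimal sketch using the google-cloud-bigquery Python client; it assumes the package is installed, that Google Cloud credentials (for example, Application Default Credentials) are available, that the table lives under the bigquery-public-data project as BigQuery public datasets typically do, and that my-gcp-project is a placeholder for your own project ID:

```python
from google.cloud import bigquery

# Placeholder project ID; replace with your own Google Cloud project.
client = bigquery.Client(project="my-gcp-project")

# Preview a small subset of the public FDIC bank locations table.
query = """
    SELECT *
    FROM `bigquery-public-data.fdic_banks.locations`
    LIMIT 10
"""

for row in client.query(query).result():
    print(dict(row))  # each row behaves like a mapping of column name to value
```

A similar SELECT statement (without the LIMIT clause, or with a WHERE filter) can later serve as the source query of the copy, rather than copying the whole table.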
Getting ready
For this recipe, we assume that you have a Google Cloud account and a project, as well as an Azure account and a Data Lake storage account (ADLS Gen2). The following is a list of additional preparatory work:
- You need to enable the BigQuery API for your Google Cloud project. You can enable this API here: https://console.developers.google.com/apis/api/bigquery.googleapis.com/overview.
- You will need values for the Project ID, Client ID, Client Secret, and Refresh Token fields for the BigQuery API app. If you are not familiar with how to set up a Google Cloud app and obtain these tokens, you can find detailed instructions...; a minimal sketch of obtaining a refresh token is also shown after this list.
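If you already have an OAuth client (Client ID and Client Secret) for your project, one way to obtain the refresh token value is to run the standard installed-app OAuth flow once. The following is a minimal sketch using the google-auth-oauthlib package; client_secret.json is a placeholder name for the OAuth client file downloaded from the Google Cloud console:

```python
from google_auth_oauthlib.flow import InstalledAppFlow

# Read-only access to BigQuery is enough for a copy source.
SCOPES = ["https://www.googleapis.com/auth/bigquery.readonly"]

# client_secret.json is the OAuth client file downloaded from the
# Google Cloud console (placeholder name).
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", scopes=SCOPES)

# Opens a browser window for consent and completes the code exchange locally;
# access_type="offline" asks Google to issue a refresh token.
credentials = flow.run_local_server(port=0, access_type="offline", prompt="consent")

print("Refresh token:", credentials.refresh_token)
```

Copy the printed refresh token, together with the Project ID, Client ID, and Client Secret, into the corresponding fields when you create the Google BigQuery linked service in Azure Data Factory.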