November 22, 2022

How to Write Data to Snowflake From Azure Data Factory

By Ahmed Ansari

Like many other ETL tools on the market, Azure Data Factory allows you to bring data into your reporting environment. It can be a powerful tool for businesses that already have it configured in their environment and for organizations familiar with the Microsoft ecosystem.

In this blog, we will cover how to write data from Azure Data Factory to Snowflake, how to set up your Azure Blob Storage for staging, and how to create a table within the Snowflake Data Cloud so you can map your fields to it.

Before we begin, let's introduce a few terms that will be expanded on further: Azure Data Factory, Service Principal, and Azure Blob Storage.

What is a Service Principal?

A service principal is an identity created within Azure for accessing Azure resources such as applications, tools, and hosted services. To learn more about creating a service principal, please refer to the documentation created by Microsoft.

In essence, that documentation walks you through using the Azure portal to create an app registration, which gives you an application (client) ID, and to create a client secret; you will need both for a step later in this blog.
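
As a quick illustration of how the application ID and client secret get used, here is a minimal Python sketch that acquires a token for the Power BI REST API with the azure-identity package; the tenant ID, client ID, and secret values are placeholders you would replace with your own.

  from azure.identity import ClientSecretCredential

  # Placeholders: substitute the values from your own app registration.
  credential = ClientSecretCredential(
      tenant_id="<your-tenant-id>",
      client_id="<your-application-id>",
      client_secret="<your-client-secret>",
  )

  # Request a token scoped to the Power BI REST API.
  token = credential.get_token("https://analysis.windows.net/powerbi/api/.default")
  print(token.token[:20], "...")  # bearer token used in the Authorization header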

What is Azure Data Factory?

In this blog, Azure Data Factory serves as the tool for building an ETL process that calls the Power BI REST API, GETs the data, and stores it in a destination of your choice. We chose Snowflake, but you are welcome to copy the data to whichever destination you prefer.
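
For context, the kind of call the pipeline makes looks roughly like the Python sketch below against the Power BI Get Activity Events admin API; the bearer token and date range are placeholders, and the exact endpoint and parameters you use may differ for your scenario.

  import requests

  # Placeholder bearer token (for example, one acquired with the service principal above).
  headers = {"Authorization": "Bearer <access-token>"}

  # The admin Activity Events endpoint expects quoted ISO 8601 timestamps.
  url = (
      "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
      "?startDateTime='2022-11-20T00:00:00Z'&endDateTime='2022-11-20T23:59:59Z'"
  )

  response = requests.get(url, headers=headers, timeout=30)
  response.raise_for_status()

  # The response typically includes an "activityEventEntities" list plus continuation links.
  events = response.json().get("activityEventEntities", [])
  print(f"Retrieved {len(events)} activity events")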

What is Azure Blob Storage?

Azure Blob Storage allows you to store and access unstructured data at scale. For this blog, that means you will connect your Azure Blob Storage account as an interim staging store. The Copy activity you use within your Azure Data Factory pipeline will automatically manage the flow from staging to your sink for you.
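
If you want to sanity-check the staging container while a pipeline runs, a small sketch like the one below (using the azure-storage-blob package, with a placeholder connection string and container name) will list whatever the Copy activity has staged.

  from azure.storage.blob import ContainerClient

  # Placeholders: use your own storage connection string and staging container name.
  container = ContainerClient.from_connection_string(
      conn_str="<your-storage-connection-string>",
      container_name="adf-staging",
  )

  # List the files currently sitting in the interim staging store.
  for blob in container.list_blobs():
      print(blob.name, blob.size)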

Prerequisites

Before we begin talking about how to copy your data using Azure Data Factory, there are a few prerequisites you need to have set up to ensure you are able to configure this for your organization. You will need:

  1. An Azure Data Factory instance
  2. An Azure Blob Storage account to use as an interim staging store
  3. A Snowflake account hosted on Azure
  4. A service principal (application ID and client secret) with access to the Power BI REST API

How to Copy Data to Snowflake

To begin, we will go through some core steps that were covered in the Getting Activity Data from Power BI Service with REST APIs blog. All limitations around direct copy to Snowflake are listed in Microsoft's Copy and Transform Data in Snowflake documentation.

Step 1: Setting up Your Linked Services

To start configuring your ADF pipeline, you will first need to set up your linked services, which will serve as the Source and Sink for your pipeline. For this blog, we use pre-existing connections that have already been discussed, but feel free to use whichever source you would like for your Source connection.

To do so, launch Azure Data Factory Studio and select the buttons in the order shown below:

[Image: Azure Data Factory Studio launch]

Step 2: Setting up your Source (REST):

[Image: Setting up your Source (REST)]
[Image: Edit (REST)]

After clicking on the briefcase icon with the wrench in it, click New. Then type in or locate REST, as that will be the source for your dataset. After you select Continue, fill in all of the information and click Test Connection (located on the bottom right). Once your test connection passes, click Save to confirm your credentials.
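
For reference, the JSON that ADF generates behind this form looks roughly like the Python dictionary below. The property names follow Microsoft's REST linked service documentation, but treat this as an illustrative sketch; the name, service principal, and tenant values are placeholders.

  # Illustrative sketch of a REST linked service definition (placeholder values).
  rest_linked_service = {
      "name": "LS_PowerBI_REST",
      "properties": {
          "type": "RestService",
          "typeProperties": {
              "url": "https://api.powerbi.com/v1.0/myorg/",
              "enableServerCertificateValidation": True,
              "authenticationType": "AadServicePrincipal",
              "servicePrincipalId": "<application-id>",
              "servicePrincipalKey": {"type": "SecureString", "value": "<client-secret>"},
              "tenant": "<tenant-id>",
              "aadResourceId": "https://analysis.windows.net/powerbi/api",
          },
      },
  }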

Step 3: Setting up your Sink (Snowflake):

Repeat Step 1 and then search for your data store; this can be any sink you would like to set up. In our case, we went with Snowflake.

*Reminder: Your Snowflake account must be hosted on Azure for this to succeed.*

[Image: New linked service]
[Image: Edit linked service]

If you are not familiar with where to find this information within Snowflake, you can refer to the image below:

[Image: Information within Snowflake]
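
If you prefer to confirm these values with a query instead of the UI, a minimal sketch using the snowflake-connector-python package (connection parameters are placeholders) looks like this:

  import snowflake.connector

  # Placeholders: use your own Azure-hosted Snowflake account and credentials.
  conn = snowflake.connector.connect(
      account="<account_identifier>",  # for example, an Azure-region account locator
      user="<user>",
      password="<password>",
  )

  # Confirm the account, region, warehouse, and database the linked service will need.
  cur = conn.cursor()
  cur.execute(
      "SELECT CURRENT_ACCOUNT(), CURRENT_REGION(), CURRENT_WAREHOUSE(), CURRENT_DATABASE()"
  )
  print(cur.fetchone())
  conn.close()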

Step 4: Setting up your Blob Storage for Staging

Repeat Step 1 and set up your Blob Storage.

[Image: Edit Blob Storage linked service]

Step 5: Building Your Copy Activity

Bring in the Copy Data Activity. Feel free to name it as you wish.

[Image: Copy Data activity]

Connecting Your Source

After naming your Activity, you can then proceed to the “Source” tab.

[Image: Naming your activity]

Within the steps for “Source” you need to:

  1. Select your Linked Service that is going to be your Source dataset
  2. Proceed with adding in the required information for it.

Connecting your Sink (Snowflake)

Next we will walk through the steps to set up your “Sink” tab.

[Image: Setting up your Sink]

Within the steps for “Sink” you need to:

  1. Select your Sink dataset
  2. If you do not have a sink dataset, select the “Open” pencil icon
  3. Select the linked service that you set up earlier
  4. Select the table you would like to load the data into
  5. If you do not have a table within Snowflake, you can create one for your dataset (see the sketch after this list).
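
If you need to create that table first, a minimal sketch using snowflake-connector-python is shown below; the table and column names are hypothetical placeholders for whatever fields your REST source actually returns.

  import snowflake.connector

  # Placeholders: connection parameters, table name, and columns are illustrative only.
  conn = snowflake.connector.connect(
      account="<account_identifier>",
      user="<user>",
      password="<password>",
      warehouse="<warehouse>",
      database="<database>",
      schema="<schema>",
  )

  conn.cursor().execute("""
      CREATE TABLE IF NOT EXISTS POWER_BI_ACTIVITY_EVENTS (
          ID            STRING,
          CREATION_TIME TIMESTAMP_NTZ,
          USER_ID       STRING,
          ACTIVITY      STRING,
          ITEM_NAME     STRING
      )
  """)
  conn.close()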

Mapping Your Fields

Once you have set up your source table, you will need to map all fields in your ADF pipeline, which you can do by navigating to the “Mapping” tab.

After this step you will be mapping all fields as shown below: 

[Image: Mapping fields]
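
Under the hood, the mapping you build here is stored on the Copy activity as a translator. The sketch below shows roughly what that looks like for a REST source feeding a Snowflake table; the JSON paths, collection reference, and column names are hypothetical and would need to match your own payload.

  # Illustrative sketch of a Copy activity field mapping (hypothetical fields).
  translator = {
      "type": "TabularTranslator",
      "mappings": [
          {"source": {"path": "$['id']"}, "sink": {"name": "ID"}},
          {"source": {"path": "$['creationTime']"}, "sink": {"name": "CREATION_TIME"}},
          {"source": {"path": "$['userId']"}, "sink": {"name": "USER_ID"}},
      ],
      "collectionReference": "$['activityEventEntities']",
  }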

Once you have mapped all of your fields you can proceed to the “Settings” tab.

Settings (Copying to Blob Storage Staging)

Once you are in the Settings tab, enable staging, connect the Blob Storage linked service you created earlier in this blog, and use the storage path of your choice.

[Image: Blob Storage linked service staging settings]

Ensure these settings match; otherwise, you may run into some unexpected errors.
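
Put together, the staged copy portion of the activity's definition looks roughly like the Python dictionary below; the activity, linked service, and path names are placeholders, and the exact properties ADF generates may vary.

  # Illustrative sketch of a staged Copy activity definition (placeholder names).
  copy_activity = {
      "name": "CopyPowerBIActivityToSnowflake",
      "type": "Copy",
      "typeProperties": {
          "source": {"type": "RestSource"},
          "sink": {"type": "SnowflakeSink"},
          "enableStaging": True,
          "stagingSettings": {
              "linkedServiceName": {
                  "referenceName": "LS_BlobStaging",
                  "type": "LinkedServiceReference",
              },
              "path": "adf-staging",
          },
      },
  }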

Step 6: Testing/Automating Your Pipeline

To wrap up, you can test your pipeline by hitting the “Debug” button and selecting the number of previous days you would like to bring in.

[Image: Debug]

To automate your pipeline, you can set up a trigger with your desired refresh schedule. A once-daily trigger is recommended to pull the prior day’s data.

[Image: Edit trigger]
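
If it helps to see what a once-daily trigger looks like outside the UI, its definition is roughly the dictionary below; the trigger name, pipeline name, and start time are placeholders.

  # Illustrative sketch of a daily schedule trigger (placeholder names and times).
  daily_trigger = {
      "name": "TR_Daily_PowerBI_Load",
      "properties": {
          "type": "ScheduleTrigger",
          "typeProperties": {
              "recurrence": {
                  "frequency": "Day",
                  "interval": 1,
                  "startTime": "2022-11-23T06:00:00Z",
                  "timeZone": "UTC",
              }
          },
          "pipelines": [
              {
                  "pipelineReference": {
                      "referenceName": "PL_PowerBI_To_Snowflake",
                      "type": "PipelineReference",
                  }
              }
          ],
      },
  }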

Limitations

Please make sure to keep these limitations in mind when creating your pipelines:

[Image: Limitations]

Closing

Congratulations! You have successfully set up your connection from ADF to Snowflake! You can now feed your data directly into Snowflake and analyze it further from there. If you would like to automate your pipeline, refer back to Step 6 above.

For more information on building out ADF pipelines to ingest data and feed it to Snowflake, or with questions about building pipelines using ADF for your organization, please reach out to us.
