October 27, 2022

How to Get Activity Data From Power BI Service with REST APIs

By Ahmed Ansari

In this blog, we’ll cover the importance of gathering event activity data and how to use the Power BI REST APIs effectively to do so. This process covers everything from understanding the different operations within the Power BI REST APIs to configuring Azure Data Factory Studio to pull the data.

Before we begin, here is a basic review of terminology that will be used frequently in this article:

  • Power BI REST APIs
  • API
  • Service Principal
  • Azure Data Factory 

What are Power BI REST APIs?

Power BI REST APIs provide service endpoints for users to be able to embed, administer, govern, and perform user functions across the tenant.

Utilizing some of these Power BI REST APIs allows an organization to take its Power BI environment past some of the limitations of premade reports, such as historical reporting beyond 30 days or dataset lineage. Very often, you will see that these Power BI REST APIs add tremendous value to any organization looking to streamline its operations.

What is an API?

API stands for Application Programming Interface. This is a type of software interface that allows two applications to speak to each other. To put it simply, this is a means for you to pass and get information through a system without having to look at the internal workings.

What is a Service Principal?

A service principal is an identity created within Azure to access Azure resources such as applications, tools, and hosted services. To learn more about creating a service principal, please refer to the documentation created by Microsoft.

In essence, that documentation walks you through using the Azure portal to create an app registration, which gives you an application ID, and creating a client secret; you will need both for a step later in this article.
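If you would like to sanity-check the app registration before wiring it into Azure Data Factory, one option is to request a token for the Power BI API using the client credentials flow. The sketch below is a minimal, hypothetical example using the Python requests library; the tenant ID, application ID, and client secret placeholders are yours to fill in:

```python
import requests

# Placeholder values -- substitute your own tenant ID, application (client) ID,
# and client secret from the app registration.
TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-application-id>"
CLIENT_SECRET = "<your-client-secret>"

# Client credentials flow against Azure AD for the Power BI API.
token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
payload = {
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "scope": "https://analysis.windows.net/powerbi/api/.default",
}

response = requests.post(token_url, data=payload)
response.raise_for_status()
access_token = response.json()["access_token"]
print("Token acquired, expires in", response.json()["expires_in"], "seconds")
```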

What is Azure Data Factory?

Azure Data Factory is a tool used to create an ETL process that calls your Power BI REST API to GET the data, which you then store in a destination of your choice. We used the Snowflake Data Cloud; however, you are more than welcome to copy it to any destination of your choice.

Be sure to stay tuned for our upcoming blog on how to write data to Snowflake from Azure Data Factory!

Prerequisites

Before we begin talking about how to get your activity data from the Power BI Service, there are a few prerequisites to ensure you can configure this for your organization.

You will need:

  • A Power BI Administrator Role 
  • Contributor Permissions on Azure Data Factory
  • Tenant ID
  • Service Principal Application ID
  • Client Secret Value

Before you begin, you will need to go to the Power BI Admin Portal and enable the following:

Activity Data

Ensure that you are enabling it only for a specific security group using your Azure Active Directory.

How to Get Activity Data From Power BI Service

To begin pulling your data, you will need to make your way to the Microsoft Power BI Service REST API documentation. From here, you can work with any of the Power BI Service REST APIs; however, we are going to specifically address the Get Activity Events API.
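Before moving into Azure Data Factory, it can help to see what this call looks like when made directly. Here is a rough Python sketch, assuming an access token obtained as in the earlier snippet and a one-day window for yesterday; it follows continuationUri until the API signals the last result set:

```python
import requests
from datetime import date, timedelta

# Assumes an access token acquired as in the earlier token snippet.
access_token = "<access-token-from-the-earlier-snippet>"

# Hypothetical one-day window: the Activity Events API requires that the
# start and end times fall within the same UTC day.
target_day = (date.today() - timedelta(days=1)).isoformat()
url = (
    "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
    f"?startDateTime='{target_day}T00:00:00.000Z'"
    f"&endDateTime='{target_day}T23:59:59.999Z'"
)
headers = {"Authorization": f"Bearer {access_token}"}

events = []
while url:
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    body = response.json()
    events.extend(body.get("activityEventEntities", []))
    # Results are paged; follow continuationUri until lastResultSet is true.
    url = None if body.get("lastResultSet") else body.get("continuationUri")

print(f"Pulled {len(events)} activity events for {target_day}")
```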

Setting up your Linked Services

To begin configuring your ADF pipeline, you will need to set up your linked services, which will serve as the Source and Sink for your pipeline. To do so, launch Azure Data Factory Studio and select the buttons in the order shown below:

Step 1:

(Screenshot: navigating to linked services in Azure Data Factory Studio)

Setting up your Source (REST):

Step 2:

(Screenshot: selecting REST as the new linked service type)

After clicking on the briefcase icon with the wrench in it (the Manage hub), click on New. Then type in or locate REST, as that will be the source for your dataset. After you select Continue, fill in all of the information and click on Test Connection (located on the bottom right). Once your test connection passes successfully, click on Save to confirm your credentials.

Setting Up Your Sink (Snowflake):

Repeat Step 1:

Step 3: Search for your data store; this may be any sink you would like to set up. In our case, we went with Snowflake.

*Reminder: Your Snowflake account must be hosted on Azure for this to be successful*

(Screenshot: configuring the Snowflake linked service)

If you are not familiar with where to find this information within Snowflake, you can refer to the image below:

(Screenshot: locating your Snowflake account information)

Setting up your Azure Data Factory Pipeline

To begin setting up your Azure Data Factory pipeline, you will need to go into Azure Data Factory Studio and select the Author button.

Once selected, you will create a new pipeline and build it out.

Building Your ForEach Activity

After bringing in your ForEach activity, click outside of it to fill in your pipeline parameters. After adding a parameter with a name of your choice, the type Int, and a value for how many days back you would like to pull, select the ForEach activity and the display will change as pictured below:

(Screenshot: ForEach activity settings)

Underneath Settings, click on “Add dynamic content [Alt+Shift+D]” and enter “@range(1,pipeline().parameters.NameOfYourChoice)”. Remember that “NameOfYourChoice” will be whatever you labeled your parameter in the prior step.
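If you want a quick sanity check on what that expression produces: ADF’s range(startIndex, count) function returns count sequential integers starting at startIndex, so the ForEach will iterate once per day of history. A rough Python equivalent, with a hypothetical parameter value, might look like this:

```python
# Hypothetical value for the pipeline parameter (how many days of history to pull).
days_to_pull = 7

# ADF's range(startIndex, count) returns `count` integers starting at `startIndex`,
# so @range(1, days_to_pull) yields one item per day: [1, 2, ..., days_to_pull].
items = list(range(1, 1 + days_to_pull))
print(items)  # [1, 2, 3, 4, 5, 6, 7]
```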

After this is completed, you will move into the next step by selecting the Pencil icon within the Activities tab as shown below:

(Screenshot: the Pencil icon on the ForEach Activities tab)

Building your Copy Activity

Once you select the icon, you will be moved to a blank canvas, where you will bring in the Copy Data activity. Feel free to name it as you wish.

Setting Up Your Source (REST API)

After naming your Activity, you can then proceed to the “Source” tab.

(Screenshot: the Copy Data activity “Source” tab)

Within the steps for “Source” you need to:

  1. Select your Linked Service (REST)
  2. Add in activityevents?startDateTime='@{formatDateTime(getPastTime(item(), 'Day'), 'yyyy-MM-dd')}T00:00:00.000Z'&endDateTime='@{formatDateTime(getPastTime(item(), 'Day'), 'yyyy-MM-dd')}T23:59:59.999Z' where it says “AdminAPI” (see the sketch after this list for what this expression resolves to)
  3. Underneath “Pagination Rules,” add an AbsoluteUrl rule that reads “continuationUri” from the response body

These are all of the steps you need to set up your “Source.”
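If it helps to see what the relative URL in step 2 resolves to, here is a small Python illustration of the equivalent logic for a single ForEach iteration; item stands in for the integer the iteration receives from the @range expression, and getPastTime(item(), 'Day') resolves to that many days before the current UTC time:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical iteration value from @range(1, DaysToPull); 1 means "yesterday".
item = 1

# Equivalent of formatDateTime(getPastTime(item(), 'Day'), 'yyyy-MM-dd').
day = (datetime.now(timezone.utc) - timedelta(days=item)).strftime("%Y-%m-%d")

# The one-day window the Copy activity requests from the Activity Events API.
relative_url = (
    f"activityevents?startDateTime='{day}T00:00:00.000Z'"
    f"&endDateTime='{day}T23:59:59.999Z'"
)
print(relative_url)
```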

Setting Up Your Sink (Snowflake)

Next we will walk through the steps to set up your “Sink” tab.

Within the steps for “Sink” you need to:

  1. Select your Sink Dataset
  2. *If you do not have a sink dataset* Select the “Open” Pencil Icon
  3. Select your Linked Service that you set up earlier
  4. Select the table you would like to load the data into

Final Setup (Copy Activities)

If you are using Snowflake as your “Sink” dataset, you will also need to enable staging and connect it to your Azure Blob Storage, as well as add Source – @{concat('activityevents?startDateTime="', formatDateTime(adddays(utcnow(), -7), 'yyyy-MM-dd'), 'T00:00:00.000Z"&endDateTime="', formatDateTime(adddays(utcnow(), -7), 'yyyy-MM-dd'), 'T23:59:59.999Z"')} underneath your “User Properties” tab.
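For context, entries under the “User Properties” tab are annotations that surface alongside the activity run in ADF monitoring. A rough Python equivalent of what that concat expression resolves to (a window fixed at seven days before the run, in this sketch) looks like this:

```python
from datetime import datetime, timedelta, timezone

# Equivalent of formatDateTime(adddays(utcnow(), -7), 'yyyy-MM-dd').
seven_days_ago = (datetime.now(timezone.utc) - timedelta(days=7)).strftime("%Y-%m-%d")

# What the concat(...) expression under "User Properties" evaluates to.
source_property = (
    f'activityevents?startDateTime="{seven_days_ago}T00:00:00.000Z"'
    f'&endDateTime="{seven_days_ago}T23:59:59.999Z"'
)
print(source_property)
```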

You will also need to map your dataset. You can learn more about how to do that in the Writing Data to Snowflake from Azure Data Factory blog.

Testing/Automating your Pipeline

To wrap up, you can test out your pipeline by hitting the “Debug” button and selecting the number of previous days you would like to bring in.

To automate your pipeline, you can set up a trigger with your desired refresh times. A once-daily run is recommended to pull the prior day’s data.


No longer bound by the 30-day limitation of default usage reporting, you can now build out Report Adoption Metrics, Dataset Lineage, and more!

Closing

Congratulations! You should now be able to capture, store, and build reports on your Power BI Event Activity Metrics beyond the default 30-day window!

For more Power BI Service REST APIs, you can refer to Microsoft’s Power BI REST API documentation.

FAQ

Question: Why is this beneficial to my organization?

Answer: Most organizations spend a lot of time, money, and effort building out custom reports, dashboards, and analytics, only to see them fall through the cracks. When you have a report adoption monitor, you will be able to see which reports are actively consumed, by which business units, and more!

