Do you have a Power BI dataset that gets its data from a dataflow? Have you ever wondered, "Can I get the dataset refreshed only after the dataflow refresh has completed successfully?" The answer is yes, you can. One of the recent updates from the Power BI data integration team makes this possible. Let's see in this blog and video how it works.
If you are using both dataflows and datasets in your Power BI architecture, then your datasets are very likely getting part of their data from Power BI dataflows. It would be great if you could get the Power BI dataset refreshed right after a successful refresh of the dataflow. In fact, you can build a scenario like the one below.
Power Automate connector for dataflow
Power Automate recently announced the availability of a connector that allows you to trigger a flow when a dataflow refresh completes.
Choosing the dataflow
You can then choose the workspace (or environment, if you are using Power Platform dataflows) and the dataflow.
Condition on success or fail
A dataflow refresh can succeed or fail, and you can choose the appropriate action for each case. To do this, add a condition that checks whether the refresh result is Success.
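The condition step boils down to simple branching logic. A minimal sketch of that routing in Python, where the payload shape (a `status` field holding `"Success"` or a failure value) is an assumption about what the dataflow trigger exposes, not an exact schema:

```python
def route_refresh_result(result: dict) -> str:
    """Decide which branch of the flow to take based on the
    dataflow refresh outcome (assumed 'status' field)."""
    if result.get("status") == "Success":
        return "refresh_dataset"   # the Yes branch of the condition
    return "log_failure"           # the No branch of the condition


# Example: a successful refresh routes to the dataset refresh action.
print(route_refresh_result({"status": "Success"}))
```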
Refresh Power BI dataset
In the event of a successful dataflow refresh, you can then trigger a refresh of the Power BI dataset.
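If you prefer scripting this step instead of using the built-in action, the same dataset refresh is exposed through the Power BI REST API. A minimal sketch in Python, assuming you already have an access token (for example, obtained through Azure AD authentication); the workspace and dataset GUIDs are placeholders:

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"


def dataset_refresh_request(group_id: str, dataset_id: str,
                            token: str) -> urllib.request.Request:
    """Build the POST request that queues a dataset refresh
    (Refresh Dataset In Group endpoint)."""
    url = f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    return urllib.request.Request(
        url,
        method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
    )


# req = dataset_refresh_request("<workspace-guid>", "<dataset-guid>", token)
# urllib.request.urlopen(req)  # HTTP 202 means the refresh was queued
```

The actual call is left commented out because it needs a live tenant and a valid token; the builder function shows the endpoint and payload shape.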
Refreshing a Power BI dataset through Power Automate is an ability we have had in the service for some time.
Capture the failure
You can also capture the failure details and send a notification (or add a record to a database table for further log reporting) in case of failure.
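Capturing the failure amounts to flattening the trigger's output into a row you can append to a log table. A small sketch; the field names (`dataflowName`, `status`, `endTime`) are illustrative assumptions about the trigger payload, not an exact schema:

```python
def failure_log_record(result: dict) -> dict:
    """Flatten the failure details from the dataflow trigger
    into a row for a log table. Field names are assumed."""
    return {
        "dataflow": result.get("dataflowName"),
        "status": result.get("status", "Failed"),
        "endTime": result.get("endTime"),
    }


# Example row that could be inserted into a logging table.
row = failure_log_record({"dataflowName": "Sales Staging",
                          "status": "Failed",
                          "endTime": "2021-02-01T06:00:00Z"})
```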
The overall flow is really simple, but it gives effective control over the refresh, as you can see below.
Making sure that the dataset refresh happens after the dataflow refresh has been one of the challenges for Power BI developers who use dataflows. Now, with this simple functionality, you can streamline the refresh process all the way from the dataflow.
Dataflow refresh can also be run as an action in Power Automate, which can be useful in some scenarios, such as refreshing a dataflow after a certain event.
This is useful not only for refreshing a Power BI dataset after a dataflow, but also for refreshing one dataflow after another. In dataflow best-practice scenarios especially, I always recommend having layers of dataflows for staging, data transformation, etc., as I explained in the article below.
Although dataflow is not a replacement for a data warehouse, features like this improve the usability and adoption of this great transformation service.
Can you think of any scenarios you would use this for? Let me know in the comments below; I'd love to hear about them.