Power Platform Dataflow: A Practical Replacement for Power BI Dataflow Gen1

Microsoft has announced that Power BI Dataflow Gen1 (analytical dataflow) is being retired. If your organization is not ready to move to Microsoft Fabric — whether due to budget constraints or strategic timing — you still have a solid option: Power Platform Dataflow, also known as Standard Dataflow, created inside the Power Apps portal.

In this article, I’m going to show you exactly how Power Platform Dataflow works, how it compares to Power BI Dataflow Gen1, and how you can use it as a cost-effective bridge solution.

🎬 Watch the full walkthrough: Power Platform Dataflow as a Gen1 Replacement


Why Is Power BI Dataflow Gen1 Being Retired?

Microsoft’s Group Product Manager, Miguel Llopis, explained the reasoning in the official blog announcement. Dataflow Gen2 — especially with CI/CD support — is significantly easier to maintain from Microsoft’s perspective and offers far more capabilities when paired with a Fabric capacity. This is the natural evolution of the product.

That said, the announcement is specifically about disabling the creation of new Dataflow Gen1 instances. Your existing dataflows will continue to be supported. The change affects only your ability to create net-new Gen1 dataflows going forward.

🎬 Watch: Microsoft’s announcement and what it means — 0:00


What Are Your Options After Gen1 Retirement?

You have a few realistic paths:

  • Move to Microsoft Fabric (Dataflow Gen2). This is the best long-term move. If Fabric was already on your roadmap, now is the right time to accelerate that migration. If you are already paying for Azure SQL Database or Azure Data Factory, the cost delta to an F2 Fabric capacity (or a reserved instance) may be smaller than you expect.
  • Use Power Platform Dataflow (Standard Dataflow). If moving to Fabric is not on the table yet — budget is tight, or the organization is not ready — this is your practical alternative. It requires only an Office 365 license for Power Apps, which is typically included in your existing Microsoft 365 subscription.

This article focuses on that second path.

🎬 Watch: Exploring your options after Gen1 retirement — 0:50


Understanding the Dataflow Landscape: Gen1 Versions Explained

I have a separate video that covers the differences between all dataflow versions in full detail — watch it here. For this article, here’s the quick version you need to know.

Before going further, it helps to understand the two flavors of Dataflow Gen1:

Analytical Dataflow — this is the one built inside Power BI Service. It supports AI functions, writes its output as CSV files into Azure Data Lake Storage Gen2, and is the version being retired.

Standard Dataflow — this is the one built inside the Power Apps / Power Platform portal. It writes data into Dataverse (the Common Data Model database behind Power Apps) and uses the same Power Query transformation engine you already know from Power BI.

Standard Dataflow itself comes in two versions: V1 and V2. Today, when you create a new dataflow in Power Platform, you are creating a Standard V2 dataflow. That is exactly what we are going to use.

🎬 Watch: Analytical vs Standard Dataflow explained — 1:53


How to Create a Power Platform Dataflow

Step 1: Go to the Power Apps Portal

Navigate to make.powerapps.com. This is your Power Platform portal — the same place you manage Power Apps, Dataverse tables, and now your dataflows.

Inside the portal, look for the Dataflows section in the left navigation. If you don’t see it, click More and pin it to your navigation.

🎬 Watch: Navigating to Power Platform portal and finding Dataflows — 4:03

Step 2: Create a New Dataflow

Click New dataflow and give it a meaningful name. When creating the dataflow, you will be asked whether you want a Standard or Analytical type.

In my experience, more than 90% of real-world dataflows don't need the AI functions that analytical mode provides. Go with Standard.

Standard gives you the same Power Query transformation experience — merges, appends, column operations, data type changes — without any extra complexity.

🎬 Watch: Creating a new Standard Dataflow and choosing Standard vs Analytical — 5:44

Step 3: Connect to Your Data Source

Power Platform Dataflow supports the same data sources as Power BI Dataflow Gen1. In my demo, I connected to an Excel file via URL. The experience is identical — paste the URL, select your tables, click Transform Data.

This opens the Power Query Online editor — the same editor you use in Power BI dataflows. You get the same diagram view (click the diagram view icon in the bottom-right corner if it’s not showing by default), the same merge and append operations, and the same M code under the hood.
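Under the hood, the connection step is ordinary M. Here is an illustrative sketch of what Power Query Online generates when you paste a URL to an Excel file (the file URL and table name below are placeholders, not from the demo):

```m
// Connect to an Excel workbook hosted at a URL and navigate to one table.
// The URL and the "DimProduct" table name are illustrative placeholders.
let
    Source = Excel.Workbook(
        Web.Contents("https://contoso.sharepoint.com/files/Sales.xlsx"),
        null,
        true),
    DimProduct = Source{[Item = "DimProduct", Kind = "Table"]}[Data]
in
    DimProduct
```

Because it is the same engine, any M you already have in a Gen1 dataflow can be pasted into the Advanced Editor here unchanged.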

🎬 Watch: Connecting to Excel and opening Power Query Online — 6:27

Step 4: Transform Your Data

Let’s say you want to merge a Product table with its Subcategory and Category tables to produce a flat, denormalized product dimension. The process is exactly what you’d do in Power BI:

  1. Merge as New — join Product with DimProductSubcategory on the subcategory key.
  2. Select only the columns you need; remove the rest.
  3. Expand the structured column from the subcategory table to pull in the subcategory name and category key.
  4. Merge again — join the result with DimProductCategory on the category key.
  5. Select English category name, remove the category key column, rename the query to Product.
  6. For intermediate tables (like DimProduct, DimProductCategory, DimProductSubcategory) that you only used as lookup references — disable their load. This is important. Only tables you want written to Dataverse should have load enabled.

Do not leave intermediate staging tables with load enabled: they will create unnecessary tables in Dataverse and clutter your environment.
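The merge steps above can be sketched in M. This is a hedged sketch, not the exact demo code: the table names follow the AdventureWorks-style naming in the walkthrough, and the column names (subcategory and category keys and names) are assumptions that may differ in your data.

```m
// Build a flat Product dimension from DimProduct, DimProductSubcategory,
// and DimProductCategory. Table and column names are illustrative.
let
    Source = DimProduct,
    // Step 1: join Product to its subcategory table on the subcategory key
    MergedSub = Table.NestedJoin(
        Source, {"ProductSubcategoryKey"},
        DimProductSubcategory, {"ProductSubcategoryKey"},
        "Subcategory", JoinKind.LeftOuter),
    // Step 3: expand the structured column to pull in the
    // subcategory name and the category key
    ExpandedSub = Table.ExpandTableColumn(
        MergedSub, "Subcategory",
        {"EnglishProductSubcategoryName", "ProductCategoryKey"}),
    // Step 4: join the result to the category table on the category key
    MergedCat = Table.NestedJoin(
        ExpandedSub, {"ProductCategoryKey"},
        DimProductCategory, {"ProductCategoryKey"},
        "Category", JoinKind.LeftOuter),
    ExpandedCat = Table.ExpandTableColumn(
        MergedCat, "Category", {"EnglishProductCategoryName"}),
    // Step 5: drop the key column that was only needed for the join
    Product = Table.RemoveColumns(ExpandedCat, {"ProductCategoryKey"})
in
    Product
```

Only this final Product query should have load enabled; the three source queries remain load-disabled lookup references.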

🎬 Watch: Merging tables and disabling load on intermediate queries — 7:35

Step 5: Set the Destination (Key Difference from Gen1)

Here is one of the most important differences between Standard Dataflow and Analytical Dataflow Gen1.

In Gen1, you had no control over where the data lands — it automatically wrote CSV files to Azure Data Lake Storage Gen2. In Power Platform Dataflow, you explicitly map each table to a destination in Dataverse.

For each table you want to load, you can:

  • Load into a new Dataverse table (with a custom name)
  • Load into an existing Dataverse table
  • Set column mappings between your query output and the Dataverse table columns

This is actually more flexibility than Gen1 gave you, not less.

🎬 Watch: Setting the Dataverse destination for each table — 9:53

Step 6: Set the Refresh Schedule and Publish

After configuring your destination mappings, you set the refresh schedule. You can refresh manually or on an automated schedule. Note that the maximum refresh frequency is 48 times per day, which works out to a minimum interval of 30 minutes between refreshes.

Once published, your dataflow shows up in the list with type Standard V2.

🎬 Watch: Setting the refresh schedule and publishing — 11:00


Accessing the Data in Power BI Desktop

Once your dataflow has refreshed and written data into Dataverse, you have two ways to pull it into Power BI Desktop:

Option 1: Dataflow Connector

Go to Get Data → Dataflow. After authenticating, you will see two options: Workspaces and Environments.

  • Workspaces = Power BI / Fabric dataflows
  • Environments = Power Platform / Power Apps dataflows ← this is what you want

Select your environment, find your dataflow, and select the tables.

🎬 Watch: Using the Dataflow connector — Environments vs Workspaces — 12:25

Option 2: Dataverse Connector

Go to Get Data → Power Platform → Dataverse. This connects directly to the Dataverse tables where your dataflow wrote its output.

Be aware: Dataverse automatically adds many system columns to every table — created on, created by, modified on, modified by, and more. These are useful for operational apps but are noise for analytics.

Before loading, always click Transform Data and use Choose Columns to select only the columns you actually created. Your custom columns will have a consistent prefix (a 5-character environment prefix like cre13_). Filter by that prefix and you’ll find your columns quickly. You can also use a simple M transformation to strip that prefix from column names if you want cleaner field names in your model.
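That cleanup can be done with a couple of M steps. This is a sketch under the assumptions above: `cre13_` stands in for your environment's prefix, and `Dataverse_Table` is a placeholder for the navigation step the Dataverse connector generates.

```m
// Keep only your own columns (those carrying the environment prefix)
// and strip the prefix for cleaner field names in the model.
// "cre13_" and "Dataverse_Table" are illustrative placeholders.
let
    Source = Dataverse_Table,
    // Select only the columns whose names start with the environment prefix
    OwnColumns = Table.SelectColumns(
        Source,
        List.Select(
            Table.ColumnNames(Source),
            each Text.StartsWith(_, "cre13_"))),
    // Remove the prefix from every remaining column name
    Renamed = Table.TransformColumnNames(
        OwnColumns,
        each Text.Replace(_, "cre13_", ""))
in
    Renamed
```

If you also need one or two system columns (for example, modified on for incremental loads), add them back explicitly with a second `Table.SelectColumns` list rather than loading everything.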

🎬 Watch: Connecting via Dataverse and filtering out system columns — 14:08


What You Do NOT Lose by Moving to Power Platform Dataflow

A few things worth calling out explicitly:

  • Incremental refresh is supported. The same incremental refresh capability available in Gen1 is available in Power Platform Standard Dataflow. Configure it per table as needed.
  • Power Query template migration is supported. Export your existing Gen1 dataflow as a .pqt (Power Query Template) file, then import that template when creating a new Power Platform dataflow. This is the fastest migration path.
  • Same transformation engine. Merges, appends, fuzzy matching, custom M code — it’s all there.

🎬 Watch: Incremental refresh and Power Query template migration — 15:22


Licensing: What Do You Actually Need?

For Power Platform Dataflow, you need a Power Apps for Office 365 license. In most organizations, this is already included in your Microsoft 365 subscription.

You do not need:

  • Power BI Pro or Premium
  • Microsoft Fabric capacity
  • A paid standalone Power Apps license (for standard dataflow use)

This makes Power Platform Dataflow one of the lowest-cost options available for running scheduled data transformation workloads today.

🎬 Watch: Licensing explained — Power Apps for Office 365 — 11:53


Summary

Power BI Dataflow Gen1 (analytical) is being retired — specifically, the ability to create new ones. If moving to Microsoft Fabric is not immediately feasible, Power Platform Standard Dataflow (V2) is a practical, low-cost alternative.

It uses the same Power Query transformation engine. It supports the same data sources. It supports incremental refresh. It lets you export your existing Gen1 dataflows as Power Query templates and import them directly. The main differences are:

  • Data lands in Dataverse instead of Azure Data Lake Storage Gen2
  • You explicitly configure the destination table for each query
  • You access the data in Power BI via the Dataverse connector or the Dataflow connector (Environments)
  • Dataverse adds system columns — filter them out before loading into Power BI

If Fabric is your long-term direction, start planning that migration. But if budget is the constraint right now, Power Platform Dataflow is a solid, supported bridge — and it runs on a license you likely already have.


Have questions about your dataflow migration strategy? RADACAD provides training and consulting on Power BI and Microsoft Fabric. Feel free to reach out.

Watch the full walkthrough on YouTube: https://www.youtube.com/watch?v=YqJyBVrb_Cc

Reza Rad
Trainer, Consultant, Mentor
Reza Rad is a Microsoft Regional Director, an author, trainer, speaker, and consultant. He has a BSc in Computer Engineering and more than 20 years' experience in data analysis, BI, databases, programming, and development, mostly on Microsoft technologies. He has been a Microsoft Data Platform MVP for 12 continuous years (from 2011 until now) for his dedication to Microsoft BI. Reza is an active blogger and co-founder of RADACAD. He is also co-founder and co-organizer of the Difinity conference in New Zealand, the Power BI Summit, and the Data Insight Summit.
Reza is the author of more than 14 books on Microsoft Business Intelligence, most of them published in the Power BI category, including Power BI DAX Simplified, Pro Power BI Architecture, Power BI from Rookie to Rock Star, the Power Query book series, and Row-Level Security in Power BI.
He is an international speaker at Microsoft Ignite, Microsoft Business Applications Summit, Data Insight Summit, PASS Summit, SQL Saturday, and SQL user groups, and he is a Microsoft Certified Trainer.
Reza's passion is helping you find the best data solution; he is a data enthusiast.
His articles on different aspects of technology, especially Microsoft BI, can be found on his blog: https://radacad.com/blog.
