Power BI on a flat GA4 table —
via BigQuery, or via Azure Blob
if you want to skip GCP.
Power BI sits in a different cloud than the GA4 export. That is not a deal-breaker, but it shapes the choice. Connect to BigQuery directly and you keep one source. Connect to the Decode events_external Parquet output via Azure Blob and you avoid cross-cloud queries entirely. Either way, the events arrive flat: every parameter is a direct column in the model, partition_date is the filter, and Power Query stops being a place where you maintain unnesting logic.
Power BI is the BI tool of choice in Microsoft-aligned organisations. The raw GA4 export was not designed for it.
The shape of the problem
Power Query handles flat tables well and nested ones badly. The BigQuery connector returns the events_* schema with event_params as a list of records, which Power Query renders as a column of [Table] values. Expanding it requires a series of M-language steps — expand the list, expand each record, pivot keys to columns. The query gets long, slow, and hard to inherit. Every refresh runs the same expansion against an ever-larger volume of events.
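For a sense of scale, here is a trimmed sketch of the M this forces. Project, dataset and step names are illustrative, and a real query also has to coalesce string_value, int_value and double_value per key:

```m
// Sketch only: expansion against the raw events_* export.
// "my-project" and "analytics_123456789" are placeholders.
let
    Source = GoogleBigQuery.Database(),
    Project = Source{[Name = "my-project"]}[Data],
    Dataset = Project{[Name = "analytics_123456789", Kind = "Schema"]}[Data],
    Events = Dataset{[Name = "events_20250101", Kind = "Table"]}[Data],

    // One expand per nesting level of event_params
    ExpandList = Table.ExpandListColumn(Events, "event_params"),
    ExpandKeys = Table.ExpandRecordColumn(ExpandList, "event_params", {"key", "value"}),
    ExpandValues = Table.ExpandRecordColumn(ExpandKeys, "value", {"string_value"}),

    // Pivot parameter keys back into columns, one row per event again.
    // Simplified: duplicate keys or non-string values break this step.
    Pivoted = Table.Pivot(ExpandValues, List.Distinct(ExpandValues[key]), "key", "string_value")
in
    Pivoted
```

And this is only event_params; items and ecommerce need the same treatment.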
Why this hurts Power BI specifically
Two reasons. First, Power BI Service refresh windows are bounded — long Power Query expansions push you into incremental refresh territory before you would otherwise need it. Second, the model file ends up tightly coupled to the GA4 schema. A new event parameter is invisible until somebody updates the M code. The model looks healthy. The data is incomplete.
What changes with Decode GA4
Decode writes a flat events table into a BigQuery dataset of your choice, and a Parquet copy in GCS. From Power BI you choose the connector that fits your stack — BigQuery if you want a single source, Azure Blob with the Parquet output if you want to avoid cross-cloud queries entirely. In both cases, the model sees flat columns. Power Query becomes a thin layer that filters partition_date and selects the columns you need.
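Against the decoded table, the whole query can look like this sketch (project, dataset, column names and the example date are placeholders):

```m
// Sketch: the entire Power Query step list against the decoded table.
let
    Source = GoogleBigQuery.Database(),
    Dataset = Source{[Name = "my-project"]}[Data]{[Name = "decoded_ga4", Kind = "Schema"]}[Data],
    Events = Dataset{[Name = "events", Kind = "Table"]}[Data],

    // First step: a partition_date filter that folds into the BigQuery query
    Recent = Table.SelectRows(Events, each [partition_date] >= #date(2025, 1, 1)),

    // Then select columns. No expansion steps anywhere.
    Selected = Table.SelectColumns(Recent,
        {"partition_date", "event_name", "event_param.page_location", "geo.country"})
in
    Selected
```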
The traditional approach
Expand event_params in Power Query
Use the BigQuery connector against events_*, expand the event_params column, pivot keys, repeat for items, repeat for ecommerce. The M code is long, slow on refresh, and breaks silently when GA4 adds a parameter you have not handled.
Pre-flatten in BigQuery, then connect
Stand up a scheduled query in BigQuery that flattens events_* into a normal table. Point Power BI at that table. Now you maintain a transformation pipeline in BigQuery whose only purpose is to make Power BI happy.
Use a third-party GA4 → Power BI connector
Subscribe to a SaaS connector that hits the GA4 API and pushes data into a Power BI dataset. You get sampled data, API quotas, vendor lock-in and a recurring bill — and you skip the BigQuery export, which is usually why you set it up in the first place.
Decode GA4 vs Power Query unnesting
| Feature | Decode GA4 events table | Hand-built Power Query |
|---|---|---|
| M steps to surface page_location | Pick the column | Expand, pivot, rename |
| Refresh duration | Filtered partition reads | Full UNNEST and pivot per refresh |
| New GA4 parameter handling | Appears as a new column | Manual M edit required |
| DirectQuery viability | Works on partitioned table | Practical limit on UNNEST queries |
| Cross-cloud option | Yes — Azure Blob Parquet | BigQuery connector only |
| Maintenance over a year | Zero | Recurring, every schema shift |
One install. Two connection options. Power BI handles the rest.
1. Subscribe via Google Cloud Marketplace
Decode GA4 is a Marketplace listing. Usage-based pricing, no monthly minimum. The subscription takes under a minute and billing appears on your existing GCP invoice.
2. Pick a connector path
Option A: Power BI Desktop → Get Data → Google BigQuery. Authenticate with OAuth or a service account JSON via the Advanced options. Option B: Skip BigQuery at the connector layer. Copy the events_external Parquet files from GCS to Azure Blob Storage (via Storage Transfer Service, gsutil, or your preferred mechanism) and connect Power BI through the Azure Blob Storage connector — same flat schema either way.
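A minimal sketch of the Option B query, assuming the Parquet files were copied into a container here named ga4-decoded (account name and paths are placeholders):

```m
// Sketch: read the decoded Parquet copies from Azure Blob Storage.
let
    Source = AzureStorage.Blobs("https://myaccount.blob.core.windows.net/ga4-decoded"),

    // Keep only the Parquet files for the partitions you need
    ParquetOnly = Table.SelectRows(Source, each Text.EndsWith([Name], ".parquet")),

    // Read each file and append; the schema is already flat
    WithData = Table.AddColumn(ParquetOnly, "Data", each Parquet.Document([Content])),
    Combined = Table.Combine(WithData[Data])
in
    Combined
```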
3. Filter on partition_date in Power Query
Apply a partition_date filter as the first step. On the BigQuery path the predicate folds into the generated query; on the Blob path, filter the file list down to the date partitions you need before reading the Parquet. Either way the model never materialises more than the date range you actually use, which is often the difference between a one-minute refresh and an hour-long one.
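As a sketch, a rolling window keeps the range relative instead of hard-coded; the navigation step is compressed to one line and the 90-day window is an arbitrary example:

```m
// Sketch: rolling 90-day window as the first applied step.
let
    Events = GoogleBigQuery.Database(){[Name = "my-project"]}[Data]
        {[Name = "decoded_ga4", Kind = "Schema"]}[Data]{[Name = "events", Kind = "Table"]}[Data],
    WindowStart = Date.AddDays(DateTime.Date(DateTime.LocalNow()), -90),
    Filtered = Table.SelectRows(Events, each [partition_date] >= WindowStart)
in
    Filtered
```

The same pattern carries over to the reserved RangeStart/RangeEnd parameters (converted to dates) if you later adopt incremental refresh.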
4. Build the model and publish
Use partition_date as a date dimension, event_name as a categorical filter, event_param.page_location as a URL field. Publish to Power BI Service, set up a refresh schedule, and ship the report. The BigQuery connector supports DirectQuery, which is worth considering when the date range outgrows a comfortable Import refresh window. The Azure Blob Storage path is Import-only.
HOW THE SETUP WORKS
Wire decoded GA4 into a Power BI report in four small steps. Two connector options, same end state — a flat model with no Power Query expansion logic.
GCP
Run the Decode GA4 installer with the events_external template. Parquet files land in GCS; copy them to Azure Blob if you need that connector path.
Power BI
Get Data → Google BigQuery (service account JSON via Advanced) or Azure Blob Storage if you copied the Parquet files across.
Power Query
Filter partition_date as the first step so the predicate pushes down. Select only the columns the report needs.
Power BI Service
Publish, configure a gateway if Import mode, set the refresh schedule.
The events table is a BigQuery external table backed by Parquet files in GCS. Power BI reads it through Microsoft's connector — no third-party services, no copy of your data outside the storage you control.
What you get in Power BI
A flat field list in the model
event_param.page_location, event_param.page_title, geo.country, device.category — every standard parameter is a column the model picks up directly.
Two valid connector paths
BigQuery if you want one source, Azure Blob over Parquet if you want to avoid cross-cloud queries. The schema is the same either way; the choice is about cost, latency and where your refresh runs.
partition_date as a real date dimension
No DAX hacks to derive a date. partition_date is a date column on every row, ready to drive a date table relationship.
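As a sketch, a one-line DAX date table can span it, assuming the decoded table is loaded as events:

```dax
-- Sketch: a minimal date table spanning the loaded partitions
Dates = CALENDAR ( MIN ( events[partition_date] ), MAX ( events[partition_date] ) )
```

Relate Dates[Date] to events[partition_date] and mark Dates as the model's date table.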
DirectQuery on the BigQuery connector
The BigQuery connector supports DirectQuery. With partition_date filters in place, DirectQuery on the events table avoids pulling years of GA4 events into the Power BI model. The Azure Blob Storage path is Import-only.
Predictable refresh behaviour
Refresh duration is dominated by the partition range and the columns you load, not by Power Query expansion logic. Reports that took 40 minutes to refresh against the raw export typically refresh in single-digit minutes against the decoded table.
Schema evolution that just works
When GA4 adds a new event parameter, the next decode run picks it up. Refresh the dataset schema in Power BI and the field is available to add to visuals.
Cross-channel marketing reports
Combine decoded GA4 events with Microsoft Ads, Dynamics 365 or your CRM tables in the same Power BI model. The events arrive flat, so joins use real columns rather than expanded record fields.
Funnel and conversion dashboards
Standard ecommerce funnels (view_item, add_to_cart, begin_checkout, purchase) become DAX measures over event_name. Each step is a distinct count of session IDs filtered to that step's event, not a stack of M expansion steps underneath.
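One step of that funnel, sketched in DAX; the table name events and the flattened session column are assumptions about how you load the decoded table:

```dax
-- Sketch: distinct sessions that reached the add_to_cart step
Add to Cart Sessions =
CALCULATE (
    DISTINCTCOUNT ( events[event_param.ga_session_id] ),
    events[event_name] = "add_to_cart"
)
```

Clone it per step and the funnel visual is four measures with nothing but flat columns underneath.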
Cross-cloud reporting without GCP exposure
If your wider organisation does not use BigQuery, copy the events_external Parquet files to Azure Blob and point Power BI at the Storage account. The report stays in the Microsoft stack, with the same flat schema as the BigQuery events table.
Should I use BigQuery or Azure Blob as the connector?
It depends. If your stack is comfortable with cross-cloud queries and your team manages GCP, BigQuery keeps everything in one source. If you want refresh and processing to live entirely inside Azure, the events_external Parquet output via Azure Blob is the cleaner path. See setup →
Import or DirectQuery?
The BigQuery connector supports both. Import is simpler and faster for smaller date ranges. DirectQuery with partition_date filters becomes worth considering as the date range grows past what fits comfortably in an Import refresh — it avoids pulling years of data into the model. The Azure Blob Storage connector is Import-only.
Do I need an on-premises gateway?
For BigQuery and Azure Blob connectors against cloud storage, no gateway is required for refresh in Power BI Service. A gateway is only needed if you join against on-premises sources. Full prerequisites →
Can I still use my existing GA4 measures and DAX?
Yes. The column names map cleanly across: page_location, event_name, ga_session_id and so on. Most teams can swap the source and keep their DAX measures untouched.
Deploy in under 5 minutes
Power BI on GA4,
without the Power Query unnesting.
Subscribe via Google Cloud Marketplace, choose between the BigQuery and Azure Blob connector paths, and have a flat events model ready in Power BI before the end of the day.
Google Cloud Marketplace · Usage-based · No monthly minimum