Looker Studio reports on a
flat events table — no blended
sources, no UNNEST.

Looker Studio is Google's free, browser-based BI tool. It connects natively to BigQuery, which makes it the obvious place to build GA4 dashboards. The catch is the schema. Pointing Looker Studio at the raw events_* export forces you into calculated fields and blended data sources to extract anything useful from event_params. Decode GA4 removes that step. Every parameter is already a direct column in the events table.

Connector: BigQuery (native) · Cost: free · Source: events table · Maintenance: zero calculated fields

Looker Studio is the natural front end for GA4 in BigQuery. The raw export is the wrong shape to use it well.

The shape of the problem

Looker Studio's BigQuery connector is excellent at flat tables. Drag a column onto the canvas, get a chart. The GA4 export is not flat. Every parameter you actually want — page_location, page_referrer, ga_session_id — lives inside a repeated event_params record. The connector cannot drill into nested fields directly, so the workaround is custom SQL in the data source: a CASE statement per parameter inside a CROSS JOIN UNNEST. The data source becomes a SQL artefact in its own right, maintained inside Looker Studio rather than in your warehouse.
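As a concrete sketch of that workaround, the custom query pasted into the data source ends up looking roughly like this. The project and dataset names are placeholders; the pattern is the one described above, one CASE per parameter inside a CROSS JOIN UNNEST:

```sql
-- One CASE per parameter, grouped back to one row per event.
-- `my-project.analytics_123456` is a placeholder for your GA4 export dataset.
SELECT
  event_timestamp,
  user_pseudo_id,
  event_name,
  MAX(CASE WHEN ep.key = 'page_location' THEN ep.value.string_value END) AS page_location,
  MAX(CASE WHEN ep.key = 'page_referrer' THEN ep.value.string_value END) AS page_referrer,
  MAX(CASE WHEN ep.key = 'ga_session_id' THEN ep.value.int_value    END) AS ga_session_id
FROM `my-project.analytics_123456.events_*`
CROSS JOIN UNNEST(event_params) AS ep
GROUP BY event_timestamp, user_pseudo_id, event_name
```

Every additional parameter means another CASE line, and the query lives inside the Looker Studio data source rather than in version-controlled SQL.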

Why this hurts dashboards specifically

Two reasons. First, every additional parameter requires an editor with permission to modify the data source, which slows down report iteration. Second, blended data sources — the usual fallback when one source is not enough — are limited in their join keys and aggregation behaviour. Teams end up rebuilding the same custom SQL across multiple reports because data sources are not easy to share between workspaces.

What changes with Decode GA4

Decode writes a flat events table into a BigQuery dataset of your choice. In Looker Studio you point the BigQuery connector at that table and the schema appears as columns: event_param.page_location, event_param.page_title, geo.country, device.category. No custom SQL in the data source. No calculated fields for standard parameters. Building a new chart is a drag-and-drop operation again.
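By contrast, a query over the decoded table is plain column access. The dataset name below is illustrative, standing in for whatever you set as destination_dataset_id:

```sql
-- Flat columns: no UNNEST, no CASE statements.
-- `my-project.decoded_ga4` stands in for your destination_dataset_id.
SELECT
  event_name,
  event_param.page_location,
  geo.country,
  device.category
FROM `my-project.decoded_ga4.events`
LIMIT 100
```

Looker Studio never needs to see this SQL; the connector reads the table schema and exposes the same columns as a field list.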

Option A

Custom SQL in the data source

Write a SELECT with one CROSS JOIN UNNEST per parameter and paste it into the BigQuery connector's custom query field. Every chart now reads from a query that runs on every refresh. Costs scale with the number of parameters in the SELECT, not the number of fields the chart actually uses.

Custom SQL hidden in the report
Option B

The official GA4 connector

Use the GA4 connector instead of BigQuery. It handles the schema, but you give up control over sampling, the ability to backfill, the ability to join other warehouse tables, and any chart that spans more than the GA4 API's quota window. You are back inside the GA4 UI's limits — the reason you exported to BigQuery in the first place.

Loses the point of the export
Option C

Maintain a flattened view in BigQuery

Pre-flatten the events_* export into a normal table, schedule it nightly, point Looker Studio at the result. Now you own a transformation pipeline whose only purpose is to reshape data for one BI tool. When GA4 adds a parameter, the view silently drops it until someone notices.
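For scale, Option C usually looks something like the scheduled statement below (all names illustrative), and its failure mode is visible in the SELECT list:

```sql
-- A nightly scheduled query that pre-flattens the export (names illustrative).
CREATE OR REPLACE TABLE `my-project.reporting.events_flat`
PARTITION BY event_day AS
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS event_day,
  event_name,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_location,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_referrer') AS page_referrer
FROM `my-project.analytics_123456.events_*`;
-- Any parameter not listed here is silently dropped -- the trap described above.
```

Someone now owns the schedule, the SELECT list, and the drift between them.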

A pipeline you did not want
Feature | Decode GA4 events table | Custom SQL in connector
Adding a new chart on a new parameter | Drag from field list | Edit data source SQL
Refresh cost per report load | Scans only fields used | Runs full UNNEST query
New GA4 parameter handling | Appears as a new column | Manual SQL edit required
Sharing data sources across reports | One BigQuery table, many reports | SQL gets copied between sources
Blended sources needed for basic joins | No — joins happen in BigQuery | Often, with awkward join keys
Maintenance over a year | Zero | Recurring, every schema shift

One install. A flat table. Looker Studio handles the rest.

  1. Subscribe via Google Cloud Marketplace

    Decode GA4 is a Marketplace listing. Usage-based pricing, no monthly minimum. The subscription takes under a minute and billing appears on your existing GCP invoice.

  2. Deploy with the events template

    Pick a BigQuery dataset for the decoded output and set destination_dataset_id. Decode writes a flat events table into it, partitioned by partition_date.

  3. Create a new Looker Studio report

    Add data → BigQuery → select your GCP project, the destination_dataset_id, and the events table. The schema appears as a flat field list with every event parameter as a direct column.

  4. Build the dashboard

    Drag partition_date onto the time axis. Drag event_name, event_param.page_location, geo.country onto dimensions. Use Record Count as the default measure. No calculated fields. No custom SQL. Charts behave the same as they would over any normal warehouse table.
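Under the hood, a chart built this way boils down to a query like the one below; the dataset name is illustrative, and Record Count is simply a COUNT(*):

```sql
-- What a daily time-series chart effectively asks BigQuery for.
SELECT
  partition_date,
  event_name,
  COUNT(*) AS record_count
FROM `my-project.decoded_ga4.events`
WHERE partition_date BETWEEN DATE '2025-01-01' AND DATE '2025-01-31'
GROUP BY partition_date, event_name
ORDER BY partition_date
```

Looker Studio generates the equivalent of this from the drag-and-drop configuration; only the referenced columns are scanned.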

Wire decoded GA4 into a Looker Studio report in four small steps. Nothing here is Looker-Studio-specific magic — it is the same BigQuery connector flow you would use for any flat warehouse table.

01

GCP

Run the Decode GA4 installer. Set destination_dataset_id to the dataset you want Looker Studio to read from.

02

GCP

Confirm the Looker Studio user (or service account) has BigQuery Data Viewer and BigQuery Job User on the project.

03

Looker Studio

Create a report. Add data → BigQuery → pick the events table. No custom SQL.

04

Looker Studio

Build charts using partition_date, event_name, event_param.page_location, Record Count.

The events table is a BigQuery external table backed by Parquet files in GCS. Looker Studio reads it through the native BigQuery connector. No data leaves your project, and Looker Studio never sees the underlying storage.
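In DDL terms, an external table of that kind has roughly the following shape. The bucket path and dataset name here are purely illustrative, not the actual definition Decode creates:

```sql
-- Illustrative shape of an external table over Parquet files in GCS.
CREATE EXTERNAL TABLE `my-project.decoded_ga4.events`
OPTIONS (
  format = 'PARQUET',
  uris = ['gs://my-decode-bucket/events/*.parquet']
);
```

From Looker Studio's side it is indistinguishable from a native table: same connector, same field list.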

01

A flat field list, not nested records

The connector exposes every event parameter as a direct column. event_param.page_location, geo.country, device.category — drag them onto a chart, no editing required.

02

No custom SQL in the data source

The data source is a plain BigQuery table reference. Editors can add fields without writing SQL, and you can audit what the report depends on at a glance.

03

partition_date as the temporal axis

Every event row has partition_date. Time-series charts use it as the x-axis directly, and date range controls work without intermediate calculated fields.

04

New parameters appear automatically

When GA4 starts firing a new event parameter, the next decode run picks it up. Refresh the data source schema in Looker Studio and the field is available.

05

Cheaper report refreshes

Looker Studio scans only the fields a chart actually uses. There is no UNNEST query running underneath that pulls everything regardless of what the chart needs.

06

One source, many reports

The events table is a normal BigQuery table. Point as many reports at it as you like. There is no SQL to copy between data sources, and no version drift.

01

Marketing channel dashboards

Source-medium, campaign and landing-page breakdowns built from direct columns. The traffic-source dashboard that used to require three calculated fields and a blended source becomes a single chart.
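A source/medium breakdown, for example, becomes a single plain query. The dataset name is illustrative, and this sketch assumes the traffic_source struct follows the GA4 export naming:

```sql
-- Sessions by source / medium, assuming traffic_source survives as a struct column.
SELECT
  traffic_source.source,
  traffic_source.medium,
  COUNT(*) AS sessions
FROM `my-project.decoded_ga4.events`
WHERE event_name = 'session_start'
GROUP BY traffic_source.source, traffic_source.medium
ORDER BY sessions DESC
```

In Looker Studio the same breakdown is the drag-and-drop equivalent: two dimensions and Record Count.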

02

Page performance reports

Top pages by event count, by user count, by engagement time. event_param.page_location and event_param.page_title are direct dimensions, so the report does not need to maintain a derived "page" field.

03

Custom event tracking

When the product team adds a new event parameter — feature flag, plan tier, A/B variant — it shows up as a column in the next decode run. Adding it to a Looker Studio chart takes seconds, not a SQL edit.

Do I need a paid Looker Studio plan?

No. Looker Studio is free for individuals and teams. Looker Studio Pro adds enterprise features like team workspaces and SLAs but is not required to use the BigQuery connector. See setup →

Will my existing reports still work?

Yes. Decode GA4 writes to a new BigQuery table — it does not touch the original events_* export. Existing reports keep reading from where they always did. You can migrate one report at a time.

What permissions does the report viewer need?

Either the report uses owner credentials — viewers see data without any BigQuery permissions — or it uses viewer credentials, in which case each viewer needs BigQuery Data Viewer on the project and BigQuery Job User. Full prerequisites →

Can I still use blended data sources?

Yes, blended sources still work. You will need them less often, because the events table already includes most of the dimensions that previously required blending against a second source.

Deploy in under 5 minutes

Looker Studio over GA4,
without the calculated fields.

Subscribe via Google Cloud Marketplace, point destination_dataset_id at the dataset Looker Studio reads from, and have a flat field list available in your reports the same day.

Get Started on Marketplace → Read the documentation

Google Cloud Marketplace · Usage-based · No monthly minimum