Tableau over a flat GA4 table —
real dimensions and measures,
no custom SQL data sources.

Tableau is at its best when columns map cleanly to dimensions and measures. The raw GA4 export does not. To use it well in Tableau, most teams either write a Custom SQL data source that wraps an UNNEST query, or build calculated fields per parameter on top of nested records. Both work. Both decay. Decode GA4 collapses both into a normal data source — the events table, connected through Tableau's native BigQuery driver.

Connector: Google BigQuery (native) · Clients: Desktop and Cloud · Auth: OAuth or service account · Maintenance: zero custom SQL

Tableau lives on flat columns. The raw GA4 export is the wrong shape for it.

The shape of the problem

The Tableau BigQuery connector returns the events_* schema as it stands. event_params is a repeated record, which Tableau cannot pivot into dimensions on its own. The standard workaround is a Custom SQL data source — an UNNEST query pasted into the connection that flattens parameters before Tableau sees them. The query is fast in development, slower at scale, and very awkward to inherit.
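To make the shape concrete, here is a minimal Python sketch (the sample event is invented) of what the flattening step accomplishes: GA4 stores each parameter as a key plus a typed value record, and the workaround — whether an UNNEST query or Decode's table — turns those into one column per key.

```python
# Minimal sketch with a hypothetical sample event: how one GA4 event's
# repeated event_params record flattens into plain columns. This mirrors
# what an UNNEST query, or Decode's events table, produces before
# Tableau ever sees the data.

def flatten_event(event: dict) -> dict:
    """Turn the repeated event_params record into event_param.<key> columns."""
    flat = {k: v for k, v in event.items() if k != "event_params"}
    for param in event.get("event_params", []):
        # GA4 fills exactly one typed slot per param; pick whichever is set.
        value = next(v for v in param["value"].values() if v is not None)
        flat[f"event_param.{param['key']}"] = value
    return flat

raw = {
    "event_name": "page_view",
    "event_params": [
        {"key": "page_location",
         "value": {"string_value": "https://example.com/", "int_value": None}},
        {"key": "ga_session_id",
         "value": {"string_value": None, "int_value": 12345}},
    ],
}

flat = flatten_event(raw)
print(flat["event_param.page_location"])  # https://example.com/
print(flat["event_param.ga_session_id"])  # 12345
```

The flat dict is what Tableau needs: one scalar column per parameter, directly usable as a dimension.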

Why this hurts Tableau workbooks specifically

Two reasons. First, Custom SQL data sources hide the actual schema. Anyone opening the workbook for the first time has to read the SQL to know what they are looking at. Second, Tableau aggressively re-runs the source query for new aggregations and filters. With UNNEST in the source, every quick filter change pays the cost of the unnesting. The dashboard feels slow for reasons that are not visible in the workbook.

What changes with Decode GA4

Decode writes a flat events table into a BigQuery dataset of your choice. In Tableau you connect to BigQuery, drag the events table onto the canvas, and the schema appears as a normal flat table. event_param.page_location, event_param.page_title, geo.country and partition_date are real columns. No Custom SQL. Filters and quick filters operate against partitioned columns and run quickly.

Option A: Custom SQL data source with UNNEST

Paste a SELECT with one CROSS JOIN UNNEST per parameter into the BigQuery connection. Every aggregation, filter, and viz in the workbook ultimately runs that query. The source is opaque to anyone who did not write it.

Custom SQL hidden in the connection
Option B: Calculated fields on event_params

Connect to events_* directly and define one calculated field per parameter — RAWSQL-style or via Tableau's record functions. The workbook accumulates dozens of calculated fields whose only job is to surface a single key. Performance is acceptable on a sample and degrades on real volume.

Calculated-field sprawl
Option C: Nightly extract of a flattened view

Build a BigQuery view that flattens GA4, point a Tableau extract at it, refresh nightly. The view becomes a piece of warehouse infrastructure whose only consumer is one Tableau workbook. When GA4 adds a parameter, the view silently drops it.

A pipeline you did not want
| Feature | Decode GA4 events table | Custom SQL data source |
|---|---|---|
| What appears in the data pane | Flat columns, real schema | Output of an UNNEST query |
| Quick filter responsiveness | Partition pruning works | UNNEST runs per filter change |
| New GA4 parameter handling | Appears as a new column | Manual SQL update required |
| Calculated fields needed for basic params | None | One per parameter |
| Workbook readability for new analysts | Connect, drag, build | Read the SQL first |
| Maintenance over a year | Zero | Recurring, every schema shift |

One install. A flat table. Tableau handles the rest.

  1. Subscribe via Google Cloud Marketplace

    Decode GA4 is a Marketplace listing. Usage-based pricing, no monthly minimum. The subscription takes under a minute and billing appears on your existing GCP invoice.

  2. Connect Tableau to BigQuery

    Tableau → Connect → Google BigQuery. Authenticate with OAuth, or use a service account JSON for unattended refresh. Set Billing Project and Project to your GCP project, and Dataset to your destination_dataset_id.

  3. Drag the events table onto the canvas

    The events table appears in the dataset list. Drag it onto the canvas. Tableau reads the schema and lists every parameter as a column. There is no Custom SQL step.

  4. Build the workbook and publish

    Use partition_date as the date filter — recommended on every sheet to avoid full table scans. Drag event_name, event_param.page_location, geo.country onto rows and columns. Publish to Tableau Cloud (Server → Publish Workbook) and embed BigQuery credentials or OAuth so viewers can refresh without re-authenticating.

Wire decoded GA4 into a Tableau workbook in four small steps. The same flow works for Tableau Desktop and Tableau Cloud — the only difference is where the workbook lives.

01 · GCP: Run the Decode GA4 installer. Set destination_dataset_id to the dataset Tableau will read from.

02 · GCP: Create a service account with BigQuery Data Viewer and BigQuery Job User, plus Storage Object Viewer on the Decode GCS bucket. Download the JSON key for unattended refresh.

03 · Tableau: Connect → Google BigQuery → service account JSON. Pick the dataset and drag the events table onto the canvas.

04 · Tableau Cloud: Publish the workbook. Embed credentials so refresh runs without manual login.

The events table is a BigQuery external table backed by Parquet files in GCS. Tableau queries it through Google's native BigQuery driver. Data stays in your GCP project; Tableau holds query results, not the underlying storage.

01 · A normal flat data source

The events table appears in the data pane as a flat table. No Custom SQL. No nested record icons. Anyone opening the workbook can see the schema at a glance.

02 · Direct dimensions and measures

page_location, page_referrer, page_title, ga_session_id, geo.country, device.category — every standard parameter is a real column that maps to a Tableau dimension without a calculated field.

03 · partition_date as a real date

partition_date is a date column on every row. Filter on it from sheet level — Tableau pushes the predicate down to BigQuery and partition pruning kicks in.
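A toy Python model (data invented) of why that filter is cheap: a date-partitioned table is stored as one chunk per day, and a predicate on the partition column means non-matching chunks are never read at all.

```python
# Toy model of partition pruning with invented data: BigQuery stores a
# date-partitioned table as one chunk per partition_date, and a date
# predicate means only the matching chunks are scanned.

partitions = {  # partition_date -> rows stored in that day's partition
    "2024-01-01": [{"event_name": "page_view"}] * 3,
    "2024-01-02": [{"event_name": "page_view"}] * 5,
    "2024-01-03": [{"event_name": "purchase"}] * 2,
}

def scan(partitions, wanted_dates):
    """Read only partitions matching the date predicate; skip the rest."""
    rows, scanned = [], 0
    for date, chunk in partitions.items():
        if date not in wanted_dates:
            continue  # pruned: this partition is never touched
        scanned += len(chunk)
        rows.extend(chunk)
    return rows, scanned

rows, scanned = scan(partitions, {"2024-01-02"})
print(scanned)  # 5 rows scanned out of 10 stored
```

Without the partition_date predicate, every filter interaction would scan all ten rows; with it, the work is proportional to the dates requested.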

04 · Faster quick filters

Quick filter changes hit the partitioned events table directly. The UNNEST query that used to run for every filter interaction is gone, so the dashboard responds to user input in seconds rather than tens of seconds.

05 · Schema evolution that just works

When GA4 adds a new event parameter, the next decode run picks it up. Refresh the data source schema in Tableau and the field is available to drag onto the view.

06 · Same setup for Desktop and Cloud

Tableau Desktop, Tableau Server, Tableau Cloud — the connection, the schema and the dimensions are identical. Authoring on Desktop and publishing to Cloud is a one-click operation.

01 · Marketing performance dashboards

Source-medium, campaign and landing-page breakdowns built from direct columns. The traffic-source dashboard that used to require a Custom SQL data source becomes a plain table connection.

02 · Funnel and conversion analysis

Standard ecommerce funnels — view_item, add_to_cart, begin_checkout, purchase — become calculated measures over event_name. The funnel sheet is a series of COUNTD calculations on session ID, not a Custom SQL artefact.
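A rough Python sketch of the arithmetic that funnel sheet performs (the step list and sample rows are invented): COUNTD on session ID per funnel step is just a distinct-count of sessions that fired each event_name.

```python
# Sketch of the funnel arithmetic with hypothetical sample rows: Tableau's
# COUNTD(session id) per funnel step is a distinct-count of the sessions
# that fired each event_name in the flat events table.

FUNNEL = ["view_item", "add_to_cart", "begin_checkout", "purchase"]

rows = [  # (event_name, ga_session_id) pairs
    ("view_item", 1), ("view_item", 2), ("view_item", 3),
    ("add_to_cart", 1), ("add_to_cart", 2),
    ("begin_checkout", 1),
    ("purchase", 1),
]

def funnel_counts(rows):
    """Distinct sessions per funnel step, in funnel order."""
    sessions = {step: set() for step in FUNNEL}
    for event, session_id in rows:
        if event in sessions:
            sessions[event].add(session_id)
    return {step: len(ids) for step, ids in sessions.items()}

print(funnel_counts(rows))
# {'view_item': 3, 'add_to_cart': 2, 'begin_checkout': 1, 'purchase': 1}
```

Because event_name and the session ID are plain columns, each step is a simple set membership — no per-step UNNEST or Custom SQL artefact.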

03 · Embedded analytics in customer-facing apps

Publish to Tableau Cloud, embed views via the Embedding API, and drive customer-facing reporting from the decoded events table. Filtering by tenant ID is a real column filter, not an UNNEST predicate.

Does this work with Tableau Desktop and Tableau Cloud?

Both. The integration is at the warehouse layer — the BigQuery driver sees the decoded events table the same way regardless of which Tableau client is connecting. Workbooks authored in Desktop publish to Cloud without modification. See setup →

Should I use Live or Extract connections?

For most GA4 dashboards, a live connection over the partitioned events table is the right default — partition pruning keeps queries cheap, and viewers always see fresh data. Extracts make sense if you have a defined date range and want to insulate dashboards from BigQuery slot pressure.

What permissions does the connection need?

The standard BigQuery Data Viewer and BigQuery Job User roles, plus Storage Object Viewer on the Decode GA4 GCS bucket — required because the events table is an external table backed by Parquet files. Full prerequisites →

Will my existing calculated fields still work?

Mostly, yes. Calculated fields that wrap UNNEST or RAWSQL to extract a parameter become unnecessary — replace them with the direct column reference. Calculated fields that compute genuine business logic, like session windows or conversion flags, port across as-is.

Deploy in under 5 minutes

Tableau on GA4,
without the Custom SQL data source.

Subscribe via Google Cloud Marketplace, point destination_dataset_id at the dataset Tableau reads from, and have a flat events workbook ready to publish before the end of the day.

Get Started on Marketplace → Read the documentation

Google Cloud Marketplace · Usage-based · No monthly minimum