Integrations
Connect to your entire data stack.
Decode GA4 transforms the raw BigQuery export into clean, flat event tables. Every downstream tool works without modification: the query tools, warehouses, and BI platforms your team already uses.
Cloud Storage
Amazon S3
Export flat, ZSTD-compressed GA4 event data to S3. Query with Athena, load into Redshift, or feed SageMaker — without custom pipelines or ongoing maintenance.
Azure Blob Storage
Send decoded GA4 event data to Azure Blob Storage for use with Synapse Analytics, Microsoft Fabric, or Power BI Direct Lake.
Google Cloud Storage
The default export destination. Parquet files in GCS, queryable via BigQuery external tables. Lowest egress cost — data stays within Google Cloud.
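As a sketch of that pattern, the Parquet files can be exposed to BigQuery with external-table DDL; the project, dataset, and bucket path below are placeholders, not Decode GA4 defaults:

```sql
-- Placeholder project, dataset, and bucket names; point the URI at your export path.
CREATE OR REPLACE EXTERNAL TABLE `my-project.analytics.ga4_events_ext`
OPTIONS (
  format = 'PARQUET',
  uris = ['gs://my-decode-ga4-bucket/events/*.parquet']
);
```

Queries against the external table scan the Parquet files in place, so no load job or duplicate storage is needed.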
Warehouse
Transformation
dbt
Use decoded GA4 tables as source models in your dbt project. Flat columns from day one — no staging UNNEST logic in your transforms.
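A minimal sketch of such a source definition, assuming hypothetical project and dataset names for the decoded tables:

```yaml
# models/staging/ga4/sources.yml -- project and dataset names are placeholders
version: 2

sources:
  - name: decode_ga4
    database: my-gcp-project      # BigQuery project holding the decoded tables
    schema: analytics_decoded     # dataset written by Decode GA4 (assumed name)
    tables:
      - name: events
```

Models can then select from `{{ source('decode_ga4', 'events') }}` with no UNNEST staging layer in between.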
Dataform
Reference decoded GA4 tables in your Dataform project. Every event parameter available as a column — no UNNEST expressions in your SQLX files.
SQLMesh
Drop decoded GA4 tables into SQLMesh as external models. Schema evolution handled upstream — your downstream models stay stable when GA4 changes.
Bruin
Reference decoded GA4 tables as sources in a Bruin pipeline. AI-assisted pipeline authoring on top of pre-flattened GA4 — skip the UNNEST staging entirely.
Orchestrate
Airflow
Trigger the Decode GA4 procedure from an Airflow DAG. Runs on whatever cadence your existing data jobs run on — no separate scheduler to maintain.
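One way to sketch this, assuming Airflow 2.4+ with the Google provider installed; the project, dataset, and procedure names are hypothetical:

```python
# A minimal sketch; `my-project.decode_ga4.run_decode` is a placeholder name.
import pendulum
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="decode_ga4_daily",
    schedule="0 6 * * *",  # run after the GA4 daily export usually lands
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    run_decode = BigQueryInsertJobOperator(
        task_id="run_decode",
        configuration={
            "query": {
                "query": "CALL `my-project.decode_ga4.run_decode`();",
                "useLegacySql": False,
            }
        },
    )
```

Because the decode step is an ordinary BigQuery job, it can sit downstream of whatever sensors or ingestion tasks your DAG already has.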
Dagster
Model the Decode GA4 run as a Dagster asset. Lineage, sensors, and materialisation policies wrap the decode step like any other asset in your graph.
Kestra
Run Decode GA4 as a BigQuery task inside a Kestra flow. YAML-defined, declarative, composable with every other Kestra plugin.
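A sketch of such a flow, assuming Kestra's GCP plugin is available; the flow id, namespace, procedure name, and schedule are all placeholders:

```yaml
# A hedged sketch of a Kestra flow; adjust names to your project.
id: decode-ga4-daily
namespace: analytics

tasks:
  - id: run-decode
    type: io.kestra.plugin.gcp.bigquery.Query
    sql: CALL `my-project.decode_ga4.run_decode`();

triggers:
  - id: daily
    type: io.kestra.plugin.core.trigger.Schedule
    cron: "0 6 * * *"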
Orchestra
Schedule and monitor the Decode GA4 procedure inside Orchestra. Cross-cloud observability for the full path from raw GA4 to decoded tables.
GitHub Actions
Trigger the Decode GA4 procedure from a GitHub Actions workflow. CI-style cron plus the BigQuery CLI — no standalone orchestrator required.
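A workflow along these lines might look like the following; the secret name, project, and procedure are placeholders, not values Decode GA4 prescribes:

```yaml
# .github/workflows/decode-ga4.yml -- names below are placeholders
name: decode-ga4
on:
  schedule:
    - cron: "0 6 * * *"   # daily at 06:00 UTC
  workflow_dispatch: {}

jobs:
  decode:
    runs-on: ubuntu-latest
    steps:
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}
      - uses: google-github-actions/setup-gcloud@v2
      - name: Run the decode procedure
        run: |
          bq query --use_legacy_sql=false \
            'CALL `my-project.decode_ga4.run_decode`();'
```

The `setup-gcloud` step installs the Cloud SDK, so the `bq` CLI is available without any further setup on the runner.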
Semantic
Cube
Define GA4 metrics in Cube against decoded flat tables. No UNNEST logic in your cube model — event parameters are already direct columns.
MetricFlow
Model GA4 metrics in MetricFlow directly against decoded BigQuery tables. Clean column structure means leaner YAML and less semantic-layer glue.
Steep Metrics
Use Steep Metrics as the semantic layer over decoded GA4 data. Shared metric definitions feed every downstream consumer.
BI
Looker Studio
Connect Looker Studio directly to decoded GA4 tables. Build dashboards without calculated field workarounds or UNNEST-based blended data sources.
Looker
Define GA4 event dimensions in LookML against clean flat tables. No derived table workarounds for standard event parameters.
Power BI
Connect Power BI via the BigQuery connector or export to Azure Blob for Direct Lake. Clean event columns — no complex M query transformations.
Tableau
Connect Tableau to decoded GA4 tables. Flat columns map directly to dimensions and measures — no custom SQL required in the data source.
Rill
Point Rill at decoded GA4 in BigQuery or directly at the GCS Parquet export. DuckDB-backed dashboards stay fast at scale, and the flat column layout maps directly to Rill measures.
Steep
Explore decoded GA4 in Steep. Native BigQuery connection over flat tables — the decoded structure keeps answers fast and predictable.
Evidence
Build self-updating reports in Evidence using SQL inside Markdown. The decoded events table connects natively via the BigQuery adapter.
Evidence Studio
The managed cloud version of Evidence. Same BigQuery connector, same Markdown SQL, no local installation — auto-publishes on every save or push.
Lightdash
Lightdash reads your dbt models directly. The natural next step for self-serve analytics on top of decoded GA4 data already modelled in dbt.
Analytics
DuckDB
Query GA4 Parquet exports locally with DuckDB. Sub-second queries on years of event history. No warehouse required, no cloud compute, no minimum scan charges.
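For example, a query of this shape works against a locally synced copy of the Parquet export; the glob path is a placeholder, and the column names follow the standard GA4 export fields:

```sql
-- Run in the DuckDB CLI or any DuckDB client; the file path is a placeholder.
SELECT
  event_date,
  event_name,
  count(*) AS events
FROM read_parquet('ga4_export/events_*.parquet')
GROUP BY ALL
ORDER BY event_date, event_name;
```

`read_parquet` with a glob reads every matching file in one scan, which is what keeps multi-year histories queryable on a laptop.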
MotherDuck
Cloud DuckDB that queries decoded GA4 Parquet directly from GCS. Persistent secrets, shared workspaces, and the same flat columns DuckDB exposes locally.
Deploy in under 5 minutes
One deployment.
Every integration.
Subscribe via Google Cloud Marketplace, configure your destination, and have clean decoded GA4 data flowing to your stack before the end of the day.
Google Cloud Marketplace · Usage-based · No monthly minimum