Decoded GA4 in Evidence —
SQL-in-Markdown reports
without the UNNEST mess.

Evidence is an open-source BI framework that builds reports out of SQL queries written inside Markdown files. Self-updating, version-controlled, deployable as a static site. Decoded GA4 events make the SQL inside the Markdown short enough to actually read.

Connection: BigQuery · Format: SQL-in-Markdown · Runtime: Node.js 20+ · Deploys as: static site

Evidence puts SQL into a Markdown file and renders the result as a chart. The framework is only as good as the SQL you have to write inside it. Decoded GA4 keeps that SQL short.

What Evidence is for

Evidence is for writing reports that look like documents. The author writes a Markdown file with a few SQL blocks, drops in a chart component below each block, commits the file to Git, and the report builds and deploys. Reviewers see the prose and the charts together. Analysts see the SQL inline. There is no separate dashboarding tool, no point-and-click layer, no extract step.

Why nested GA4 makes Evidence reports painful

Because the SQL is in the Markdown file, anyone reading the report sees it. A query that goes "select date, count(*) from events where event_name = 'page_view'" reads naturally next to the chart. A query that needs three CROSS JOIN UNNEST clauses to get the same answer does not. The whole point of Evidence — readable, version-controlled SQL — falls apart when the schema underneath is hostile.
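For context, here is roughly what that pageview query looks like against the raw export (a sketch against the standard GA4 `events_*` export schema; the project, dataset, and date range are illustrative). One parameter costs one UNNEST; each additional parameter adds another.

```sql
-- Pageviews by day and page against the raw GA4 export:
-- the flattening dominates the query before the question appears.
select
  parse_date('%Y%m%d', e.event_date) as date,
  ep.value.string_value as page_location,
  count(*) as pageviews
from `my-project.analytics_123456.events_*` e   -- illustrative table name
cross join unnest(e.event_params) as ep
where e.event_name = 'page_view'
  and ep.key = 'page_location'
group by 1, 2
```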

What changes with Decode GA4

Decode flattens the events table inside BigQuery. The SQL blocks in your Markdown read as plain GROUP BY queries against direct columns. The reports stay readable. The Git diffs stay diff-able. When GA4 adds a new parameter, the next decode run picks it up and your existing SQL keeps working.
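As a sketch, the same pageview question against a decoded table reduces to a direct GROUP BY (the dataset name is illustrative; the column names follow the examples later on this page and may differ in your setup):

```sql
-- Same question, decoded table: the query is the question.
select
  partition_date,
  event_param.page_location,
  count(*) as pageviews
from decoded_ga4.events   -- illustrative dataset and table name
where event_name = 'page_view'
group by 1, 2
```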

Option A

Write UNNEST queries in every Markdown file

Each Evidence report ends up with its own copy of the UNNEST scaffolding before the actual analytical SQL begins. Reviewers spend more time parsing the flatten than reading the question. Updates ripple across every file when GA4 changes.

Reports become unreadable
Option B

Maintain a flattening view in BigQuery

Build a SQL view that flattens GA4 once, point Evidence at the view. Tidy in principle. In practice, the view becomes the most important undocumented piece of infrastructure in the project, and it breaks every time the schema shifts.

A hidden dependency to maintain
Option C

Run dbt before Evidence

Stand up a dbt project to flatten and stage GA4, then have Evidence read the dbt outputs. Two tools to operate, two release cadences, two places where the schema can drift. Workable, but heavy for what should be a simple report.

Two tools doing one job
| Feature | Decode GA4 source | Raw GA4 export |
| --- | --- | --- |
| SQL inside Markdown | 3-line GROUP BY | UNNEST scaffolding everywhere |
| Report readability | Question, then chart | Flatten, then question, then chart |
| Git diffs on schema change | No code changes | Update every report file |
| Build cost in BigQuery | Partition-pruned scans | UNNEST inflation per build |
| Onboarding a new analyst | Knows BigQuery already | Must learn UNNEST first |
| Maintenance over a year | Zero | Recurring SQL updates |

Install Evidence, point it at BigQuery, write a Markdown file. The decoded events table makes the SQL inside it readable.

  1. Subscribe via Google Cloud Marketplace

    Decode GA4 ships as a Marketplace listing. Usage-based pricing, no monthly minimum, billed through your existing GCP invoice. Subscription takes about a minute.

  2. Initialise an Evidence project

    npm create evidence@latest, cd into the new project, run npm install. You will need Node.js 20 or newer. The default project boots with sample SQL, ready to swap for real queries.
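The commands from that step, collected in one place (the project directory name is whatever you chose during create; shown here as a placeholder):

```
npm create evidence@latest   # scaffold a new Evidence project (needs Node.js 20+)
cd my-evidence-project       # illustrative: the directory chosen during create
npm install                  # install dependencies
npm run dev                  # boot the dev server with the sample pages
```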

  3. Configure BigQuery in Settings → Data Sources

    Enter the GCP project ID and paste a service account JSON with BigQuery Data Viewer plus Job User. Or commit the same config to evidence.config.yaml — Evidence reads either path.
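If you go the committed-config route, a BigQuery source entry looks roughly like this. The key names below are indicative only; check the Evidence BigQuery source documentation for the exact schema your version expects, and never commit the JSON key itself.

```yaml
# Illustrative shape only -- consult Evidence's BigQuery source docs
# for the exact keys in your version.
sources:
  - name: bigquery
    type: bigquery
    options:
      project_id: my-gcp-project              # illustrative project ID
      # Keep the service account key out of Git; inject it at build time.
      keyfile: ${GOOGLE_APPLICATION_CREDENTIALS}
```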

  4. Write a Markdown file with a SQL block

    Inside pages/index.md, add a SQL block that selects partition_date, event_param.page_location, count(*) as pageviews from the decoded events table. Drop a LineChart component below it. Run npm run dev and the report renders.
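A minimal sketch of that `pages/index.md`, assuming a decoded table named `decoded_ga4.events` (illustrative) and Evidence's named-query fences and chart components:

````markdown
# Pageviews

```sql pageviews
select
  partition_date,
  event_param.page_location,
  count(*) as pageviews
from decoded_ga4.events
group by 1, 2
```

<LineChart data={pageviews} x=partition_date y=pageviews series=page_location />
````

The whole report fits on one screen: the question, the SQL, and the chart, in that order.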

Wire decoded GA4 into Evidence in four steps. Standard Evidence flow — only the SQL inside your Markdown is shorter than it would otherwise be.

01 · GCP: Run the Decode GA4 installer with the events_external template. Pick the dataset Evidence will read from.

02 · GCP: Create a service account with BigQuery Data Viewer and Job User. Download the JSON key.

03 · Evidence: Add a BigQuery source in Settings → Data Sources or commit the config to evidence.config.yaml.

04 · Evidence: Write a SQL block in a Markdown file. Drop a chart component below it. npm run dev, then ship.

Evidence reads the decoded events table through a service account in your project. The data stays in BigQuery — Evidence renders it at build time. The output is a static site you can host anywhere.

01 · SQL inside Markdown that is actually readable

The query above the chart is a normal GROUP BY against direct columns. Reviewers can read it. Future-you can read it in six months. The framework's value proposition lands.

02 · Self-updating reports

Each build re-runs the SQL against the latest decoded events. New data appears in the next deploy without anyone touching the Markdown.

03 · Reports as code, in Git

Reports live in a repository. Pull requests review SQL and prose together. CI builds the static site on merge. Standard software engineering practice for analytics output.

04 · Static-site economics

The output of Evidence is a folder of HTML and JavaScript. Host it on Cloudflare Pages, Netlify, S3 — anywhere. Viewers do not hit BigQuery directly; only the build does.

05 · Schema evolution that does not break reports

New GA4 parameters become new columns in the decoded events table on the next decode run. Existing reports keep working — they just have one more column they could optionally use.

06 · Same source for the rest of your stack

The decoded events table is also the source for dbt, Looker Studio, Steep, Rill. Evidence is one consumer of many on the same upstream.

01 · Weekly investor or board reports

One Markdown file per week with the SQL, the prose, and the charts together. Built and deployed by CI on Monday morning. The same file is the source of truth and the report — there is no spreadsheet to copy numbers into.

02 · Marketing campaign post-mortems

A campaign report sits in a folder of the repo with its own SQL queries against decoded events — sources, mediums, landing pages, conversions. Six months later the analysis is still reproducible because the queries are still committed.
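One section of such a post-mortem might carry a query like the following (dataset, column, and date values are illustrative; `traffic_source` and `ga_session_id` follow the standard GA4 export fields, assumed here to be carried through into the decoded table):

```sql
-- Sessions by source/medium and landing page for one campaign month.
select
  traffic_source.source,
  traffic_source.medium,
  event_param.page_location as landing_page,
  count(distinct event_param.ga_session_id) as sessions
from decoded_ga4.events        -- illustrative dataset and table name
where event_name = 'session_start'
  and partition_date between '2024-03-01' and '2024-03-31'
group by 1, 2, 3
```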

03 · Internal product KPI documents

Product teams maintain a Markdown file per feature, with the activation, retention, and engagement SQL inline. Anyone can read the file to see exactly what the metric is. The chart updates as the data does.

Do I need Node.js installed locally to use Evidence?

For local dev, yes — Node.js 20 or newer. For production builds, anywhere Node runs (CI, Cloud Run, a build server). If you would rather not deal with Node, look at Evidence Studio, which hosts the runtime for you. See Evidence Studio →

What permissions does the service account need?

BigQuery Data Viewer and BigQuery Job User on the project containing the decoded events table. No write access. Full prerequisites →
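Granting those two roles can be done with standard gcloud IAM bindings. The project ID and service account name below are placeholders; the role IDs are the standard BigQuery ones.

```
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:evidence-reader@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"

gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:evidence-reader@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"
```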

Can I version-control the BigQuery credentials?

You can version-control the source config — project ID, dataset name, query metadata — but not the service account JSON key. Keep the key in your CI secret store and inject it at build time, the same way you would for any other deploy.

What happens to the report when GA4 adds a new event parameter?

Nothing. Existing SQL keeps running because it references the columns it always referenced. The new parameter shows up as a new column in the decoded events table, ready to use the next time you write a report that needs it.

Deploy in under 5 minutes

Reports as code,
without the UNNEST inside.

Subscribe via Google Cloud Marketplace, point destination_dataset_id at the dataset Evidence reads, and write your first SQL-in-Markdown report against decoded events before lunch.

Get Started on Marketplace → Read the documentation

Google Cloud Marketplace · Usage-based · No monthly minimum