The simplest way to transform your data
Our decoder transforms and enhances incoming GA4 data, preparing it for fast, efficient analysis. It integrates with your existing BigQuery setup and can be configured and deployed in minutes.
Works seamlessly with the storage, transformation, & BI tools you are already using
Works with your current stack











Working with GA4 Data is Challenging
To unlock your data for analysis, visualization, augmentation, and automation, you first need to restructure it. That restructuring brings many complications.
Nested data structures make SQL queries difficult and unnesting cumbersome
Hand-crafted queries require painful, verbose transformations that complicate downstream processes
Constant maintenance as bespoke solutions need updates and tweaking when data changes
Decision paralysis - Most companies struggle to get their data into the right form
Time-consuming development where teams spend weeks building custom solutions instead of analyzing data
Inefficient data processing - The full, raw dataset is reprocessed after every configuration change, incurring unnecessary cost
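To see why hand-crafted queries become painful, here is an illustrative Python sketch that generates the per-parameter subqueries a flat GA4 view typically needs. GA4 stores event parameters as a repeated record with typed value fields, so every parameter you want as a flat column needs its own `UNNEST` subquery. `param_sql` and `flatten_query` are hypothetical helpers written for illustration, not part of any product.

```python
def param_sql(name: str, value_field: str = "string_value") -> str:
    """Build the subquery that extracts one GA4 event parameter as a column."""
    return (
        f"(SELECT value.{value_field} FROM UNNEST(event_params) "
        f"WHERE key = '{name}') AS {name}"
    )

def flatten_query(table: str, params: dict) -> str:
    """Assemble a SELECT that flattens each listed parameter into a column."""
    cols = ",\n  ".join(param_sql(n, f) for n, f in params.items())
    return f"SELECT\n  event_name,\n  {cols}\nFROM `{table}`"

# Every new parameter means another hand-written subquery like these:
print(flatten_query(
    "my-project.analytics_123.events_*",
    {"page_location": "string_value", "ga_session_id": "int_value"},
))
```

Multiply this by dozens of parameters and properties, and the maintenance burden of hand-crafted transformations becomes clear.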

How Decode Data Helps
Built for data professionals who need reliable, efficient analytics. Our decoder transforms your data in one step, restructuring it for optimal analysis. This removes complexity and speeds up your workflow.
Automate Everything
Extract deeply nested data automatically, without manual intervention. Say goodbye to verbose SQL queries and schema management headaches.
Schema Evolution
Automatically adapt to source schema changes. Never worry about data pipeline breaks when GA4 introduces new parameters or structures.
Native BigQuery
Built entirely within Google BigQuery ecosystem. No external dependencies, seamless integration with your existing data workflows.
Cross-Cloud Ready
Export to Google Cloud Storage, AWS S3, or Azure Blob Storage for robust cross-cloud workflows and data lake architectures. Your data, wherever you need it.
Cost Optimization
Reduce storage costs with intelligent compression and partitioning. Pay less for BigQuery storage while maintaining query performance.
Zero Maintenance
Deploy once, run forever. Metadata-driven incremental processing ensures each date partition is processed exactly once with minimal overhead.
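The metadata-driven incremental model described above can be sketched roughly as follows, assuming a simple design in which the metadata is a record of which date partitions have already been transformed; the function names are illustrative, not the product's actual API.

```python
# A minimal sketch of metadata-driven incremental processing: a set of
# already-processed date partitions acts as the metadata, so each partition
# is transformed exactly once and a rerun only picks up new dates.
from datetime import date

def pending_partitions(available: set, processed: set) -> list:
    """Partitions present in the source that have not been transformed yet."""
    return sorted(available - processed)

def run_incremental(available: set, processed: set) -> list:
    """Transform only the pending partitions and record each one as done."""
    done = []
    for day in pending_partitions(available, processed):
        # transform_partition(day) would run here in a real pipeline
        processed.add(day)
        done.append(day)
    return done
```

Because only the difference between available and processed partitions is touched, a configuration change or daily rerun never reprocesses the full raw dataset.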
Enhance your Data Further
- Locally timestamped events - Supports actual time-of-day activity analysis
- Geolocated events - Supports map-based data exploration and reporting
- Automatic parameter propagation - Seamlessly include new parameters and properties without configuration changes
- Single source of truth - Build multiple external tables on a single, consistent data source, eliminating data duplication
- Automatic schema evolution - Seamlessly include new fields and sub-fields without any configuration changes
- Built-in notifications - Add Slack notifications to inform users of any changes to column or parameter schemas
- Process extensibility - Trigger any downstream process or custom notification with a PubSub message upon new data
- Logic extensibility - Configuration-based extension of transformation logic and downstream resource creation









Quick Setup
- Easy install with a single command
- No additional transformation needed
- Works with your current stack
Enable GA4 Exports
- Flattens and pre-models the data
- Removes long SQL queries
- Makes analysis faster
Speed Up Data Operations
- Automates transformation of raw GA4 data
- Converts data to zipped Parquet files
- Creates BigQuery external tables
Optimize Storage & Compute
- Stores transformed data in GCS
- Reduces storage costs
- Supports incremental updates
Boost Analytics Flexibility
- Profiles GA4 structures
- Supports local time
- Integrates with dbt, Dataform, SQLMesh, and many more
Future-Proof & Easy Maintenance
- Adapts to schema changes
- Auto-generates transformation functions
- Simple to extend
Cross-Cloud Flexibility
- Export to S3 or Azure
- No vendor lock-in
- Same structure across platforms
Reliable and Scalable
- Incremental logic for updates
- No full reprocessing
- Scalable architecture
Integrates with Your Data Stack
- Outputs clean Parquet files
- Plug into existing pipelines
- Ready for ML and analytics
FAQ
Once I’ve decoded my data, what can I do next?
You can analyze it, transform it, or send it somewhere else.
Analyze your decoded data using Looker Studio, Looker, Power BI, Tableau, or Evidence! Our Decoder for GA4 comes with ready-made Looker Studio templates, but you can easily link decoded data to your existing reports.
Transform your decoded data using Dataform or dbt.
Send your decoded data to Google Cloud Storage or Amazon S3.
Can I really install the decoder with a single command?
That’s right. It’s a simple, straightforward install.
Do I need to use Dataform or dbt?
Absolutely not. There is no need to use other data transformation tools or learn another complicated skillset. That said, our decoder is fully compatible with industry-standard transformation tools, so it can serve as a preliminary step to simplify your data before you use them.
Will the decoder alter my raw data in any way?
Absolutely not. Our decoder adds a new dataset to your BigQuery project that you can access directly for downstream analysis, and you can always go back and reference your raw data or work on that dataset directly.
What happens to my data if I stop using Decode Data's Decoder?
The historical data you have decoded remains intact and accessible in your BigQuery project, but future data will no longer be decoded.
Technical questions? Check out our docs site to learn more.