Optimizing Google BigQuery Service Connector
We are currently funneling our Google Analytics 4 (GA4) data into BigQuery, then connecting to BigQuery with the 'Google BigQuery Service Connector' in Domo. We partition on date keys and bring in the past 3 days of data (to account for GA4's attribution window) with an append update. However, the number of rows updated daily ranges from 1MM to 2MM+, which is not ideal for consumption purposes. Has anyone experienced similar issues, and if so, how did you optimize? Any/all recommendations would be helpful here! Thank you!
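For reference, a trailing 3-day window like the one described can be expressed against GA4's sharded `events_YYYYMMDD` export tables using `_TABLE_SUFFIX`, so BigQuery only scans the shards in the window. This is a sketch; the project and dataset names are placeholders:

```sql
-- Sketch only: limit the connector query to the trailing 3-day
-- attribution window via GA4's date-sharded export tables.
-- `my-project.analytics_123456789` is a placeholder.
SELECT
  event_date,
  event_name,
  user_pseudo_id
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
    FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY))
    AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
```

Filtering on `_TABLE_SUFFIX` (rather than on `event_date` alone) is what keeps the scanned bytes, and the connector's pull time, proportional to the window instead of the full history.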
Comments
Out of curiosity, why are you going the route of BQ instead of using the native GA4 connector?
@PMLeema we're using the BQ route because the native GA4 connector is (and was when we were implementing) in Beta. We were seeing discrepancies in the native view, but saw accurate data flowing in from BQ.
Simple: 'SELECT * FROM…' our BQ account in Domo.
Within BQ itself, we are selecting user_pseudo_id, event_timestamp, event_date, and country; creating groupings based on traffic sources and mediums; and pulling 'page_title', 'ga_session_id', the traffic source name, medium, and source (labeled as campaign, medium, and source respectively), along with event_name, event_key, and event_value. Let me know what you think here.
Thanks!
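One way to cut the 1-2MM daily rows is to pre-aggregate in BigQuery so Domo ingests summary rows rather than raw events. A rough sketch along the lines of the fields described above (project/dataset names are placeholders; `page_title` and `ga_session_id` live in `event_params` in the standard GA4 export schema):

```sql
-- Sketch only: aggregate in BQ before Domo ingests the data.
-- `my-project.analytics_123456789` is a placeholder.
SELECT
  event_date,
  event_name,
  country,
  traffic_source.name   AS campaign,
  traffic_source.medium AS medium,
  traffic_source.source AS source,
  (SELECT value.string_value
     FROM UNNEST(event_params)
     WHERE key = 'page_title') AS page_title,
  COUNT(DISTINCT user_pseudo_id) AS users,
  COUNT(*)                       AS events
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d',
    DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY))
GROUP BY 1, 2, 3, 4, 5, 6, 7
```

The trade-off is losing event-level granularity in Domo, so this works best when the cards downstream only need the grouped dimensions.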