Optimizing Google BigQuery Service Connector

We are currently funneling our Google Analytics 4 (GA4) data into BigQuery and then connecting to BigQuery with the 'Google BigQuery Service Connector' in Domo. We partition on date keys and append the past 3 days of data on each run (to account for attribution in GA4). However, the daily update ranges between 1MM and 2MM+ rows, which is not ideal for consumption purposes. Has anyone experienced similar issues, and if so, how did you optimize? Any and all recommendations would be helpful here! Thank you!
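
For reference, the incremental window looks roughly like this (a simplified sketch; the project and dataset names below are placeholders, not our actual ones):

    -- Pull the trailing 3 days of GA4 export data for the append.
    -- `my-project.analytics_123456` is a placeholder dataset name.
    SELECT *
    FROM `my-project.analytics_123456.events_*`
    WHERE _TABLE_SUFFIX BETWEEN
      FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY))
      AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())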


Comments

  • Tasleema (Domo Product Manager)

    Out of curiosity, why are you going the route of BQ instead of using the native GA4 connector?

  • Tasleema (Domo Product Manager)

    @dan_s can you provide the query you are using?

  • dan_s (Member)

@PMLeema we're using the BQ route since the native GA4 connector is in Beta (and was when we were implementing). We were seeing discrepancies in the data from the native connector, but accurate data flowing in from BQ.

The Domo side is simple: a 'SELECT * FROM…' against our BQ account.

    Within BQ itself, we select user_pseudo_id, event_timestamp, event_date, and country; build groupings based on traffic sources and mediums; pull 'page_title' and 'ga_session_id'; select the traffic source name, medium, and source (labeled as campaign, medium, and source respectively); and include event_name, event_key, and event_value.
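
    A rough sketch of that shape of query against the standard GA4 export schema (the dataset name and the channel grouping logic here are illustrative placeholders, not our exact production query):

        -- Flatten the GA4 event_params array into key/value rows.
        -- `my-project.analytics_123456` is a placeholder dataset name.
        SELECT
          user_pseudo_id,
          event_timestamp,
          event_date,
          geo.country AS country,
          -- Illustrative grouping on medium; ours is more involved
          CASE
            WHEN traffic_source.medium IN ('cpc', 'ppc') THEN 'Paid Search'
            WHEN traffic_source.medium = 'organic'       THEN 'Organic Search'
            ELSE 'Other'
          END AS channel_grouping,
          -- Common event params pulled out of the event_params array
          (SELECT value.string_value FROM UNNEST(event_params)
            WHERE key = 'page_title') AS page_title,
          (SELECT value.int_value FROM UNNEST(event_params)
            WHERE key = 'ga_session_id') AS ga_session_id,
          traffic_source.name   AS campaign,
          traffic_source.medium AS medium,
          traffic_source.source AS source,
          event_name,
          params.key AS event_key,
          COALESCE(params.value.string_value,
                   CAST(params.value.int_value AS STRING),
                   CAST(params.value.double_value AS STRING)) AS event_value
        FROM `my-project.analytics_123456.events_*`,
          UNNEST(event_params) AS params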

    Let me know what you think here.

    Thanks!