How to get a Batch_Run_ID with Google Sheets?
Hello, I am looking for a solution to a problem where I need to compare today's data with yesterday's, but the data does not include a date column. The Google Sheets connector also does not provide a batch run ID or date, so I have no reference point for the comparison. I tried recursive data modelling, but could not solve the problem. Could somebody please give me an idea of how to approach this? I just need some reference point at the ETL level to compare data for two consecutive days. Thanks.
Answers
Hi @Ashwin_SG
When you tried your recursive dataflow, did you add a new column / constant containing the current date and time before your output dataset?
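(Not part of the original reply, but a minimal sketch of that idea as it might look in a SQL dataflow; the table name `google_sheet_input` and column name `_batch_date` are hypothetical.)

```sql
-- Add a constant date column to the fresh input before it reaches the output dataset.
-- `google_sheet_input` stands in for the dataset pulled from Google Sheets.
SELECT
    t.*,
    CURRENT_DATE() AS `_batch_date`  -- reference point for day-over-day comparisons
FROM google_sheet_input t
```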
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
Hello @GrantSmith! Yes I did, as I thought it would then get updated automatically every day and that would be my point of reference for comparison. But it only appears in the first update (or first batch), as I had to get rid of the date column for the append to happen in the next batch. So the appending is working, but the date column is empty, which brought me back to the same situation.
You'd need to add it to your input dataset as the first step; then, when you append that input to your recursive dataset, the historical rows will keep the different date values from each time the input was added. Do you have a screenshot of your ETL?
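(Again, purely illustrative and not from the original post: assuming `stamped_input` is the new data with the date column already added and `historical_output` is the dataflow's own previous output, the append step of a recursive dataflow could look roughly like this.)

```sql
-- Union the previously accumulated rows with today's freshly stamped rows,
-- so each day's batch keeps the date it was loaded with.
SELECT * FROM historical_output
UNION ALL
SELECT * FROM stamped_input
```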
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
Thank you, Grant. I found the solution to my problem; I don't know how I would have solved it without your advice. Since I did not have any date-related column in my dataset, I built a recursive dataflow and added a Date column in it. Then I took that dataset into another ETL and compared today's and yesterday's data with reference to that date column. So, as my Google Sheet gets updated, so does the date column, and it appends in the process.
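(A hedged sketch of the comparison step described above; `stamped_history`, `id`, and `metric` are made-up names for illustration.)

```sql
-- Compare today's rows against yesterday's using the stamped date column.
SELECT
    today.id,
    today.metric                     AS metric_today,
    yesterday.metric                 AS metric_yesterday,
    today.metric - yesterday.metric  AS day_over_day_change
FROM stamped_history today
LEFT JOIN stamped_history yesterday
       ON yesterday.id = today.id
      AND yesterday.`_batch_date` = DATE_SUB(today.`_batch_date`, INTERVAL 1 DAY)
WHERE today.`_batch_date` = CURRENT_DATE()
```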
@Ashwin_SG Glad you got it sorted out! If you could accept my solution so that others who are searching for your issue can find it more easily, I'd appreciate it.
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
@Ashwin_SG if you use a recursive dataflow to datestamp your data based on when the ETL runs, it can be done... but it comes with risks you need to consider.
If you assume that you ONLY run your recursive dataflow once a day, then you kind of defeat the purpose of a recursive (which should allow you to run your dataflow as often as you want without detrimental impact).
You could extend your logic to keep only the last set of rows per day. That isn't part of the standard KB documentation, but it should be easy enough to implement using a rank / window function after you APPEND the historical and new datasets together.
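(One possible reading of that suggestion, sketched in SQL rather than Magic ETL; it assumes every row from the same run shares a `_batch_timestamp` column, that the appended table is called `appended`, and that your SQL engine supports window functions.)

```sql
-- After appending historical and new rows, keep only the most recent batch per day.
SELECT *
FROM (
    SELECT
        a.*,
        DENSE_RANK() OVER (
            PARTITION BY DATE(a.`_batch_timestamp`)
            ORDER BY a.`_batch_timestamp` DESC
        ) AS batch_rank
    FROM appended a
) ranked
WHERE batch_rank = 1
```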
If it were me, I'd alter the Google Sheet to include a NOW() function. I assume that would always calculate the time the dataset was ingested; it would certainly timestamp the raw dataset based on ingest time instead of ETL execution time.
Jae Wilson
Check out my 🎥 Domo Training YouTube Channel 👨💻
**Say "Thanks" by clicking the ❤️ in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"0