Period over period on one row
Hi,
For this particular use case I have data like below
Date | Value
I need to transform this to
Date | Value | Date LY | Value LY
I could join the source data to itself in SQL when pulling it into Domo, but I would rather do this within Domo.
I attempted to use the period-over-period dimension approach made by @GrantSmith (which is great and we are using), but the Beast Modes look at one row at a time. For this particular use case, a calculated value I am building needs access to this year's data, so that data needs to be on the same row.
I have looked at a MySQL dataflow, but you can't join a dataset to itself, and the same is true of Views.
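For reference, the self-join described above can be sketched like this. This is a minimal, hypothetical example (table name `sales` and columns `report_date`/`value` are illustrative, not from the original post), run here through SQLite so the SQL is concrete:

```python
import sqlite3

# Hypothetical table for illustration: sales(report_date, value).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (report_date TEXT, value REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("2023-01-01", 100.0), ("2024-01-01", 150.0)],
)

# Left-join the table to itself, matching each row to the row
# exactly one year earlier, so Date LY / Value LY land on the same row.
rows = conn.execute("""
    SELECT cur.report_date,
           cur.value,
           ly.report_date AS date_ly,
           ly.value       AS value_ly
    FROM sales AS cur
    LEFT JOIN sales AS ly
      ON ly.report_date = date(cur.report_date, '-1 year')
    ORDER BY cur.report_date
""").fetchall()

print(rows)
# → [('2023-01-01', 100.0, None, None),
#    ('2024-01-01', 150.0, '2023-01-01', 100.0)]
```

The `LEFT JOIN` keeps rows with no prior-year match (the `None` columns above), which is usually what you want for the earliest dates in the data.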
Would I have to copy the dataset multiple times?
Any ideas?
Thanks
Best Answer
Since you're using the PoP dimension, you can create a new ETL with the PoP dataset as the input, add two filters (`Period Type` = 'Current' and `Period Type` = 'Last Year'), then join those two filtered branches on the report date. That will combine them onto the same row in your output dataset.
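The filter-then-join logic above can be sketched in plain Python. This is an assumption-laden illustration, not the ETL itself; the `Report Date` / `Period Type` / `Value` field names are stand-ins for whatever the PoP dataset actually uses:

```python
# Simulated PoP rows: each row is tagged with a Period Type
# and shares a Report Date with its counterpart.
pop_rows = [
    {"Report Date": "2024-01-01", "Period Type": "Current",   "Value": 150.0},
    {"Report Date": "2024-01-01", "Period Type": "Last Year", "Value": 100.0},
]

# Step 1: two filters, one per Period Type.
current = {r["Report Date"]: r for r in pop_rows if r["Period Type"] == "Current"}
last_year = {r["Report Date"]: r for r in pop_rows if r["Period Type"] == "Last Year"}

# Step 2: join the filtered branches on report date,
# producing one output row with both values side by side.
output = [
    {
        "Report Date": d,
        "Value": cur["Value"],
        "Value LY": last_year.get(d, {}).get("Value"),
    }
    for d, cur in current.items()
]

print(output)
# → [{'Report Date': '2024-01-01', 'Value': 150.0, 'Value LY': 100.0}]
```

Using `.get()` for the last-year side mirrors a left join: dates with no prior-year row still appear, with `Value LY` empty.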
Answers
@GrantSmith Thanks, that is a great help.
My dataset, including the dimensions, is about 80 million rows, as I have added quite a few more date comparisons to the ETL you provided. I may need a simpler version, as the ETL that filters the dataset into two and joins them back together has been running for ages.
You could pre-filter your dataset using two dataset views (period type = current and period type = last year) and then use those as inputs into your ETL. That'll do the filtering in the Adrenaline layer (essentially in memory) instead of having to transfer all 80M rows over to the ETL process and back again.
@GrantSmith thanks for the tip.
In the end I just did it in SQL from the connector, as I had other columns involved with some outer joins.