Period over period on one row
Hi,
For this particular use case I have data like below:
`Date | Value`
I need to transform this to:
`Date | Value | Date LY | Value LY`
I could join the source data to itself when pulling it into Domo via SQL, but I would rather do this in Domo.
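Roughly, the connector-side SQL would be a self-join like this (table and column names here are placeholders, not my real schema):

```sql
-- Placeholder self-join at the connector: `my_table`, `report_date`
-- and `value` are made-up names standing in for the real schema.
SELECT cur.report_date AS `Date`,
       cur.value       AS `Value`,
       ly.report_date  AS `Date LY`,
       ly.value        AS `Value LY`
FROM my_table cur
LEFT JOIN my_table ly
  ON ly.report_date = DATE_SUB(cur.report_date, INTERVAL 1 YEAR);
```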
I attempted to use the period-over-period dimension approach built by @GrantSmith (which is great, and we are using it), but the beast modes look at one row at a time, and for this particular use case a calculated value I am making needs access to the this-year data, so it needs to be on the same row.
I have looked at a MySQL DataFlow, but you can't join a dataset to itself, and the same is true for Views.
Would I have to copy the dataset multiple times?
Any ideas?
Thanks
Best Answer
Since you're using the PoP dimension, you can create a new ETL with the PoP dataset as the input, add two filters (`Period Type` = 'Current' and `Period Type` = 'Last Year'), then join the two filtered branches on the report date. That will combine them on the same row in your output dataset.
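In SQL terms the ETL is doing something like the following (column names are assumed; adjust them to whatever your PoP dataset actually calls them):

```sql
-- Rough SQL equivalent of the two filter tiles plus the join tile.
-- `pop_dataset`, `Report Date`, `Period Type` and `Value` are assumed names.
SELECT cur.`Report Date`,
       cur.`Value`,
       ly.`Value` AS `Value LY`
FROM pop_dataset cur
JOIN pop_dataset ly
  ON ly.`Report Date` = cur.`Report Date`
WHERE cur.`Period Type` = 'Current'
  AND ly.`Period Type` = 'Last Year';
```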
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
Answers
@GrantSmith Thanks, that is a great help.
My dataset, including the dimensions, is about 80 million rows, as I have added quite a few more date comparisons to the ETL you provided. I may need a simpler version; the ETL that filters the dataset into two and joins it back together has been running for ages.
You could pre-filter your dataset using two Dataset Views (`Period Type` = 'Current' and `Period Type` = 'Last Year') and then use those as inputs into your ETL. That'll do the filtering in the Adrenaline layer (essentially in memory) instead of having to transfer all 80M rows over to the ETL process and back again.
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
@GrantSmith thanks for the tip.
In the end I just did it in SQL from the connector, as I had other columns involved and needed some outer joins.
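Simplified, the final query looks something like this (names are placeholders, and the real query carries several more columns):

```sql
-- Simplified placeholder version of the final connector query.
-- The LEFT OUTER JOIN keeps current rows even when no last-year row exists.
SELECT cur.report_date  AS `Date`,
       cur.value        AS `Value`,
       cur.other_column AS `Other`,      -- stand-in for the extra columns
       ly.report_date   AS `Date LY`,
       ly.value         AS `Value LY`
FROM source_table cur
LEFT OUTER JOIN source_table ly
  ON ly.report_date = DATE_SUB(cur.report_date, INTERVAL 1 YEAR);
```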