How does Domo handle transactional data and dimensional changes (Type 1 and Type 2)?
- Specifically, when using Domo's built-in data capture methods:
- How does Domo handle transactional data?
- How does it handle duplicate data with regard to transactional data?
- How does Domo handle dimensional changes (Type 1 & Type 2)?
- Type 1 - old data is overwritten by new data
- Type 2 - multiple records are created to track data historically.
- We are concerned about creating daily snapshots of the data in our Domo instances; doing so would cause the data volume to grow very quickly.
- Is there a limit to the size of data we can store?
- Will there be performance issues?
Answers
Would you want these questions answered separately or do you want all of the various answers combined together?
"When I have money, I buy books. When I have no money, I buy food." - Erasmus of Rotterdam0 -
Separately, please.
Q1 - How does Domo handle transactional data?
A - I'm not sure exactly what is being asked here; all data is treated the same within Domo, and like any data within Domo, it is possible to adjust, combine, aggregate, and so on. Perhaps the answers to the rest of your questions will help clarify your question here.
Q1.1 - How does Domo handle duplicate data with regard to transactional data?
A - When importing data, Domo does not check for or remove duplicates. Duplicates can be removed from a dataset once it is in Domo using the data transformation tools within Domo (Magic ETL, Dataflow).
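To make that concrete, here is a minimal sketch of what a de-duplication transform in a SQL (MySQL) DataFlow might look like. The table and column names (transactions_raw, transaction_id, updated_at) are placeholders, not anything defined in this thread.

```sql
-- Drop rows that are exact duplicates of one another.
SELECT DISTINCT *
FROM transactions_raw;

-- Or keep only the most recent record per transaction, assuming the source
-- supplies an update timestamp column (here called updated_at).
SELECT t.*
FROM transactions_raw t
JOIN (
    SELECT transaction_id, MAX(updated_at) AS latest_update
    FROM transactions_raw
    GROUP BY transaction_id
) latest
  ON  t.transaction_id = latest.transaction_id
  AND t.updated_at     = latest.latest_update;
```

The same logic can also be built visually in Magic ETL, for example with its de-duplication and group-by tiles.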
Q2 - How does Domo handle dimensional changes (Type 1 and Type 2)?
Type 1 - old data is overwritten by new data
Type 2 - multiple records are created to track data historically.
A - When uploading data to Domo, the user can choose to completely replace the dataset with new data or simply append the new data to the existing dataset.
Additionally, there are ways to use DataFlows to append the new data conditionally (removing duplicates, etc.) to a historical dataset that holds all of the historical transactional data from that data source.
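As a rough illustration of how that maps to the Type 1 / Type 2 distinction, here is a hedged SQL DataFlow sketch; all table and column names (customer_dim_history, customer_dim_new, etc.) are hypothetical.

```sql
-- Type 2 style: keep full history. UNION appends today's load but drops rows
-- that exactly duplicate rows already in the historical dataset.
SELECT customer_id, customer_name, region, load_date FROM customer_dim_history
UNION
SELECT customer_id, customer_name, region, load_date FROM customer_dim_new;

-- Type 1 style: keep only the latest version of each key, so new attribute
-- values overwrite old ones.
SELECT c.*
FROM (
    SELECT customer_id, customer_name, region, load_date FROM customer_dim_history
    UNION ALL
    SELECT customer_id, customer_name, region, load_date FROM customer_dim_new
) c
JOIN (
    SELECT customer_id, MAX(load_date) AS latest_load
    FROM (
        SELECT customer_id, load_date FROM customer_dim_history
        UNION ALL
        SELECT customer_id, load_date FROM customer_dim_new
    ) x
    GROUP BY customer_id
) latest
  ON  c.customer_id = latest.customer_id
  AND c.load_date   = latest.latest_load;
```

For the Type 2 case, the output would typically be written back to the historical dataset (a recursive DataFlow pattern) so that each run accumulates history.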
Thank you,
I just needed a documented response.
Q3 - We are concerned about creating daily snapshots of the data in our Domo instances, since doing so would cause the data volume to grow very quickly.
Q3.1 - Is there a limit to the size of data we can store?
A - There is no set limit for datasets.
Q3.2 - Will there be performance issues?
A - Theoretically, the bigger the dataset, the longer it will take to process. The impact of large datasets is felt differently depending on which tools are being used and how they are being used (indexing in DataFlows, MySQL DataFlow vs. Magic ETL, etc.). There are customers who effectively use datasets with billions of rows.
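On the indexing point, here is a hedged example of what that looks like in a SQL (MySQL) DataFlow transform; the table and column names (orders, customers, customer_id) are placeholders.

```sql
-- Adding an index on the join key before a large join can noticeably reduce
-- DataFlow run time on big inputs.
ALTER TABLE orders ADD INDEX idx_orders_customer (customer_id);

SELECT o.*,
       c.customer_name
FROM orders o
JOIN customers c
  ON o.customer_id = c.customer_id;
```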
I am aware of the replace/append functionality. Can you provide any documentation on how each dataset behaves with replace/append? I have cases where two different data sources have the same setting (append): one creates a daily copy of the data, while the other only appends if data has changed. It is very difficult to configure datasets if there is no consistency.
Thank you
The Domo append and replace functionality works exactly the same on the Domo side for all connections. In other words, if "Append" is selected, Domo will take the new data supplied by the data source and append it to the existing dataset. If "Replace" is selected, Domo will delete the existing data and replace it with the new data supplied by the data source.
The difference you are seeing is likely a difference in the specifics of each data source; for example, a report from one data source could return only the records that have been updated in the last day, while a report from another data source may return all of the data each time it is called. Additionally, even two reports from the same data source may be custom or configured to return the data differently.
This varies so widely between source systems and the customized reports within them that we do not have any standard documentation covering those scenarios. When the data source is created, its particulars should be identified by whoever is setting up the connection. If there are questions about how a data source is behaving, they can be directed to our support team.
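One way to verify how an "Append" dataset is actually behaving is a quick row-count check in a downstream SQL DataFlow. This is only a sketch; appended_source and snapshot_date are assumed placeholder names, and it presumes the source data includes a column identifying each load.

```sql
-- Row counts per load date: a full daily copy shows roughly the same large
-- count for every load, while an incremental append shows only changed rows.
SELECT snapshot_date,
       COUNT(*) AS row_count
FROM appended_source
GROUP BY snapshot_date
ORDER BY snapshot_date;
```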