How does Domo handle transactional data and dimensional changes (Type 1 and Type 2)?

  1. By using Domo’s built-in data capturing methods:
    1. How does Domo handle transactional data?
      1. How does it handle duplicate data with regard to transactional data?
    2. How does Domo handle dimensional changes (Type 1 & Type 2)?
      1. Type 1 - old data is overwritten by new data.
      2. Type 2 - multiple records are created to track data historically.
    3. We are concerned with creating daily snapshots of the data in our Domo instances. By doing so, the data would start to grow rapidly.
      1. Is there a limit to the size of data we can store?
      2. Will there be performance issues?

Best Answers

  • jaromp (Member)
    Answer ✓

    Q3 - We are concerned with creating daily snapshots of the data in our Domo instances. By doing so, the data would start to grow rapidly.

    Q3.1 - Is there a limit to the size of data we can store?

    A - There is no set limit for datasets.

    Q3.2 - Will there be performance issues?

    A - Theoretically, the bigger the dataset, the longer it will take to process. The impact from large datasets is felt differently depending on which tools are being used and how they are being used (indexing in dataflows, MySQL dataflows vs. Magic ETL, etc.). There are customers who effectively use datasets with billions of rows.
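
    For example, in a MySQL dataflow you might add a transform that indexes the join key before a large join. A minimal sketch (the table and column names here are illustrative assumptions, not anything from this thread):

      -- Hypothetical first transform: index the join key up front so the
      -- join in the next transform does not scan the full table.
      ALTER TABLE transactions ADD INDEX idx_txn_customer (customer_id);

      -- Next transform: the join on customer_id can now use the index.
      SELECT t.*, c.customer_name
      FROM transactions t
      JOIN customers c ON c.customer_id = t.customer_id;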

Answers

  • nalbright (Contributor)

    Would you want these questions answered separately, or all of the answers combined?

    "When I have money, I buy books. When I have no money, I buy food." - Erasmus of Rotterdam
  • Separately, please.

  • Q2 - How does Domo handle dimensional changes (Type 1 and Type 2)?

    Type 1 - old data is overwritten by new data.

    Type 2 - multiple records are created to track data historically.

    A - When uploading data to Domo, the user can choose to completely replace the dataset with new data or simply append the new data to the existing dataset.

    Additionally, there are ways to use dataflows to append the new data conditionally (removing duplicates, etc.) to a historical dataset that holds all of the historical transactional data from that data source; a sketch of both patterns follows below.
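
    As a minimal sketch (assuming hypothetical table and column names, not Domo-specific objects), a MySQL dataflow might implement the two patterns roughly like this:

      -- Type 1: overwrite the old attribute values in place.
      UPDATE dim_customer d
      JOIN staging_customer s ON s.customer_id = d.customer_id
      SET d.address = s.address;

      -- Type 2: conditionally append new rows to a historical dataset,
      -- skipping rows that are already present (simple de-duplication).
      INSERT INTO dim_customer_history (customer_id, address, effective_date)
      SELECT s.customer_id, s.address, CURRENT_DATE
      FROM staging_customer s
      WHERE NOT EXISTS (
        SELECT 1
        FROM dim_customer_history h
        WHERE h.customer_id = s.customer_id
          AND h.address = s.address
      );

    In practice a Type 2 flow would usually also close out the prior record with an end date; the sketch keeps only the conditional insert for brevity.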

  • Thank you, I just needed a documented response.

  • I am aware of the replace/append functionality. Can you provide any documentation on how each dataset behaves with replace/append? I have cases where two different data sources have the same settings set to append: one creates a daily copy of the data, while the other only appends if the data has changed. It is very difficult to configure datasets when there is no consistency.

  • Thank you