Dataset Upload Architecture

user08857
user08857 Member
edited March 2023 in Scheduled Reports

We have our Data Warehouse (Postgres), and from it we have defined Datasets, which we are uploading to Domo via the CLI.

This data lives in tables in our Data Warehouse. What is the best architectural approach for this?

Would it make sense to have table-returning functions in the database, which we call to retrieve the data? If so, how could we call these functions and transport the returned data directly to our Domo datasets on a schedule? More generally, what would be a good approach for handling new reporting requests as well?
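Concretely, the kind of scheduled export I have in mind would select from such a table-returning function and write the rows out as CSV for the CLI uploader. A minimal sketch (the function name, columns, and rows below are placeholders; the actual database call would go through a driver such as psycopg2):

```python
import csv
import io


def export_function_query(function_name: str) -> str:
    """Build the SELECT that pulls all rows from a table-returning function."""
    return f"SELECT * FROM {function_name}()"


def rows_to_csv(columns, rows) -> str:
    """Serialize fetched rows into the CSV file the Domo CLI would upload."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(columns)  # header row
    writer.writerows(rows)    # data rows
    return buf.getvalue()


# Placeholder data; in production the rows would come from executing
# export_function_query(...) against Postgres on a schedule (e.g. cron).
query = export_function_query("reporting.sales_dataset")
csv_text = rows_to_csv(["order_id", "amount"], [(1, "9.99"), (2, "19.50")])
```

A scheduler (cron, Airflow, or similar) would run this extract and then invoke the Domo CLI upload against the resulting file.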

Best Answer

  • jcsrt
    jcsrt Member
    Answer ✓

    Hello,

     

    You can separate the datasets into departments using SSAS or tables (more complex), for example: Sales, HR, Operations... and then combine them in Domo using ETL, MySQL, or Redshift dataflows.

     

    1 - SSAS option: You can have multiple cubes divided by department, then schedule a job that updates the cubes periodically depending on your business requirements, and use Workbench to upload the cubes to Domo on a schedule. That way you always have live data.

     

    2 - Tables option: The table option is more complicated because you will need an update, insert, and delete process to maintain your tables, using SSIS or a similar tool, and this approach will also consume more resources.

    But once you have defined the update process for those tables, it is just a matter of uploading the data into Domo, combining it using ETL, MySQL, or Redshift, and finally connecting the reports to those dataflows.
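    The update, insert, and delete process described above boils down to diffing the incoming source rows against the current target rows by primary key. A minimal sketch of that classification step, assuming rows held as plain dicts keyed by primary key (the merge itself would then be issued as SQL against the warehouse; all names and rows here are illustrative):

```python
def classify_changes(source: dict, target: dict):
    """Split source/target rows (keyed by primary key) into the three
    buckets an incremental load needs: inserts, updates, deletes."""
    inserts = {k: v for k, v in source.items() if k not in target}
    updates = {k: v for k, v in source.items() if k in target and target[k] != v}
    deletes = [k for k in target if k not in source]
    return inserts, updates, deletes


# Placeholder rows: primary key -> row payload
source = {1: "alice", 2: "bob-updated", 4: "dana"}
target = {1: "alice", 2: "bob", 3: "carol"}
ins, upd, dels = classify_changes(source, target)
# ins == {4: "dana"}, upd == {2: "bob-updated"}, dels == [3]
```

    In Postgres specifically, the insert/update half of this pattern can also be pushed down to the database with `INSERT ... ON CONFLICT DO UPDATE`.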

Answers

  • Jarvis
    Jarvis Domo Employee

    Hi,

     

    There are a lot of options within Domo and depending on how you want to present the data it may affect the way you approach this. My suggestion would be to reach out to your Customer Success Manager and discuss the options in detail.

     

    Jarvis

  • Could you maybe roughly list some options? Thanks also for the hint. We will contact our Customer Success Manager; I just wanted a comparison to other tech stacks. I was used to doing these things in a Microsoft environment with SSRS.


This discussion has been closed.