Dataset Upload Architecture
We have a data warehouse (Postgres), and from it we have defined a number of datasets, which we upload to Domo using the CLI.
The data lives in tables in our data warehouse. What is the best architectural approach for this?
Would it make sense to have table-returning functions in the database, which we call to retrieve the data? If so, how could we call these functions and transport the returned data directly to our Domo datasets on a schedule? And in general, what is a good approach for handling new reporting requests?
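For example, something like this is what I have in mind (a rough sketch; the schema and names are just placeholders, not our real ones):

    -- Table-returning function in Postgres that defines one dataset
    -- (reporting.sales_dataset and warehouse.orders are placeholder names):
    CREATE OR REPLACE FUNCTION reporting.sales_dataset()
    RETURNS TABLE (order_id integer, order_date date, amount numeric) AS $$
        SELECT o.order_id, o.order_date, o.amount
        FROM warehouse.orders AS o
        WHERE o.order_date >= current_date - interval '90 days';
    $$ LANGUAGE sql STABLE;

    -- A scheduled job (e.g. cron) could then export the result to a CSV
    -- for the CLI upload:
    -- psql -c "\copy (SELECT * FROM reporting.sales_dataset()) TO 'sales.csv' CSV HEADER"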
Best Answer
Hello,
You can separate the datasets by department using SSAS or tables (more complex), for example Sales, HR, Operations, and then combine them in Domo using ETL, MySQL, or Redshift dataflows.
1- SSAS option: You can have multiple cubes divided by department and schedule a job that updates the cubes periodically, depending on your business requirements. Then use Workbench to upload the cubes to Domo on a schedule; that way you always have live data.
2- Tables option: If you choose the table option, it is more complicated because you will need an update, insert, and delete process (using SSIS or similar) to keep your tables current, and this approach will also consume more resources; a rough sketch of such an update step follows below.
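For illustration only, since your warehouse is Postgres, a bare-bones incremental update step could look like this (the staging and target table names are placeholders, and a unique constraint on order_id is assumed):

    -- Upsert new and changed rows from a staging load into the reporting table:
    INSERT INTO reporting.sales (order_id, order_date, amount)
    SELECT s.order_id, s.order_date, s.amount
    FROM staging.sales_load AS s
    ON CONFLICT (order_id)
    DO UPDATE SET order_date = EXCLUDED.order_date,
                  amount     = EXCLUDED.amount;

    -- Deletes need their own pass, e.g. removing rows no longer in the source:
    DELETE FROM reporting.sales AS r
    WHERE NOT EXISTS (
        SELECT 1 FROM staging.sales_load AS s WHERE s.order_id = r.order_id
    );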
But once you have defined the update process for those tables, it is just a matter of uploading the data into Domo, combining it using ETL, MySQL, or Redshift dataflows, and finally connecting the reports to those dataflows.
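As a rough example, once the department datasets are in Domo, a MySQL dataflow transform that combines them could be as simple as this (sales_dataset and hr_dataset are assumed input dataset names with matching columns):

    -- Stack the department datasets into one combined output:
    SELECT 'Sales' AS department, s.* FROM sales_dataset AS s
    UNION ALL
    SELECT 'HR' AS department, h.* FROM hr_dataset AS h;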
Answers
Hi,
There are a lot of options within Domo, and how you want to present the data may affect the way you approach this. My suggestion would be to reach out to your Customer Success Manager and discuss the options in detail.
Jarvis
Could you maybe roughly list some options? Thanks also for the hint; we will contact our Customer Success Manager. I just wanted to have a comparison to other tech stacks, as I was used to doing these things in a Microsoft environment with SSRS.