Dataset Upload Architecture
We have a Postgres data warehouse, out of which we have defined datasets, and we are uploading them to Domo using the CLI.
The data lives in tables in our data warehouse. What is the best architectural approach for this?
Would it make sense to have table-returning functions in the database, which we call to retrieve the data? If so, how could we call these functions and transport the returned data directly to our Domo datasets on a schedule? And in general, what would be a good approach for handling new reporting requests as well?
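To illustrate the idea, here is a rough sketch of what we have in mind, assuming a hypothetical table-returning function reporting.sales_extract() in Postgres and the psycopg2 driver; the upload step is a placeholder, since we would substitute whatever Domo CLI invocation we already use (the command and flags shown are not real Domo CLI options):

```python
import subprocess
import psycopg2  # Postgres driver; pip install psycopg2-binary

# The function itself could be defined in Postgres roughly as:
#   CREATE FUNCTION reporting.sales_extract()
#   RETURNS TABLE (order_id int, amount numeric, sold_at date)
#   AS $$ SELECT order_id, amount, sold_at FROM sales $$ LANGUAGE sql;

# Connection string, function name, and dataset id are illustrative only.
DSN = "host=warehouse.example.com dbname=dwh user=etl"
EXTRACT_SQL = "COPY (SELECT * FROM reporting.sales_extract()) TO STDOUT WITH CSV HEADER"
CSV_PATH = "/tmp/sales_extract.csv"

def export_dataset():
    """Call the table-returning function and stream the result to CSV."""
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        with open(CSV_PATH, "w", newline="") as f:
            # copy_expert streams the rows, so a large extract does not
            # have to fit into client memory.
            cur.copy_expert(EXTRACT_SQL, f)

def upload_dataset():
    """Hand the CSV to the uploader; replace with the real Domo CLI call."""
    subprocess.run(
        ["domo-upload", "--dataset-id", "YOUR-DATASET-ID", "--file", CSV_PATH],
        check=True,  # fail loudly so the scheduler can alert us
    )

if __name__ == "__main__":
    export_dataset()
    upload_dataset()
    # Scheduling would be plain cron (or any scheduler), e.g.:
    #   0 6 * * *  /usr/bin/python3 /opt/etl/export_to_domo.py
```

For a new reporting request we would then only add one new function plus one scheduler entry, which is why we are wondering whether this is a sound approach.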
Best Answer
Hello,
You can separate the datasets into departments using SSAS or tables (more complex), for example Sales, HR, Operations, and then combine them in Domo using ETL, MySQL, or Redshift.
1. SSAS option: You can have multiple cubes divided by department and schedule a job that updates the cubes periodically, depending on your business requirements. Then use Workbench to upload the cubes to Domo on a schedule; that way you always have live data.
2. Tables option: The table option is more complicated because you will need an update/insert/delete process (using SSIS or similar) to keep your tables current, and this approach also consumes more resources.
But once you have defined the update process for those tables, it is just a matter of uploading the data into Domo, combining it using ETL, MySQL, or Redshift, and finally connecting the reports to those dataflows.
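To make the tables option concrete in the original poster's Postgres context, here is a minimal sketch of such an incremental update process, assuming a hypothetical last_modified column on the source table and a small etl.watermarks state table (all names are illustrative):

```python
import psycopg2  # pip install psycopg2-binary

DSN = "host=warehouse.example.com dbname=dwh user=etl"

# One-row-per-dataset state table remembering how far the last extract got;
# assume it has been seeded with an initial timestamp.
GET_WATERMARK = "SELECT last_extracted FROM etl.watermarks WHERE dataset = %s"
SET_WATERMARK = "UPDATE etl.watermarks SET last_extracted = %s WHERE dataset = %s"

def incremental_extract(dataset, out_path):
    """Export only the rows changed since the previous run."""
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(GET_WATERMARK, (dataset,))
        watermark = cur.fetchone()[0]
        # mogrify interpolates the timestamp safely; COPY itself does not
        # accept bind parameters.
        copy_sql = cur.mogrify(
            "COPY (SELECT * FROM sales WHERE last_modified > %s) "
            "TO STDOUT WITH CSV HEADER",
            (watermark,),
        ).decode()
        with open(out_path, "w", newline="") as f:
            cur.copy_expert(copy_sql, f)
        # Advance the watermark only after the export succeeded.
        cur.execute("SELECT coalesce(max(last_modified), %s) FROM sales",
                    (watermark,))
        cur.execute(SET_WATERMARK, (cur.fetchone()[0], dataset))
```

Uploading the delta with an append-style dataset update keeps the transfer small; a full replace upload is simpler but heavier, which is the resource trade-off mentioned above.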
Answers
Hi,
There are a lot of options within Domo, and how you want to present the data may affect the way you approach this. My suggestion would be to reach out to your Customer Success Manager and discuss the options in detail.
Jarvis
Could you maybe roughly list some options? Thanks also for the hint; we will contact our Customer Success Manager. I just wanted a comparison with other tech stacks, as I was used to doing these things in a Microsoft environment with SSRS.