Connect and store dataset

Hi, I need to connect to SAP via ODBC using Workbench to pull roughly 530 million rows. The data must be stored in Domo for visualization and covers 2023 through 2025 (still growing daily).
I've come across several methods online but I'm not sure which ones actually work, or which is best: append? Partition? Upsert? A Workflow for automation? A recursive DataFlow? Note that the 2023 and 2024 data are static, but the 2025 data may still receive updates, and I only need the latest version of each row. If possible, I'd also like to detect whether anything in the 2023 and 2024 data has changed and replace it accordingly. Please advise.
Answers
I'd recommend using partitions so that updated or removed data is taken into account. Just make sure the entire dataset for each partition is ingested on every run, otherwise you may lose data. If you have a key that identifies unique rows, an upsert may be an option as well.
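If you go the partition route, a common pattern is to partition by month (e.g., a YYYYMM column), re-ingest only the current month on a schedule, and periodically compare a lightweight fingerprint (row count plus latest change timestamp) per historical partition to catch late changes in 2023/2024. Below is a minimal sketch of that fingerprint check using pyodbc; the DSN, table, and column names (`SALES_DOCUMENTS`, `POSTING_PERIOD`, `LAST_CHANGED_AT`) are hypothetical placeholders, and Workbench would still perform the actual per-partition loads.

```python
# Sketch: flag monthly partitions whose data has changed since the last run.
# All connection details, table names, and column names are assumptions.
import json
import os
import pyodbc

DSN = "SAP_ODBC_DSN"            # assumed ODBC data source on the Workbench machine
TABLE = "SALES_DOCUMENTS"       # hypothetical SAP table or view
PERIOD_COL = "POSTING_PERIOD"   # hypothetical YYYYMM partition column
CHANGED_COL = "LAST_CHANGED_AT" # hypothetical change-timestamp column
STATE_FILE = "partition_state.json"


def load_previous_fingerprints():
    """Fingerprints recorded on the previous run (empty on the first run)."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {}


def fetch_current_fingerprints(conn):
    """One fingerprint per month: row count plus latest change timestamp."""
    sql = f"""
        SELECT {PERIOD_COL}, COUNT(*), MAX({CHANGED_COL})
        FROM {TABLE}
        WHERE {PERIOD_COL} BETWEEN '202301' AND '202512'
        GROUP BY {PERIOD_COL}
    """
    cursor = conn.cursor()
    fingerprints = {}
    for period, row_count, last_change in cursor.execute(sql):
        fingerprints[str(period)] = f"{row_count}|{last_change}"
    return fingerprints


def main():
    conn = pyodbc.connect(f"DSN={DSN}")
    previous = load_previous_fingerprints()
    current = fetch_current_fingerprints(conn)

    # Any partition whose fingerprint changed (or is new) should be re-ingested;
    # static 2023/2024 months will normally match and can be skipped.
    stale = sorted(p for p, fp in current.items() if previous.get(p) != fp)
    print("Partitions to re-ingest:", stale)

    with open(STATE_FILE, "w") as f:
        json.dump(current, f, indent=2)


if __name__ == "__main__":
    main()
```

Each flagged partition can then be fully re-extracted and loaded using Workbench's partition update method, which replaces only that partition in the Domo dataset instead of reloading all 530 million rows.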