Connect and store dataset

Hi, I need to connect to SAP via ODBC using Workbench to pull roughly 530 million rows. The data must be stored in Domo for visualization. It spans 2023 through 2025 and is still growing daily.
I came across multiple methods online but I'm not sure which of them work, or which is the best fit: append? Partition? Upsert? A workflow for automation? A recursive dataflow? Note that the 2023 and 2024 data are static, while the 2025 data may still receive updates, and I only need the latest version of each row. If possible, could we also detect whether anything changes in the 2023 and 2024 data and replace it accordingly? Please advise.
Answers
I'd recommend using partitions so that updated or removed data is taken into account. Just make sure the entire dataset for each partition is ingested on every run, otherwise you may lose data. If you have a specific key that identifies unique rows, an upsert may be an option as well. A rough illustration of the partition approach is sketched below.
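To make the partition idea concrete, here is a minimal sketch in Python with pyodbc, outside of Workbench itself. The DSN, schema, table, and date column names (`SAP_HANA`, `SAPSR3.SALES_DOCS`, `POSTING_DATE`) are placeholders, and the month-based partition key is an assumption; in Workbench you would apply the same filter in the job's query and map it to the dataset's partition column.

```python
import pyodbc
from datetime import date

PARTITION_COLUMN = "POSTING_DATE"  # hypothetical SAP date column used as the partition key


def fetch_partition(year: int, month: int, batch_size: int = 50_000):
    """Pull one month of rows so each load maps to exactly one Domo partition."""
    conn = pyodbc.connect("DSN=SAP_HANA;UID=user;PWD=secret")  # placeholder DSN/credentials
    sql = (
        "SELECT * FROM SAPSR3.SALES_DOCS "  # hypothetical schema and table
        f"WHERE YEAR({PARTITION_COLUMN}) = ? AND MONTH({PARTITION_COLUMN}) = ?"
    )
    cursor = conn.cursor()
    cursor.execute(sql, year, month)
    while True:
        batch = cursor.fetchmany(batch_size)  # stream in chunks rather than loading everything at once
        if not batch:
            break
        yield from batch
    conn.close()


# Re-running only the current month replaces that single partition in Domo,
# while the static 2023/2024 partitions stay untouched unless you re-run them.
today = date.today()
row_count = sum(1 for _ in fetch_partition(today.year, today.month))
print(f"{row_count} rows staged for the {today:%Y-%m} partition")
```

With this layout, the daily schedule only re-pulls the current month's partition, and if a reliable unique key exists, an upsert can instead keep just the latest version of each 2025 row. For the rare changes to 2023 or 2024, a less frequent scheduled job can re-run those older partitions, which replaces them wholesale if anything differs.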