Does anyone know the limitations of dataset size and performance impact?
I have been tasked with creating a daily snapshot of our QuickStart datasets. I have reached out to our consultants and Domo support, and they basically suggest making smaller snapshots. Coming from a MS SQL environment, there have been situations in which day-to-day transactions are needed to track year-over-year changes as of specific dates. On top of that, I can build a weekly/monthly snapshot if I have the daily one (see the sketch below). Which brings me to my question: has anyone run into a size limitation? If so, what was the size of the dataset at which performance or reports were impacted?
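For example, once the daily snapshot exists, I would expect to derive the weekly one with something like this (table and column names are just illustrative; `daily_snapshot` is assumed to hold one set of rows per `snapshot_date`):

```sql
-- Sketch only: keep the last snapshot of each week from a daily snapshot table.
-- `daily_snapshot` and its columns are placeholder names.
SELECT d.*
FROM daily_snapshot d
JOIN (
    SELECT YEARWEEK(snapshot_date, 3) AS yr_week,
           MAX(snapshot_date)         AS last_snapshot_of_week
    FROM daily_snapshot
    GROUP BY YEARWEEK(snapshot_date, 3)
) w
  ON d.snapshot_date = w.last_snapshot_of_week;
```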
Comments
Standard dataflows are based on MySQL, which performs reasonably well up to about 5 million rows. Beyond that size, we can switch to Redshift SQL (internal to Domo consultants), which handles larger datasets better; I've seen Redshift dataflows handle up to 50 million rows reasonably well. Magic ETL is built on a separate engine that is also optimized for very large datasets. If performance starts to be impacted, we have an ETL team here that can give some advice.
How many rows of data per day are you anticipating in the snapshot?
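For reference, the daily snapshot-and-append pattern in a standard MySQL dataflow usually looks something like the sketch below; the dataset and column names are placeholders, not anything from your instance:

```sql
-- Sketch only: stamp today's rows and append them to the accumulated history.
-- `quickstart_data` (live input) and `snapshot_history` (the prior output fed
-- back in as an input) are placeholder dataset names.
SELECT snapshot_date, account_id, balance
FROM snapshot_history
WHERE snapshot_date < CURDATE()

UNION ALL

SELECT CURDATE() AS snapshot_date, account_id, balance
FROM quickstart_data;
```

Depending on how the output dataset is configured, you may be able to append rather than replace the output instead, but the recursive UNION above is the most explicit version of the idea.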
I work for Domo.
**Say "Thanks" by clicking the thumbs up in the post that helped you.
**Please mark the post that solves your problem as "Accepted Solution"3