Does anyone know the limits on dataset size and their performance impact?
I have been tasked with creating a daily snapshot of our QuickStart datasets. I have reached out to our consultants and Domo support, and they basically suggest making smaller snapshots. Coming from an MS SQL environment, I've run into situations where day-to-day transactions are needed to track year-over-year changes as of specific dates. On top of that, I can build a weekly/monthly snapshot if I have the daily one. Which brings me to my question: has anyone run into a size limitation? If so, at what dataset size were performance or reports impacted?
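For reference, here's roughly the pattern I have in mind: a dataflow that stamps each day's rows with the run date and appends them to an accumulating snapshot table. The table and column names below are just placeholders.

```sql
-- Hypothetical daily-snapshot step (MySQL syntax, as in a standard
-- Domo SQL dataflow). Assumes a source table `orders` and a target
-- table `order_snapshots` with a snapshot_date column.
INSERT INTO order_snapshots (snapshot_date, order_id, status, amount)
SELECT CURRENT_DATE, order_id, status, amount
FROM orders;
```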
Comments
Standard dataflows are based on MySQL, which performs reasonably well up to about 5 million rows. Beyond that size, we can switch to Redshift SQL (internal to Domo consultants), which handles larger datasets better; I've seen Redshift dataflows handle up to 50 million rows reasonably well. Magic ETL is built on a separate engine that is also optimized for very large datasets. If performance starts to be impacted, we have an ETL team here that can give advice.
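If the daily table does grow past those thresholds, one option is to roll it up, e.g. keep only the last captured day of each month. A rough sketch, assuming a daily snapshot table with a snapshot_date column like the one sketched above (names are placeholders):

```sql
-- Derive a month-end snapshot from the daily one by keeping only the
-- last captured day of each month, cutting row counts roughly 30x.
SELECT s.*
FROM order_snapshots s
JOIN (
    SELECT MAX(snapshot_date) AS month_end
    FROM order_snapshots
    GROUP BY YEAR(snapshot_date), MONTH(snapshot_date)
) m
  ON s.snapshot_date = m.month_end;
```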
How many rows of data per day are you anticipating in the snapshot?
I work for Domo.
**Say "Thanks" by clicking the thumbs up in the post that helped you.
**Please mark the post that solves your problem as "Accepted Solution"3