Unable to load this DataSet
Comments
-
I did get this once when I deleted a DataSet that was created from an upstream Magic ETL and then created a similar, but different, one with the same name. Ultimately I had to recreate the downstream Magic ETL and delete the one that could not load the DataSet. I think I corrupted it a bit by making the changes and deleting it. That doesn't solve your problem, but recreating it may be the fastest way to fix it.
Dojo Community Member
** Please like responses by clicking on the thumbs up.
** Please accept / check the answer that solved your problem / answered your question.
-
Thanks for following up! I think our issues were different. Recreating the dataset didn't help me, but I ended up figuring it out: the issue was that the dataset name was too long. That's something Domo should consider; rather than just saying the DataSet couldn't be loaded, it would be helpful to know that it's because the name is too long.
Thanks for the help!
Tomer
-
Yes, agreed. I just encountered this problem as well, and all it told me was "Unable to load this dataset". Like you said, it would be helpful to know why. Thankfully I found your suggestion, but had I not, I would have been very frustrated trying to figure out why my dataset was not loading.
-
Hi,
I am also facing the same error, "Unable to load this DataSet", when trying to write SQL. It is not because of the name length; I tried shortening the name, but it still is not working. On the other hand, another dataset with a longer name is working. Can anyone help me with this?
Thanks,
Lakshmi
-
I hope this helps -
https://knowledge.domo.com/?cid=troubleshooting
I am getting an "Unable to load this DataSet" error when attempting to load a DataSet into a MySQL DataFlow
This error is usually caused by one of the following:
You may have one or more column header names that exceed the 64-character limit.
You may have row-level data in one or more columns that exceeds the 1024-character limit.
If the problem is in the column header names, you should be able to see this by previewing the DataSet.
To find out if you have row-level data that exceeds the limit, you will have to do more research. Look at the data previews to see if any columns appear wide (as if there were sentences in the rows).
You can also pull the DataSet into a Redshift DataFlow, then use the LEN function to find the length of the data or the MAX function to find the largest character count in each column. For more information, see Creating an SQL DataFlow.
Note: Magic ETL does not have the header and row limits that MySQL and Redshift do. Cards will load columns with more than 1024 characters but will automatically truncate the data in a given row to the limit.
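Building on the Redshift suggestion above, here is a minimal sketch of that length check, assuming an input DataSet exposed to the DataFlow as "input_dataset" with a text column named "comments" (both names are placeholders; substitute your own):

    -- Largest value length found in the column (placeholder names).
    SELECT MAX(LEN("comments")) AS max_comments_length
    FROM "input_dataset";

    -- Rows whose value exceeds the 1024-character MySQL DataFlow limit.
    SELECT *
    FROM "input_dataset"
    WHERE LEN("comments") > 1024;

If the first query returns a value above 1024, the second lists the offending rows so you can decide whether to truncate or split that column before using the DataSet in a MySQL DataFlow.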
-
Thank you. Unfortunately, no. I have another dataset which is a superset of the problematic dataset; the bigger dataset isn't causing issues, but the subset is. If there is anything else I can check, please share.