Unable to load this DataSet
Comments
I did get this once when I deleted a DataSet that was created from an upstream Magic ETL and then created a similar, but different, one with the same name. Ultimately I had to recreate the downstream Magic ETL and delete the one that could not load the DataSet. I think I corrupted it a bit by making the changes and then deleting it. This doesn't solve your problem directly, but recreating it may be the fastest way to fix it.
** Please like responses by clicking on the thumbs up.
** Please accept / check the answer that solved your problem or answered your question.
Thanks for following up! I think our issues were different. Recreating the DataSet didn't help me, but I ended up figuring it out: the issue was that the DataSet name was too long. That's something Domo should consider. Rather than just saying the DataSet couldn't be loaded, it would be helpful to know that it's because the name is too long.
Thanks for the help!
Tomer
Yes, agreed. I just encountered this problem as well and all it told me was "Unable to load this dataset". Like you said, it would be helpful to know why. Thankfully I found your suggestion, but had I not, I would have been very frustrated as to why my dataset was not loading.
Hi,
I am also facing the same error, "Unable to load this DataSet", when trying to write SQL. It's not because of the name length; I tried shortening the name, but it still isn't working. On the other hand, another DataSet with a longer name is working. Can anyone help me with this?
Thanks,
Lakshmi
I hope this helps -
https://knowledge.domo.com/?cid=troubleshooting
I am getting an "Unable to load this DataSet" error when attempting to load a DataSet into a MySQL DataFlow
This error is usually caused by one of the following:
- You may have one or more column header names that exceed the 64-character limit.
- You may have row-level data in one or more columns that exceeds the 1024-character limit.
If the problem is in the column header names, you should be able to see this by previewing the DataSet.
To find out if you have row level data that exceeds the limit, you will have to do more research. Look at the data previews and look to see if any columns appear wide (as if there were sentences in the rows).
You can also pull the DataSet into a Redshift DataFlow and use the LEN function to find the length of the data, or the MAX function to find the largest character count for each column (a sample query is below). For more information, see Creating an SQL DataFlow.
Note: Magic ETL does not have the header and row limits that MySQL and Redshift do. Cards will load columns with more than 1024 characters but will automatically truncate the data in a given row to that limit.
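For a concrete starting point, here is a minimal sketch of a transform you could drop into a Redshift DataFlow to check for the 1024-character row limit. The DataSet and column names ("my_input_dataset", "notes", "description") are placeholders; swap in your own.

    -- Find the longest value in each suspect text column.
    -- Anything over 1024 characters will break a MySQL or Redshift DataFlow.
    SELECT
        MAX(LEN("notes"))       AS max_len_notes,
        MAX(LEN("description")) AS max_len_description
    FROM "my_input_dataset";

Whichever max comes back above 1024 points to the column you need to trim or split before the DataFlow will load.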
Thank you. Unfortunately, no. I have another DataSet that is a superset of the problematic one; the bigger DataSet isn't causing issues, but the subset is. If there is anything else I can check, please share.