How to overcome column length limits?
I have a dataset where one column is open text, typically over 2,000 characters long. When I load this dataset into Domo and try to parse that field with text functions (e.g., SUBSTRING, INSTR), the values in the field are truncated to a limit of about 1024 bytes/characters. When I export the data to CSV, the full data is there, but when I use DataFlows (Magic and MySQL) or Beast Modes, I am stuck with the 1024 limit again.
Any ideas on how to expose the full field? I know the VARCHAR type can support far more than 1024 bytes; I even tried an ALTER TABLE statement to set the column type to VARCHAR(10000), for example, but it returned the same truncated results as before.
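For reference, the statement I tried looked roughly like this (the table and column names are placeholders):

    -- Run as a transform step in the MySQL dataflow; names are hypothetical
    ALTER TABLE my_input_table MODIFY COLUMN long_text_col VARCHAR(10000);

The dataflow still emitted values capped at 1024 characters afterward.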
Comments
- Does anyone know how to help with this issue?
- I've run into this issue as well. Very long strings in a MySQL dataflow get truncated to 1024 characters. Can someone from Domo comment? Thank you!
- Can somebody help answer?
- Change the name of the column in Magic ETL first, then use that ETL output dataset as the input to your SQL dataflow. A rough sketch of the SQL side follows below.
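A minimal sketch of the SQL side of that workaround, assuming the Magic ETL output dataset is named etl_output and the renamed open-text column is long_text_renamed (both names are hypothetical):

    -- Placeholder names for the Magic ETL output dataset and renamed column
    SELECT
        SUBSTRING(long_text_renamed, 1, 100) AS first_100_chars,
        INSTR(long_text_renamed, 'keyword') AS keyword_position,
        CHAR_LENGTH(long_text_renamed) AS full_length  -- sanity check: should now exceed 1024
    FROM etl_output;

Presumably the rename forces Domo to re-derive the column's type in the new dataset, which is why the downstream MySQL dataflow then sees the full text.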