Archive

How to overcome column length limits?

I have a dataset where one column is open text, with values typically over 2000 characters. When I load this dataset into Domo and try to parse that field with text functions (e.g., SUBSTRING, INSTR), I find that the values in this field are truncated to a limit of about 1024 bytes/characters. When I export the data to CSV, the full text is there, but when I use DataFlows (Magic and MySQL) or Beast Modes, I am stuck with the 1024 limit again.
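
A quick way to confirm the truncation is to check the longest value Domo actually stores. This is a minimal check, with placeholder table and column names:

    SELECT MAX(CHAR_LENGTH(long_text_col)) AS max_chars
    FROM my_dataset;
    -- Comes back at about 1024, even though the CSV export shows values over 2000 characters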

Any ideas on how to expose the full text? I know the VARCHAR type can support much more than 1024 bytes; I even tried an ALTER TABLE statement to set the column type to VARCHAR(10000), for example, but it still returned the same results as before.
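
For reference, this is roughly the statement I tried (again with placeholder names); it ran, but the values came back truncated just the same:

    ALTER TABLE my_dataset MODIFY COLUMN long_text_col VARCHAR(10000);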

Comments

  • Does anyone know how to help with this issue?

  • I've run into this issue as well. Very long strings in a MySQL dataflow get truncated to 1024 characters. Can someone from Domo comment?

    Thank you!

  • Can somebody help answer this?

  • Change the name of the column in Magic ETL first, then use that ETL dataset as the input for your SQL dataflow (see the sketch below).
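
    A minimal sketch of that workaround, assuming the Magic ETL output is a dataset named etl_dataset and the renamed column is full_text (both names are placeholders):

        SELECT
            id,
            full_text,
            SUBSTRING(full_text, 1, 100) AS preview  -- text functions should now see the full value
        FROM etl_dataset;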
