How to overcome column length limits?

I have a dataset where one column is open text, typically more than 2,000 characters long. When I load this dataset into Domo and try to parse that field with text functions (e.g., SUBSTRING, INSTR), the values in the field are truncated to a limit of about 1024 bytes/characters. When I export the data to CSV, the full text is there, but when I use DataFlows (Magic and MySQL) or Beast Modes, I am stuck with the 1024 limit again.
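A quick way to confirm where the truncation happens is to measure the longest value at each stage; in a MySQL DataFlow transform that check looks roughly like this (the table and column names are placeholders for my own):

    -- Longest value the DataFlow actually sees; tops out near 1024
    SELECT MAX(CHAR_LENGTH(open_text)) AS max_chars
    FROM my_input_dataset;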

Any ideas on how to expose the full values? I know the VARCHAR type can hold far more than 1024 bytes; I even tried an ALTER TABLE statement to set the column type to VARCHAR(10000), for example, but it returned the same results as before.
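For reference, the statement I tried looked roughly like this (again with placeholder names):

    -- Attempt to widen the column inside a MySQL DataFlow transform;
    -- the values still arrive truncated, so this had no effect.
    ALTER TABLE my_input_dataset
        MODIFY COLUMN open_text VARCHAR(10000);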

Comments

  • KaLin (Member)

    Does anyone know how to help on this issue?

  • I've run into this issue as well. Very long strings in a MySQL DataFlow get truncated to 1024 characters. Can someone from Domo comment?

    Thank you!

  • Can somebody help answer this?

  • jsr (Member)

    Change the name of the column in Magic ETL first, then use that Magic ETL output dataset as the input to your SQL DataFlow. A rough sketch is below.
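    For example, assuming the Magic ETL output is saved as cleaned_input and the long text column was renamed to open_text_full (both names are illustrative, not anything Domo requires), the SQL DataFlow transform would read it like this:

        -- Read the Magic ETL output in a MySQL DataFlow transform;
        -- per this workaround, the renamed column carries the full string.
        SELECT
            open_text_full,
            CHAR_LENGTH(open_text_full) AS full_len  -- sanity check: should exceed 1024
        FROM cleaned_input;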