Cleaning a date column that contains text

Hoping someone can help me with this. I want to ingest CSV files from our S3 bucket where the first column is a date column. That column also has text mixed in with the dates.

I'm also getting the error "Failed to parse text '2/1/23' from column 'Date' as type 'Date'" from one of my files. I've tried fixing the file itself, but it's not working.

I tried writing a Beast Mode to clean this data, but I'm not sure how to take care of both issues at once.

Thanks!

Best Answer

  • MarkSnodgrass
    Answer ✓

    One of the easiest ways to do this in Magic ETL is to click on your input dataset and open the data handling section. There you can select the data type you want to convert the column to. Be sure to click the gear icon next to it and, in the Bad Values drop-down, select Null. This instructs the ETL to convert the data to a date field when it can and replace bad values with null.

    Another option is to use the Formula tile with TRY_CAST(yourfield AS DATE), which will do the same thing; see the sketch below.
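    If you want a single expression that handles both problems at once, here is a minimal sketch. It assumes your column is literally named `Date`, that STR_TO_DATE and COALESCE are available in the Formula tile, and that STR_TO_DATE (like its MySQL counterpart) returns NULL on unparseable input, which is worth verifying in your instance:

    ```sql
    -- Free-text rows fail the cast and become NULL instead of failing the flow.
    TRY_CAST(`Date` AS DATE)

    -- '2/1/23' is month/day/two-digit-year; parse that layout explicitly with
    -- the MySQL-style tokens %m, %d, and %y.
    STR_TO_DATE(`Date`, '%m/%d/%y')

    -- Combined: standard dates survive the cast, M/D/YY dates go through
    -- STR_TO_DATE, and anything else ends up NULL.
    COALESCE(TRY_CAST(`Date` AS DATE), STR_TO_DATE(`Date`, '%m/%d/%y'))
    ```

    The COALESCE order matters: the plain cast runs first, so unambiguous dates are never re-interpreted by the M/D/YY pattern.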

    **Check out my Domo Tips & Tricks Videos

    **Make sure to <3 any user's posts that helped you.
    **Please mark as accepted the answers that solved your issue.

Answers

  • MarkSnodgrass

    Here is a video that walks you through it in more detail.
