Snowflake Connector Column Limit?

We're currently testing our Snowflake-Domo (key pair) connector and running into issues when querying wide tables from our Snowflake instance.

The table in question is ~12M rows and 400+ columns. We currently import it into Domo with the Amazon S3 connector, which works, though it can be slow (2-6 hours) to complete.

We've set up the Snowflake Key-Pair connector successfully, but when we query this table we get the error "Failed to execute import successfully." I trimmed the SELECT down to ~250 columns but received the same error.

I was able to successfully import with 100 columns (in 18m 44s), so the column limit appears to fall somewhere between 100 and 250. While a large number of columns isn't ideal (and we're actively trimming them down), this is the first time we've run into this issue.

Is there a known limit to the number of columns that can be imported via the Snowflake connector, or is there a better solution or configuration we should try for wide tables?
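
For anyone bisecting the limit the same way, the column subsets can be generated from Snowflake's metadata rather than hand-editing the SELECT list. This is only a rough sketch; the database/schema/table names (ANALYTICS, PUBLIC, WIDE_TABLE) and the 150-column cutoff are placeholders:

```sql
-- Check how wide the table actually is (placeholder names throughout).
SELECT COUNT(*) AS column_count
FROM ANALYTICS.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'PUBLIC'
  AND TABLE_NAME   = 'WIDE_TABLE';

-- Build a comma-separated list of the first 150 columns (by ordinal position)
-- that can be pasted into the connector's SELECT statement.
SELECT LISTAGG(COLUMN_NAME, ', ') WITHIN GROUP (ORDER BY ORDINAL_POSITION) AS select_list
FROM ANALYTICS.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'PUBLIC'
  AND TABLE_NAME   = 'WIDE_TABLE'
  AND ORDINAL_POSITION <= 150;
```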

Answers

  • Tried 150 columns and it successfully completed after 32m 18s.

    Tried 200 columns and it failed after 47m 26s.

    Sounds like the sweet spot is somewhere in between.

  • Can you break it up into two datasets and then join it back together in Magic ETL? (One way to define the two halves on the Snowflake side is sketched below.)

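    One way to set that up on the Snowflake side would be two narrower views that share a key column; each view gets its own connector dataset, and the halves are rejoined on the key in Magic ETL. A rough sketch only; the view, table, key, and column names are all placeholders:

    ```sql
    -- Split the wide table into two narrower views sharing a key column
    -- (all names here are placeholders). Each view becomes its own Domo
    -- dataset, then the two are rejoined on ROW_ID in Magic ETL.
    CREATE OR REPLACE VIEW WIDE_TABLE_PART1 AS
    SELECT ROW_ID, COL_001, COL_002 /* ... first ~150 columns ... */
    FROM WIDE_TABLE;

    CREATE OR REPLACE VIEW WIDE_TABLE_PART2 AS
    SELECT ROW_ID, COL_151, COL_152 /* ... remaining columns ... */
    FROM WIDE_TABLE;
    ```
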
  • That's an option we'd explore as a last resort, though I'd be worried about runtime/job success when joining 150+150+100-column tables at 12M rows apiece.

    Short term, we're trying to pin down the column limit and go from there. We may end up breaking these into use-case-specific datasets from a single source anyway.