We're currently testing Domo's Snowflake Key Pair Authentication connector and are running into issues when querying wide tables from our Snowflake instance.
The table in question is ~12M rows and 400+ columns. Currently we import it into Domo with the Amazon S3 connector, which works but can be slow (2-6 hours per run).
The Key Pair connector itself is set up successfully, but when we query this table we get the error "Failed to execute import successfully." I trimmed the SELECT to include only ~250 columns but received the same error; a simplified version of the query is below.
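For reference, here's a sketch of the kind of query we're running (all table and column names below are placeholders, not our actual schema):

```sql
-- Simplified sketch of the trimmed query (placeholder names).
-- The real SELECT lists ~250 explicit columns out of the 400+ on the table.
SELECT
    col_001,
    col_002,
    -- ... roughly 250 columns in total ...
    col_250
FROM analytics_db.public.wide_fact_table;
```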
I was able to import successfully with 100 columns (in 18m 44s), so whatever limit we're hitting appears to fall somewhere between 100 and 250 columns. While a large # of columns isn't ideal (and we're actively trimming them down), this is the first time we've run into this issue.
Is there a known limit to the # of columns that can be imported via the Snowflake connector, or is there a better approach or configuration we should try for wide tables?
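In case it's useful context for an answer: one workaround we've considered (a sketch only, with the same placeholder names as above) is splitting the table vertically into slices that each stay under the ~100-column count that worked, importing each slice as its own DataSet, and re-joining them on a shared key inside Domo:

```sql
-- Hypothetical workaround (placeholder names): vertical slices of the
-- wide table, each narrow enough to import, all carrying the key column.

-- Import 1: key column plus the first batch of columns
SELECT id, col_001, col_002 /* ... through roughly col_099 ... */
FROM analytics_db.public.wide_fact_table;

-- Import 2: key column plus the next batch
SELECT id, col_100, col_101 /* ... through roughly col_199 ... */
FROM analytics_db.public.wide_fact_table;

-- Further slices as needed; the resulting DataSets would then be joined
-- back together on id in a Magic ETL or SQL dataflow in Domo.
```

We'd rather avoid this if a connector setting exists, since it multiplies the number of scheduled imports we have to maintain.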