-
Problem with column length limit
Hello. I'm using an Athena Connector to retrieve some data from Redshift, and one of the columns I'm receiving is an array<struct> that I would need to import as-is into Domo. However, I see that this gets converted to a string (and I'm fine with that), but then the string gets truncated after 1024 characters, and this is a…
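One possible workaround for a hard character limit like this (not from the thread, and assuming the serialized value can be pre-processed before it reaches the connector) is to split the long string across several companion columns of at most 1024 characters each and reassemble it after import. Everything here besides the 1024 limit mentioned in the question is an assumption for illustration:

```python
# Hedged sketch: split one over-long serialized value into <=1024-char chunks
# (one per staging column), then reassemble after import. The limit mirrors
# the truncation described in the question; names are hypothetical.

LIMIT = 1024  # observed truncation length

def split_long_value(value: str, limit: int = LIMIT) -> list:
    """Break one long string into limit-sized chunks (last may be shorter)."""
    return [value[i:i + limit] for i in range(0, len(value), limit)] or [""]

def reassemble(chunks: list) -> str:
    """Inverse operation, e.g. done later in a dataflow."""
    return "".join(chunks)

serialized = "x" * 2500          # stand-in for the stringified array<struct>
parts = split_long_value(serialized)
restored = reassemble(parts)
```

Each chunk fits under the limit, and concatenating the chunks restores the original value exactly.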
-
Connect to SFTP - Select ALL files in root directory
Hello, I'm here to ask whether it is possible with the built-in connectors to connect to an SFTP server and get the data from ALL .csv files in the root directory (assuming all files have the same schema). This is because I'm receiving new files on a daily/weekly basis and I would definitely want to…
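Independent of whatever the connector supports, the merge step described here is straightforward: keep one copy of the shared header and append the data rows of every `.csv` file. The SFTP fetch itself is out of scope below (it could be done with, e.g., paramiko's `SFTPClient`); this is a stdlib-only sketch of the schema-preserving merge, with made-up filenames:

```python
# Hedged sketch: combine every .csv in a directory listing into one CSV,
# assuming all files share the same header (as stated in the question).
import csv
import io

def merge_csv_files(files: dict) -> str:
    """files maps filename -> CSV text; returns one combined CSV."""
    out = io.StringIO()
    writer = None
    for name in sorted(files):          # deterministic order
        if not name.endswith(".csv"):
            continue                    # skip non-CSV entries in the root dir
        reader = csv.reader(io.StringIO(files[name]))
        header = next(reader)
        if writer is None:
            writer = csv.writer(out, lineterminator="\n")
            writer.writerow(header)     # write the shared header only once
        for row in reader:
            writer.writerow(row)
    return out.getvalue()

daily = {
    "2024-01-01.csv": "id,value\n1,a\n2,b\n",
    "2024-01-02.csv": "id,value\n3,c\n",
    "readme.txt": "not a csv",
}
combined = merge_csv_files(daily)
```

New daily/weekly files simply become new entries in the listing; rerunning the merge picks them up automatically.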
-
Programmatically run a dataflow/dataset
Hello, I would like your help in order to understand how to run a dataset (or a dataflow) based on a trigger outside Domo. Let's say, for instance, I have a Python script that checks a condition on my database and, if the condition is met, it should trigger the execution of two MySQL Connectors in Domo. Would this be possible…
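The control flow the question describes can be sketched as: evaluate the condition, then fire one callback per connector run. The callbacks would wrap whatever actually starts the run (for example, an authenticated call to Domo's API); `run` below is a stand-in, not a real Domo client, and the connector names are hypothetical:

```python
# Hedged sketch of an external trigger: when the condition holds, invoke one
# registered action per connector run and report which ones were started.
from typing import Callable

def trigger_if(condition: Callable, runs: dict) -> list:
    """Run every registered action when the condition holds; return what ran."""
    if not condition():
        return []
    started = []
    for name, run in runs.items():
        run()                   # e.g. the API call that kicks off this run
        started.append(name)
    return started

log = []
runs = {
    "mysql_connector_a": lambda: log.append("a"),  # hypothetical names
    "mysql_connector_b": lambda: log.append("b"),
}
started = trigger_if(lambda: True, runs)   # condition met -> both runs fire
skipped = trigger_if(lambda: False, runs)  # condition not met -> nothing runs
```

Keeping the "start a run" action behind a callable makes the trigger logic testable without touching the real service.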
-
Import a huge amount of data from MySQL - is it possible to partition a datasource?
Good morning, I'm writing this because it seems I cannot properly solve my problem on my own and I need your help. So, I have a huge table in MySQL (50 million rows, 100 columns) and we have been importing it entirely into Domo every day to have all the new data. Of course, this is getting slower and slower and we cannot manage…
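One common way to partition an extract like this (a general technique, not something stated in the thread) is to slice the table by an integer primary key into fixed-size ranges and import each range as its own query; combined with a watermark, only ranges touched since the last run need re-reading. The table and column names below are assumptions; the range arithmetic is the point:

```python
# Hedged sketch: generate per-range SELECTs over a hypothetical integer
# primary key, so a 50M-row table can be pulled in bounded chunks.

def key_ranges(min_id: int, max_id: int, chunk: int):
    """Yield inclusive (lo, hi) id ranges covering [min_id, max_id]."""
    lo = min_id
    while lo <= max_id:
        hi = min(lo + chunk - 1, max_id)
        yield lo, hi
        lo = hi + 1

def partition_queries(min_id: int, max_id: int, chunk: int) -> list:
    """One SELECT per range; table/column names are made up for illustration."""
    return [
        f"SELECT * FROM big_table WHERE id BETWEEN {lo} AND {hi}"
        for lo, hi in key_ranges(min_id, max_id, chunk)
    ]

queries = partition_queries(1, 25, 10)   # three ranges: 1-10, 11-20, 21-25
```

Each chunk runs in bounded time and memory, and failed chunks can be retried individually instead of restarting the whole import.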
-
S3 Connector - Copy Command
Hello, I'm here to ask if it is somehow possible to solve the following scenario. In an external S3 bucket I have hundreds of files for a single table (one file per day, which is how we partition the table). The first question I want to ask is: is it possible to import this table by performing something similar to…
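Whatever the connector offers, the bookkeeping behind a COPY-like incremental load over the layout described here (one file per day) can be sketched independently: list the keys, keep only those newer than the last day already loaded, and process them oldest first. Listing and downloading would be done with an S3 client (e.g. boto3's `list_objects_v2`); the key pattern `table/dt=YYYY-MM-DD/part-0.csv` below is an assumption for illustration:

```python
# Hedged sketch: pick only the daily partition files newer than a watermark,
# given hypothetical S3-style keys with one file per day.
from datetime import date

def day_of(key: str) -> date:
    """Extract the partition date from a key like table/dt=2024-01-02/part-0.csv."""
    part = key.split("/dt=")[1].split("/")[0]
    return date.fromisoformat(part)

def keys_to_load(keys: list, last_loaded: date) -> list:
    """Keys strictly newer than the watermark, oldest first."""
    fresh = [k for k in keys if day_of(k) > last_loaded]
    return sorted(fresh, key=day_of)

keys = [
    "table/dt=2024-01-01/part-0.csv",
    "table/dt=2024-01-03/part-0.csv",
    "table/dt=2024-01-02/part-0.csv",
]
todo = keys_to_load(keys, date(2024, 1, 1))  # everything after Jan 1
```

After a successful run, the watermark advances to the newest day loaded, so the next run only touches new files instead of all hundreds.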