Moving files from SFTP site to S3

Hello everyone,

Has anyone used Domo to move files from SFTP to S3 while keeping the original file format (pipe-delimited CSV)? We receive a daily file from our vendor that we want to:

  1. Archive the daily file
  2. Load from stage/archive location to our Snowflake warehouse (future state)

We currently use another tool to do this but that will not be a solution in the near future. Hoping to use Domo to replicate the pattern.

Best Answer

  • ArborRose
    ArborRose Coach
    edited August 8 Answer ✓

    You could set up the SFTP connector or use Workbench to pull from the SFTP. Make sure the file is fetched as CSV. Then use Python or Domo's AWS S3 writeback connector to transfer it to S3. Use a dataflow to do any processing or transformations. For the archive, write a step that triggers the S3 writeback to upload to the desired archive location.
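    If you take the Python route, here is a rough sketch of the SFTP-to-S3 step. It streams the file byte-for-byte so the pipe-delimited format is preserved. The hostname, credentials, bucket, and paths are all placeholders, and it assumes the third-party `paramiko` and `boto3` packages are installed.

```python
from datetime import date


def archive_key(filename: str, d: date) -> str:
    """Build a dated archive path, e.g. archive/2025/08/08/daily.csv."""
    return f"archive/{d:%Y/%m/%d}/{filename}"


def sftp_to_s3(host, username, password, remote_path, bucket, key):
    """Stream the remote SFTP file straight into S3 without parsing or
    re-encoding it, so the original pipe-delimited CSV is kept as-is."""
    import boto3      # third-party; imported here so archive_key stays dependency-free
    import paramiko   # third-party

    transport = paramiko.Transport((host, 22))
    transport.connect(username=username, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        with sftp.open(remote_path, "rb") as remote_file:
            boto3.client("s3").upload_fileobj(remote_file, bucket, key)
    finally:
        sftp.close()
        transport.close()


if __name__ == "__main__":
    # Placeholder names -- substitute your real host, credentials, and bucket.
    key = archive_key("daily.csv", date.today())
    # sftp_to_s3("sftp.vendor.example", "user", "secret",
    #            "/outbound/daily.csv", "my-archive-bucket", key)
```

    Putting the date into the S3 key gives you the daily archive for free: each run lands the file under its own dated prefix instead of overwriting yesterday's copy.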

    And for Snowflake, I believe there is a Domo Snowflake connector. You can integrate using a new dataflow or workflow that reads the archived file from S3 and pushes it to Snowflake.
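    If the connector doesn't fit your future state, a minimal sketch of the S3-to-Snowflake load using `COPY INTO` over an external stage (the stage name, table name, and path below are assumptions, and it assumes the third-party `snowflake-connector-python` package):

```python
def copy_into_sql(table: str, stage_path: str, delimiter: str = "|") -> str:
    """Build a COPY INTO statement that loads a pipe-delimited file
    from an S3-backed external stage into a Snowflake table."""
    return (
        f"COPY INTO {table} FROM @{stage_path} "
        f"FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '{delimiter}' SKIP_HEADER = 1)"
    )


def load_archive_to_snowflake(conn_params: dict, table: str, stage_path: str):
    """Run the COPY INTO against Snowflake. conn_params holds account,
    user, password, warehouse, database, schema, etc."""
    import snowflake.connector  # third-party; imported here so copy_into_sql stays standalone

    conn = snowflake.connector.connect(**conn_params)
    try:
        conn.cursor().execute(copy_into_sql(table, stage_path))
    finally:
        conn.close()
```

    The external stage would be created once on the Snowflake side and pointed at the archive bucket (something like `CREATE STAGE archive_stage URL = 's3://my-archive-bucket/archive/'` plus your storage integration), after which each daily load is just a `COPY INTO` against that day's path.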

    ** Was this post helpful? Click Agree or Like below. **
    ** Did this solve your problem? Accept it as a solution! **
