Loading from S3 into Workbench
Hey all,
Does anyone have experience connecting Workbench to S3? I'm having trouble figuring out where to put the credentials, etc., so I can pull the files.
Any thoughts/suggestions?
Best Answer
That explains why I was confused; I thought you might have something custom I wasn't aware of for how it worked. Thanks for the update.
Answers
I haven't attempted this before but just created an S3 bucket to test and got it working without much trouble.
I created a bucket called test-bucket18, put the file in a folder called Domo, and named it Test_Domo_S3.csv.
When creating the connection you have to use the keys you get when setting up a user (access + secret). Here are the settings in Domo; a quick sanity-check sketch follows the list.
Credentials:
ACCESS KEY: QEWRWGWLKGLWDKFGLKWD
SECRET KEY: ASLKDA:KSDLASKDLASKD
BUCKET: test-bucket18
Details:
S3 BUCKET REGION: This one is a little odd because S3 doesn't require a specific region. us-east-2 didn't seem to work; it complained about the "How would you like to choose your filename" setting, so I switched it to us-east-1 and it was fine.
WHAT FILE TYPE WOULD YOU LIKE TO IMPORT: CSV
HOW WOULD YOU LIKE TO CHOOSE YOUR FILENAME? CompleteFileName
ENTER COMPLETE FILEPATH: Domo/Test_Domo_S3.csv
FILE COMPRESSION TYPE: None
ZIP FILE ENCODING: Default, UTF-8, though it's N/A since the file isn't compressed.
ARE HEADERS PRESENT IN CSV FILE: Yes
SELECT DELIMITING CHARACTER: Comma (,)
QUOTE CHARACTER: Default, Double Quote (")
ESCAPE CHARACTER: Default, No Escape character
ADD FILENAME COLUMN: Default, No
FILE ENCODING: Default, UTF-8
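If it helps while debugging, here's a rough boto3 sketch (separate from the connector itself, and assuming you have boto3 installed and your own keys on hand) to confirm the bucket region and the exact file path before typing the same values into Domo:

```python
# Sanity-check sketch (not part of the connector): confirm the bucket region
# and the exact file path before copying the same values into Domo.
# Assumes boto3 is installed; replace the placeholder keys with your own pair.
import boto3

ACCESS_KEY = "YOUR_ACCESS_KEY"        # same access key you give the connector
SECRET_KEY = "YOUR_SECRET_KEY"        # same secret key you give the connector
BUCKET = "test-bucket18"
FILE_PATH = "Domo/Test_Domo_S3.csv"   # matches ENTER COMPLETE FILEPATH above

s3 = boto3.client(
    "s3",
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
)

# The region the bucket actually lives in (None means us-east-1).
region = s3.get_bucket_location(Bucket=BUCKET)["LocationConstraint"]
print("Bucket region:", region or "us-east-1")

# Confirm an object exists at the exact path the connector will request.
head = s3.head_object(Bucket=BUCKET, Key=FILE_PATH)
print("Found file, size in bytes:", head["ContentLength"])
```

Note that get_bucket_location returns None for buckets in us-east-1, which lines up with the region quirk mentioned above.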
Hey GuitarHero
Thanks! It looks like you're using the S3 connector in the platform rather than the Workbench tool, is that correct?
I've been able to use the S3 connector in the platform; I was asking more specifically about connecting via Workbench.
Clearly it's a Friday and my brain decided to skip over words, my bad. I'll see if I can do a test with Workbench.
Can you give a brief explanation of your use case? I feel like I'm missing something with what you're trying to do and why.
I'm trying to explore using Workbench to pull S3 files because of certain limitations of the S3 connector when applied to my use case.
The limitation I've found in the S3 connector is that when you're not pulling a specific named file, you're limited to a partial file name, and it pulls in whichever file matching that partial name is most recent. The way I'm receiving these logs is a copy process from another bucket, and there isn't a good way for me to anticipate:
1. The number of files that will be generated (the files are segmented into 2MM-row chunks, so depending on traffic there could be more or fewer on a given day).
2. Which files will be copied most recently.
Additionally, the files live in a folder named dt=whatever the date may be, and I can't dynamically define which day the platform S3 connector should pick up. I was hoping to explore a way around that with a query in Workbench that lets me define constraints beyond the file name.
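To make that concrete, here's a rough Python/boto3 sketch of the behavior I'm after: pull every segment file under a chosen day's dt= folder rather than matching a single partial file name. The bucket name and prefix layout here are hypothetical, and the keys are placeholders.

```python
# Hypothetical sketch of the behavior I'd want: grab every segment file under
# a given day's dt= partition instead of matching one partial file name.
# Bucket name and prefix layout are made up for illustration; keys are placeholders.
import boto3
from datetime import date

ACCESS_KEY = "YOUR_ACCESS_KEY"
SECRET_KEY = "YOUR_SECRET_KEY"
BUCKET = "my-log-bucket"                       # hypothetical bucket
PREFIX = f"logs/dt={date.today():%Y-%m-%d}/"   # hypothetical dt= folder layout

s3 = boto3.client(
    "s3",
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
)

# Page through everything under the chosen day's partition; the number of
# 2MM-row segment files varies day to day, so no fixed filename is assumed.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if key.endswith("/"):          # skip the folder placeholder object
            continue
        local_name = key.rsplit("/", 1)[-1]
        print("Downloading", key)
        s3.download_file(BUCKET, key, local_name)
```

That prefix-based constraint is the kind of thing I was hoping to define in Workbench.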
I've been told by my AM that Workbench isn't intended for configuration with cloud services, more for local files, etc. Good to know. Thanks for the replies.