I have AWS ELB log files stored in S3 that I want to load into Domo

mberkeley
Contributor

in Connectors
I have been unable to get the S3 Advanced connector to process the file, although it has successfully connected to the correct bucket in S3.
The logs are space-delimited text files that use double quotes around multi-word fields.
The files contain no header row. Since there are no headers, how do I name the columns in the dataset?
The files are stored in .gz format.
Below is how I have the connector set up:
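For reference, this format (gzipped, space-delimited, double-quoted fields, no header row) can be parsed outside of Domo with Python's standard `csv` and `gzip` modules. A minimal sketch is below; the column names follow AWS's documented classic ELB access-log layout, but `parse_elb_log` and the exact `COLUMNS` list are illustrative and should be adjusted to your log format:

```python
import csv
import gzip

# Column names per the classic ELB access log format (adjust as needed).
COLUMNS = [
    "timestamp", "elb", "client_port", "backend_port",
    "request_processing_time", "backend_processing_time",
    "response_processing_time", "elb_status_code", "backend_status_code",
    "received_bytes", "sent_bytes", "request", "user_agent",
    "ssl_cipher", "ssl_protocol",
]

def parse_elb_log(path):
    """Yield one dict per log line from a gzipped, space-delimited,
    double-quoted ELB access log with no header row."""
    with gzip.open(path, "rt", newline="") as f:
        reader = csv.reader(f, delimiter=" ", quotechar='"')
        for row in reader:
            # Supplying COLUMNS here stands in for the missing header row.
            yield dict(zip(COLUMNS, row))
```

Since the files themselves have no headers, the dataset's column names have to be supplied on the consuming side, as `COLUMNS` does here.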
Any help is appreciated.
Best Answer
-
self-solved: I removed the Escape Character setting and the data imported.