Amazon S3 Assumerole Advanced v2 Issues
We pull data from S3 using the S3 Assumerole Advanced v2 connector, and I'm having problems with it. The connector offers three update methods: Replace, Append, and Merge. We're bringing in employee data, and here are the problems with each method:
- I would prefer to use Merge, but when I select it, the connector tries to load values into the Merge Key Location field and times out every time.
- Our current connector uses Append, but that causes the dataset to balloon: we've had fewer than 100K employees total over the years, yet the dataset is now almost 4 million records.
- Replace looked good at first: in a test connector, the first run pulled in the correct number of records. However, every subsequent run pulled in only new or changed records and replaced the full dataset with just that subset.
If anyone has experience with this connector, do you think it's buggy, or might we have incorrect settings on the S3 account? Any suggestions for alternate connectors to get the data out of S3?
Best Answer
-
You could run the S3 dataset with the Replace update method and then feed it into a Magic ETL dataflow that outputs to a second dataset. On that output dataset you can set the update method to Partition and define your partition key, which would get around the merge issue you're running into.
As for the merge issue itself, how many merge keys are you attempting to load? You may need to reach out to Domo Support about the issue.
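Magic ETL's Partition output is configured in the UI rather than written as code, but conceptually each run behaves like the SQL sketch below. The names here (employee_history for the partitioned output, snapshot for the incoming Replace data, snapshot_date for the partition key) are hypothetical placeholders, not anything from the connector itself:

```sql
-- Rough SQL equivalent of a Partition update (hypothetical names):
-- only the partitions present in the incoming batch are replaced;
-- every other historical partition is left untouched.
DELETE FROM employee_history
WHERE snapshot_date IN (SELECT DISTINCT snapshot_date FROM snapshot);

INSERT INTO employee_history
SELECT * FROM snapshot;
```

That's why Replace is safe on the source dataset in this setup: the partitioned output preserves history even though each connector run overwrites the source.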
Answers
-
Thank you for your response. I will either write a recursive ETL or see whether the MySQL engine supports an upsert (INSERT INTO with ON DUPLICATE KEY UPDATE, sketched below) just so I can have a working dataset. I won't be able to use partitioning, however, as I'd want to partition on EmployeeID, and we have far more employees than the maximum number of partitions (1,500).
As for the merge keys, I assume it's trying to load the column names, and there are only 54 of them. That shouldn't be a problem, so I'll put in a ticket. Thanks again!
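Roughly what I have in mind for the upsert, with hypothetical table names (employee_staging as the Replace-loaded input, employee_master as the maintained output) and just a few of the 54 columns shown:

```sql
-- Upsert sketch (hypothetical names). Requires a PRIMARY KEY or UNIQUE
-- index on employee_master.EmployeeID; that's what triggers the
-- ON DUPLICATE KEY UPDATE branch instead of inserting a duplicate row.
INSERT INTO employee_master (EmployeeID, FirstName, LastName, Department)
SELECT EmployeeID, FirstName, LastName, Department
FROM employee_staging
ON DUPLICATE KEY UPDATE
    FirstName  = VALUES(FirstName),
    LastName   = VALUES(LastName),
    Department = VALUES(Department);
```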