S3 Connector - Copy Command

Hello,
I'd like to ask whether it is possible to solve the following scenario.
In an external S3 bucket I have hundreds of files for a single table (one file per day, which is how we partition the table).
My first question: is it possible to import this table with something similar to a COPY command? I have been trying the S3 Advanced connectors, but that does not seem to be how they work, because the load takes a very long time, as if the connector were parsing the JSON rather than simply moving the files.
My second question builds on the first: every day we update the table by deleting the last 7 files (that is, the last 7 days) and adding 7 new files containing updated data. Would it be possible to replicate this behavior in Domo, either by deleting all the files and loading them back into Domo, or by doing this only for the last 7 files?
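For context, this is roughly what our daily S3-side update looks like. It is only a minimal sketch using boto3, with a hypothetical bucket name and a hypothetical `my_table/YYYY-MM-DD.json` key layout; the question is how to mirror this delete-and-replace step on the Domo side.

```python
from datetime import date, timedelta

import boto3

# Hypothetical bucket and key layout; adjust to your actual partitioning scheme.
BUCKET = "my-external-bucket"
PREFIX = "my_table/"

s3 = boto3.client("s3")

def replace_last_week(local_dir: str) -> None:
    """Delete the last 7 daily partition files and upload the refreshed versions."""
    for offset in range(7):
        day = date.today() - timedelta(days=offset)
        key = f"{PREFIX}{day.isoformat()}.json"
        # Remove the stale daily file (delete_object does not fail if it is already gone).
        s3.delete_object(Bucket=BUCKET, Key=key)
        # Upload the regenerated file for that day.
        s3.upload_file(f"{local_dir}/{day.isoformat()}.json", BUCKET, key)

if __name__ == "__main__":
    replace_last_week("./exports")
```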
Thanks in advance