SFDC Connector - Delta/Incremental
Hello,
We use SFDC datasets, and we've always pulled the entire dataset. I've noticed there are options to do incremental/delta loads, and I'm curious how to get started with them. Ideally I'd like to pull only new or changed records on a nightly basis instead of full data loads.
What I'm looking for is some guidance on how to configure this properly. I tried reviewing the documentation, but I couldn't determine the right approach to the selections on the SFDC Dataset Connector. The goal is to:
1. Load history
2. Begin loading only the changes nightly (so both the past and the future are covered)
Thank you!
Best Answer
-
Hi Nick -- I see where you are going with this. One possible solution could be as follows:
- Import only the last day's data based on SystemModstamp.
- Compare the new data to the existing dataset in Domo.
- Based on the latest day's data and the dataset's unique identifier (OpportunityId, AccountId, etc.), recursively pull in the latest updates.
I believe this could work with a little testing. I'm leaving the Dojo Day session, but you can reach out to me with more specifics about the dataset if you would like.
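To make that concrete, here is a rough SQL sketch of the recursive pattern (a sketch only, assuming a MySQL dataflow; opportunity_delta and opportunity_history are hypothetical input names for the daily SystemModstamp pull and the existing Domo dataset, and Id is the unique identifier):

-- opportunity_delta:   hypothetical daily SFDC pull, filtered to the last day's SystemModstamp
-- opportunity_history: the existing full dataset already in Domo (the dataflow's own output)
-- Take every record from today's delta...
SELECT d.* FROM opportunity_delta d
UNION ALL
-- ...and keep the historical version only for records that were NOT updated today.
SELECT h.*
FROM opportunity_history h
LEFT JOIN opportunity_delta d
  ON h.Id = d.Id
WHERE d.Id IS NULL;

The output dataset is then fed back in as opportunity_history on the next run, which is what makes the dataflow recursive.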
I work for Domo.
**Say "Thanks" by clicking the thumbs up in the post that helped you.
**Please mark the post that solves your problem as "Accepted Solution"
Answers
-
Hi -- the SFDC connector is pretty robust, so on datasets of less than a few hundred thousand rows, a Replace update method should work well.
On larger datasets, you would need to run the data through a recursive dataflow and then deduplicate the results to find the latest updates. Instructions for SQL and Magic ETL recursive dataflows are below.
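As a rough illustration of that deduplication step (a sketch only, assuming a MySQL dataflow where opportunity_all is a hypothetical input holding all appended rows and SystemModstamp marks the last change to each record):

-- opportunity_all may contain several versions of the same Id after repeated appends.
-- Keep only the most recently modified version of each record.
SELECT o.*
FROM opportunity_all o
JOIN (
    SELECT Id, MAX(SystemModstamp) AS latest_stamp
    FROM opportunity_all
    GROUP BY Id
) latest
  ON o.Id = latest.Id
 AND o.SystemModstamp = latest.latest_stamp;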
I work for Domo.
**Say "Thanks" by clicking the thumbs up in the post that helped you.
**Please mark the post that solves your problem as "Accepted Solution"
-
PS ... Hi Nick. Just saw it was you.
I work for Domo.
**Say "Thanks" by clicking the thumbs up in the post that helped you.
**Please mark the post that solves your problem as "Accepted Solution"
-
Hi Jared,
The image below of our SFDC connector is what I was specifically referring to... unless that is meant mostly for strictly targeted data or one-off pulls. I was hoping I could leverage this functionality to load history and then only load changes; I just wasn't able to determine the best usage from the existing documentation.
Happy to connect offline as well if that makes more sense.
I noticed that when I use 'Browse Objects and Fields' for the 'Opportunity' object, I get the following options: