New Netsuite connector - Merge Not Working
Finally:
The NetSuite connector now has a 'Merge' (UPSERT) option for records from a saved search, and you get to identify the primary key of the record (to check whether it already exists).
I couldn't find any information about the 'Remove Duplicates' feature (i.e., what does Domo check to decide a record is a duplicate?). For example, I had a bad cost on an invoice; I fixed it, and since that changed the last modified date, the record was going to be uploaded again. Is that going to replace the "bad" invoice in Domo?
So I thought 'Merge' was going to save the day by updating the bad record with the new "fixed" one (same unique key field).
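In SQL terms, what I expected Merge to do is roughly this (just a sketch to show the idea; the table and column names are made up, and this is not what Domo actually runs under the hood):

    MERGE INTO invoices AS target
    USING staged_invoices AS source            -- rows pulled by the saved search
    ON target.invoice_id = source.invoice_id   -- the primary key identified in the connector
    WHEN MATCHED THEN
        UPDATE SET target.cost = source.cost,
                   target.last_modified = source.last_modified
    WHEN NOT MATCHED THEN
        INSERT (invoice_id, cost, last_modified)
        VALUES (source.invoice_id, source.cost, source.last_modified);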
Bzzzt ... Wrong Answer.
I changed my connector to use 'Merge' (vs. Replace or Append) and it failed. No error message other than 'It failed... try again' (of course I did that... a few times).
Hey Domo, do you read these forums? Any insight on this?
(btw: new to Domo, but not new to data warehouse and ETL / SQL concepts)
Comments
-
Interesting. Assuming the NetSuite API can handle pulling your whole dataset in one call, and you don't have older records you need to retain (based on the modified date), why not set the connector to Replace and re-run your entire history? That should replace the bad record.
No idea why 'Merge' wouldn't work (sometimes new connector options can be buggy), but you can always build a dataflow and remove duplicates in the ETL tool, matching on the record key, name, etc. (excluding the modified date) to keep a running list of unique, updated records.
It might help to do a recursive dataflow so it's easier to append and re-run bad records in the future as well. Hope this helps.
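The remove-duplicates step amounts to something like this in a SQL dataflow (just a sketch; the table and column names are made up, and the Remove Duplicates tile in Magic ETL covers the dedupe without writing SQL):

    -- keep only the newest copy of each record, based on its unique key
    SELECT t.*
    FROM invoices t
    JOIN (
        SELECT invoice_id, MAX(last_modified) AS last_modified
        FROM invoices
        GROUP BY invoice_id
    ) latest
      ON  t.invoice_id    = latest.invoice_id
      AND t.last_modified = latest.last_modified;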
-
You are correct, the dataset is huge (daily transactions since the dawn of time, plus new ones every day). The NetSuite connector can't even extract more than 2-3 months of transactions at a time, so there is no chance of rebuilding that daily. Merge looked like it was the answer.
I was more concerned about running an ETL that has to "re-process" (recursive dataflow) the entire dataset each time the source data is updated. I kind of feel like recursive dataflows exist to overcome a platform design flaw.
Thanks for the note.
-
100% agree re: recursives were meant to overcome a flaw. Back when I started using Domo, no connectors had the 'since last successful run' option, meaning you were time-bound to a GMT daily update if you wanted to append data to the raw set.
What I found is the recursive lets me replace single records when I encounter similar issues. I just re-run for the date range the record falls in (in the absence of single-record specification), and simply insert the remove duplicates ETL tool for that run, ensuring my updated record stays and the old one goes. None of the other rows should be affected, and I wouldn't worry about re-processing the entire dataset; we re-run millions of rows a day without delay.
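The recursive pattern is essentially this, sketched as a SQL dataflow (the dataset names are made up; historical_invoices would be the dataflow's own previous output and new_invoices the latest connector run):

    -- 1) stack the latest run on top of the existing history
    SELECT * FROM historical_invoices
    UNION ALL
    SELECT * FROM new_invoices;
    -- 2) then apply the same keep-the-newest-row-per-key dedupe shown earlier,
    --    so the corrected record survives and the stale copy is dropped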
Either way, I hope that connector gets updated to work properly... drop a ticket!
-
I built my Saved Search in NetSuite to import the full dataset in chunks at first (about one quarter's worth at a time). Once I had everything in Domo, I set the Saved Search to only return transactions that were created or modified in the last day or so.
That seems to work to keep the Merge running, and handles my deltas.
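For anyone curious, the delta criteria on that Saved Search boil down to something like this (shown as a SuiteQL-style query purely for illustration; in practice it's a created/last-modified date criterion on the Saved Search itself, and the field names here are assumptions):

    -- only transactions touched in roughly the last day
    SELECT id, tranid, trandate, lastmodifieddate
    FROM transaction
    WHERE lastmodifieddate >= SYSDATE - 1;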