Remove Duplicates for Append Before it Gets to Domo

I have some data I want to bring in, but unfortunately the source system keeps only the most recent X records rather than the records from the last X days.

One option I know of is to append the data daily and then have a dataflow that removes the duplicates, which would give me what I'm looking for. However, that would mean my original Domo dataset keeps growing (~365,000 rows/yr). How might I remove duplicates before the data gets to Domo?



**Make sure to like any posts that helped you and accept the one that solved your issue.**

Best Answer

  • AS (Coach)
    Answer ✓

    Sounds to me like you'd have to do this programmatically using Domo's developer APIs. You could pull the existing data from Domo, append the new source system data, deduplicate, and re-upload the cleaned result back to Domo, repeated on whatever cadence you need (see the sketch after this post).

    That said, using a dataflow is actually a good idea, and 365k rows per year isn't a huge deal in my book. We have datasets in the tens of millions of rows, and there are customers with hundreds of millions or billions of rows in their Domo instances, so dataset size seems like a secondary concern to me.

    Aaron
    MajorDomo @ Merit Medical

    **Say "Thanks" by clicking the heart in the post that helped you.
    **Please mark the post that solves your problem by clicking on "Accept as Solution"
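
For anyone who wants a concrete starting point, here is a minimal sketch of that pull/dedupe/re-upload loop using Domo's Python SDK (pydomo). The credentials, dataset ID, key columns, and the source-extract helper are placeholders, and the `ds_get`/`ds_update` convenience methods are assumed to be available in your version of the SDK, so treat this as an outline rather than a drop-in script.

```python
# Sketch: pull the existing Domo dataset, append the latest source extract,
# drop duplicates, and push the cleaned result back to Domo.
# Assumes pydomo's ds_get / ds_update helpers; the credentials, dataset ID,
# key columns, and load_source_extract() below are placeholders.
import pandas as pd
from pydomo import Domo

CLIENT_ID = "your-client-id"          # from developer.domo.com
CLIENT_SECRET = "your-client-secret"
DATASET_ID = "your-dataset-id"        # the Domo dataset to keep deduplicated
KEY_COLUMNS = ["record_id"]           # columns that uniquely identify a row


def load_source_extract() -> pd.DataFrame:
    """Placeholder for however you pull the 'most recent X records' extract."""
    return pd.read_csv("latest_extract.csv")


def refresh_dataset() -> None:
    domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host="api.domo.com")

    existing = domo.ds_get(DATASET_ID)   # current dataset contents as a DataFrame
    incoming = load_source_extract()

    # Append the new extract, keep the last occurrence of each key so the
    # freshest copy of a record wins, then write the cleaned set back.
    combined = pd.concat([existing, incoming], ignore_index=True)
    deduped = combined.drop_duplicates(subset=KEY_COLUMNS, keep="last")

    domo.ds_update(DATASET_ID, deduped)


if __name__ == "__main__":
    refresh_dataset()
```

Scheduled on whatever cadence fits (daily works for this case), this keeps the dataset at one row per key instead of growing with every append.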
