Is PyDomo appropriate for my data update work?

Hello. I've been reviewing different methods of updating data in Domo. I'm currently using Workbench, but am considering creating a completely hands-off setup using PyDomo. So, here's my scenario:

We have a number of datasets that originate from ServiceNow. I put those through some local processing using Python scripts, which generate CSV files. Those files are used by Workbench to update the Domo datasets. Good so far, although Workbench seems to have some minor quirks. But this raises a reasonable question: As long as all of this work is already being done in Python, why not just push the changes directly from there?
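Concretely, what I have in mind is replacing the Workbench step with something like the sketch below. This is only a rough illustration, not a tested pipeline: the credentials, dataset ID, and file name are placeholders, and it uses pydomo's DataSets client to replace a dataset's contents the way Workbench does today.

```python
from pydomo import Domo

# Placeholders - substitute real API credentials and the target dataset ID
CLIENT_ID = 'your-client-id'
CLIENT_SECRET = 'your-client-secret'
DATASET_ID = 'your-dataset-id'

# Authenticate against the Domo API
domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host='api.domo.com')

# Push the CSV that the local processing step already produces,
# replacing the dataset's contents (the same job Workbench does now)
domo.datasets.data_import_from_file(DATASET_ID, 'servicenow_extract.csv')
```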

The only thing holding me back is this statement in the README for the SDK:

> Use DataSets for fairly static data sources that only require occasional updates via data replacement

We have seven datasets that, depending on the set, see anywhere from 20 to 2,500 updates or additions per refresh, going out to datasets that contain up to 175,000 rows. I wouldn't call that either "occasional" or strictly "replacement", but perhaps I'm not thinking at the same scale as the Domo engineers. So how about it... Is PyDomo a viable way to make changes like this?

Best Answer

  • GrantSmith (Coach)

Yes, I use pydomo for a lot of scripts and it works well. There are two update methods: DataSets and Streams. Streams let you upload data in parts and then commit the execution once everything is uploaded, which makes them the better fit for larger datasets.

Overall, yes, I'd recommend pydomo for your use case, as it'll help streamline your data pipeline.
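As a rough illustration of the Streams flow, here's a minimal sketch based on pydomo's Streams client. The dataset name, schema, and sample rows are placeholder assumptions; the point is the create-execution, upload-parts, commit sequence.

```python
from pydomo import Domo
from pydomo.datasets import DataSetRequest, Schema, Column, ColumnType
from pydomo.streams import CreateStreamRequest, UpdateMethod

# Placeholders - substitute real API credentials
domo = Domo('your-client-id', 'your-client-secret', api_host='api.domo.com')
streams = domo.streams

# Define the dataset the stream will feed (hypothetical schema)
dsr = DataSetRequest()
dsr.name = 'ServiceNow Incidents'
dsr.description = 'Loaded via the Stream API'
dsr.schema = Schema([Column(ColumnType.STRING, 'number'),
                     Column(ColumnType.STRING, 'state')])

# Create the stream; APPEND adds rows, REPLACE swaps the whole dataset
stream = streams.create(CreateStreamRequest(dsr, UpdateMethod.APPEND))

# Each upload is an "execution" made up of one or more numbered parts
execution = streams.create_execution(stream.id)
csv_part = '"INC0001","Open"\n"INC0002","Closed"'
execution = streams.upload_part(stream.id, execution.id, 1, csv_part)

# Nothing lands in the dataset until the execution is committed
streams.commit_execution(stream.id, execution.id)
```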


Answers

  • froghunter

Thank you. I appreciate the response, particularly one drawn from direct experience. I'll explore this further… I look forward to this being a completely hands-off process.

Out of curiosity, have you done anything like syncing data from within Databricks or a similar context?