gllrusso Member

Comments

  • Just FYI, we opted for a Python script using the Stream API. Creating a connector was not feasible for us.
  • We did not consider Workbench because we would like to avoid needing a local machine to run this, and if I'm not wrong, that is exactly what Workbench relies on, am I right? Could a custom connector be a possible solution to avoid local applications?
  • Okay, here we are with the solution to this problem! Special thanks to @jaeW_at_Onyx for all the steps. I've published https://gist.github.com/giuseppellrusso/03daabba54424ceac83309921121320c#file-domo_utils-py containing the module I developed to programmatically run a dataset or a dataflow. As you can see from the code, running a…
  • Hi Jae, of course I'm going to do a writeup for the whole community! Let's start (maybe at the end I will host this code somewhere, improving it and commenting it properly) with a dataset loaded into Domo through a connector: this is the solution you can use to run it with Python and Pydomo using the supported and documented API.…
  • First of all, thank you so much for your answers. They are really helping me. My goal here is to have an external scheduler that lets me manage different executions in different systems, not only in Domo. That's why I'm trying to access all the systems through their APIs, so I can actually write my manager as I wish with my…
  • @jaeW_at_Onyx thanks for your fast answer! I think either scripting the CLI commands or using the corresponding APIs will let me do the job. I actually have two questions for you, one for datasets and one for dataflows. 1) Dataflows: what is the difference between using the dataflow-run-now -i <ID> command and the…
  • Hello and thanks for the answer. The UPSERT solution is not good for me because in the last days of data I may need to delete some rows, so if I delete them between one UPSERT and the next, I will still have the deleted rows in the output dataset, and this is a problem for me. Since no DELETE option is given (without an…
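
The Stream API approach mentioned in these comments can be sketched with pydomo. This is a minimal sketch, not the author's gist: the credentials, the stream id, and the `split_csv` / `replace_dataset` helper names are my own assumptions, and a Stream created with the REPLACE update method is what makes each execution overwrite the dataset's contents (avoiding the stale-row problem an UPSERT leaves behind when rows are deleted between loads).

```python
def split_csv(csv_text, rows_per_part):
    """Split a CSV string (no header row) into chunks of at most
    `rows_per_part` lines -- one chunk per Stream upload part."""
    lines = csv_text.strip().splitlines()
    return [
        "\n".join(lines[i:i + rows_per_part])
        for i in range(0, len(lines), rows_per_part)
    ]

def replace_dataset(domo, stream_id, csv_text, rows_per_part=10000):
    """Push `csv_text` through a Stream execution. Whether this
    replaces or appends is governed by the Stream's updateMethod
    (set to REPLACE when the Stream was created)."""
    execution = domo.streams.create_execution(stream_id)
    for part, chunk in enumerate(split_csv(csv_text, rows_per_part), start=1):
        domo.streams.upload_part(stream_id, execution['id'], part, chunk)
    domo.streams.commit_execution(stream_id, execution['id'])

if __name__ == "__main__":
    # pydomo import and credentials kept here so the helpers above
    # stay importable without a Domo account; ids are hypothetical.
    from pydomo import Domo
    domo = Domo("your-client-id", "your-client-secret", api_host="api.domo.com")
    replace_dataset(domo, stream_id=123, csv_text="1,alice\n2,bob\n3,carol")
```

Committing the execution is what makes the upload atomic: partial uploads never reach the dataset, so consumers only ever see the previous or the new full snapshot.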
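
The "scripting the CLI commands" option from the same discussion can also be driven from Python, which fits the external-scheduler goal. A minimal sketch, assuming the Domo Java CLI jar is available locally and accepts a script file of commands; the jar filename, the `-script` flag usage, and the helper names are assumptions to verify against your CLI version (only `dataflow-run-now -i <ID>` itself is quoted from the comments above, and your script would also need the CLI's authentication step first).

```python
import subprocess
import tempfile

def build_cli_script(dataflow_ids):
    """Build the text of a Domo CLI command script: one
    dataflow-run-now per id, then quit. Prepend your own
    connect/authentication line as your CLI setup requires."""
    lines = [f"dataflow-run-now -i {df_id}" for df_id in dataflow_ids]
    lines.append("quit")
    return "\n".join(lines)

def run_dataflows(dataflow_ids, jar_path="domoUtil.jar"):
    """Invoke the CLI non-interactively against a temp script file.
    `jar_path` is a hypothetical local path to the Domo CLI jar."""
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.write(build_cli_script(dataflow_ids))
        script_path = f.name
    subprocess.run(["java", "-jar", jar_path, "-script", script_path], check=True)

if __name__ == "__main__":
    run_dataflows([42, 99])  # hypothetical dataflow ids
```

Wrapping the CLI this way keeps scheduling logic (retries, ordering across systems) in Python while the CLI handles the Domo-side execution.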