Comments
-
Update: The error goes away as long as I call the get metadata API beforehand, like so:

from pydomo import Domo
import logging
import json

domo_id = Insert Domo API user ID as string here
domo_secret = Insert Domo secret as string here
dataset_id = Insert Dataset ID as string here
destination_path =…
-
Are limited API keys on the roadmap at any point? We deal with HIPAA compliance, and not being able to limit access by key severely restricts our ability to automate securely. We can't have an API key that can access PHI floating out in an AWS Lambda, for example. Thanks,
-
@DrR Looks like it is in the February release notes!
-
Oooh that is really useful knowledge! Thanks for the info Aaron.
-
Yea! That's exactly what it would be - although instead of referencing the analyzer itself, it could be accomplished by allowing for table-level calculations in Beast Modes, similar to windowed calculations in Tableau. This has been suggested in an idea here - unfortunately it seems to have lost steam. I've looked into building…
-
@northshorehiker - We've built "Export JSON to shared directory" as a mandatory activity while making/updating jobs in Workbench, then just run a small script to combine all of the JSON files together and consume them as a list. I've posted on the ideas exchange about building out more robust data lineage…
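The combine step described above can be sketched with the standard library alone. This is a minimal illustration, assuming each Workbench job exports its config as a standalone .json file into a shared directory (the function name and directory layout here are my own, not from the original post):

```python
import json
from pathlib import Path

def combine_job_configs(shared_dir):
    """Read every exported Workbench job JSON file in shared_dir
    and return the parsed configs as a single list."""
    jobs = []
    # Sort so the combined list is stable across runs.
    for path in sorted(Path(shared_dir).glob("*.json")):
        with open(path, encoding="utf-8") as f:
            jobs.append(json.load(f))
    return jobs
```

The combined list can then be written back out with `json.dump` for whatever downstream lineage tooling consumes it.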
-
Thanks for the details! It's helpful to understand the back-end processes - and that's actually a great way to handle it (short of automatically updating all the SQL). I think rewriting all of the SQL will go under the "Once we have a summer intern..." column.
-
We do something similar and use a Domo Online Form to track escalation points by ID, then join that into our dataset on the ID using the status column as a series in the visual.
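The join described above amounts to a left join on the shared ID, with the form's status column carried over as the series. A rough sketch in plain Python (all field names here are hypothetical, not taken from the original dataset):

```python
def join_escalations(rows, escalations):
    """Left-join escalation statuses onto the main dataset.

    rows: list of dicts from the main dataset, each with an 'id' key.
    escalations: list of dicts from the Online Form ('id', 'status').
    Returns rows with a 'status' key added (None when no match), which
    can then drive the series in the visual."""
    status_by_id = {e["id"]: e["status"] for e in escalations}
    return [{**row, "status": status_by_id.get(row["id"])} for row in rows]
```

In practice this join would be done in a Domo dataflow rather than in code; the sketch just shows the shape of the merge.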
-
For anyone who runs across this thread and is encountering a similar dilemma, here is an important update. There are 2 commands to run Domo jobs from the command line (I could only find 1 in the documentation):

* queue-job - using this command places the job in a queue to be run one at a time. When using this command the CLI…
-
Might have to reach out to that team... Do you happen to know if there is a forum or listing of the plugins that have already been produced by others? I'm working on another similar issue that this would help with.
-
A Domo engineer was able to help me out with this one, and the answer is: I would say that it depends on the job. A job with many rows/columns is more memory intensive than a job with a small number of rows/columns. The same is true with CPU usage. CPU is also impacted by the number of jobs running simultaneously. As…
-
Not to talk to myself - but with the help of the team we are working with, I can answer this question for anyone else wondering something similar. The command line runs Workbench in the context of the machine (background processes), so no job status is reported back to the command line interface. Viewing the logs…
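Since no status comes back through the CLI, the practical move is to check the most recent log file after a run. A small helper like the following can do that; the log directory location varies by Workbench install, so the path (and the assumption that logs end in .log) is mine, not from the original post:

```python
from pathlib import Path

def latest_log(log_dir):
    """Return the path of the most recently modified .log file in
    log_dir, or None if there are no logs yet. Useful for checking
    job outcome after a command-line run, since the CLI itself
    reports no status."""
    logs = sorted(Path(log_dir).glob("*.log"),
                  key=lambda p: p.stat().st_mtime)
    return logs[-1] if logs else None
```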
-
Yea, that sounds like it would be very helpful! Thank you!
-
Thanks for your reply! Totally understand why they are assigned and used - for our purposes we want to write a script that, when it sees a file dropped into a particular directory, runs a corresponding job based on the file name. I realize I can map this through a flat file; however, I am being lazy and attempting to minimize…
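One way to skip the flat-file mapping entirely is to adopt a naming convention where the dropped file's stem is the job name, then poll the directory for new arrivals. A rough sketch of that idea, under that (invented) convention; the actual command to launch the job is left out since it depends on the Workbench CLI setup:

```python
from pathlib import Path

def job_for_file(path):
    """Derive the job name from a dropped file's name, assuming
    the convention 'file stem == job name', e.g. 'sales_daily.csv'
    maps to the job 'sales_daily'."""
    return Path(path).stem

def new_files(watch_dir, seen):
    """Return files in watch_dir not yet in seen, updating seen
    in place. Call this in a polling loop; each fresh file's name
    tells you which job to run."""
    current = {p.name for p in Path(watch_dir).glob("*") if p.is_file()}
    fresh = current - seen
    seen |= fresh
    return sorted(fresh)
```

A polling loop would call `new_files` every few seconds and hand each `job_for_file(name)` result to whatever launches the Workbench job.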
-
kshah008, Thanks for the bump - I've updated the post a bit which will hopefully be helpful to readers. Best, Austin