Details about, and alternatives to, the DataSet query API

I gave the datasets/query/execute v1 API a try today, and have a few questions:

* What dialect of SQL does this use? It took me some time to get name escaping done in a way that Domo tolerated. (See the sketch after this list for the call shape I ended up with.)

* Is there any dedicated documentation for this API, other than the sparse details I've found at the link below?

https://developer.domo.com/docs/dataset-api-reference/dataset#Query%20a%20DataSet

* Is there a way to get the results as something other than an array of JSON objects, like TSV/CSV, or something else a bit more compact?

* Are there better, more recent alternatives to datasets/query for querying DataSets?
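
For reference, here's roughly the call shape I ended up with. Treat it as a sketch: the token handling is elided, the dataset id is a placeholder, and the exact response shape is whatever the doc page above describes.

    # Rough sketch of the query call, based on the doc page linked above.
    # ACCESS_TOKEN and DATASET_ID are placeholders; backtick escaping is what
    # Domo ended up tolerating for my column names (MySQL-style quoting).
    import requests

    ACCESS_TOKEN = "..."            # OAuth bearer token (client-credentials flow)
    DATASET_ID = "my-dataset-id"    # placeholder

    resp = requests.post(
        f"https://api.domo.com/v1/datasets/query/execute/{DATASET_ID}",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/json",
        },
        json={"sql": "SELECT `Order Date`, `Total Revenue` FROM table LIMIT 10"},
    )
    resp.raise_for_status()
    data = resp.json()              # the response comes back as fully exploded JSON
    print(data)                     # LIMIT 10 keeps this small enough to eyeball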

Thanks!



Answers

  • jaeW_at_Onyx (Coach), edited February 2021

    @DataSquirrel, this seems like an extension of your other question.


    What do you want to do? Are you trying to query tables, or administer your instance of Domo?


    Domo does have an ODBC driver, but it is a premium feature:

    https://knowledge.domo.com/Connect/Connecting_to_Data_Using_Other_Methods/Domo_ODBC_Data_Driver


    I hear your desire for CSV, but given the larger data volumes that Domo data tends to come in, I can appreciate how reluctant they would be to make it easy for folks to just 'download as CSV'. It could get expensive really fast if people could download multi-gigabyte exports from Domo hourly for no reason.


    There are tools for archiving Domo data, and I can appreciate Domo wanting users to use the 'right tool for the job'.


    Use the CLI instead of querying the APIs directly (the documentation shows how it can be scripted to run headless).

    I believe you can use the CLI to package the JSON response of a dataset query into a CSV; just type 'help' to see the available commands.


    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

  • Thanks, here too, for the quick answer.

    I'm in a rediscovering-features-and-options mode, as I'm working out some architecture for the next 18 months. And with my Domo feature set knowledge years out-of-date, I'm trying to see what's in the box, and to think outside of the box a bit ;-)

    In this case, I have no specific need for the DataSet query, except that we have a few places where our customers want data drawn from Domo in a specialized way. It would be quite nice to be able to pull data from time to time with a SQL query. Since you mention ODBC access, I'm guessing the answer is "whatever SQL is compatible with whatever version of ODBC the Domo connector supports." I hate ODBC with the heat of a thousand fiery suns, but, well, it's good that it's an option.

    And on formats, it's just about wire-format efficiency. JSON is the easiest to parse, but the fully exploded format that the query API uses is wildly inefficient. So, like you say, not really a great match for large data downloads. That's not something I can see us ever actually needing, but your point makes sense generally.

    In our case, the source data is mostly housed in Postgres to start with, so it's easy to get things out of there as we like. But if we have a Domo-only calculated column (Beast Mode), or want to use Domo column names (very different conventions in Postgres), or see things from a Domo point-of-view, a query would be nice.

  • If you're looking at Data Distribution (especially as a monetized service), Domo does have a suite of tools under the "Domo Everywhere" umbrella.

    There's a video take on a specific product, "Domo Publish", which is possibly a lot more than you're looking for. If you just want to send your customers data 'drawn from Domo in a specialized way', look at sending automated reports. Consider the 'campaign app': think Mailchimp backed with data. Or use Domo Embed (under the Domo Everywhere umbrella) to create self-service / embedded cards in a portal.

    If you just want to send people ugly CSVs ;) I would probably see if I can build my view in Domo as a DSV, and then export via PyDomo or the Java CLI. (The nice thing about scripting it that way is that, as you loop over your extracts, you can also automate the emailing AND minimize the amount of code you have to maintain.)

    The nice part about PyDomo is that the SDK (I'm pretty sure) will hand you a clean dataframe, and then you can easily write to CSV and implement the automation in Python.
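
    Something in this direction, give or take; the constructor and the ds_get / ds_query names are from memory, so verify them against the SDK's README:

        # Sketch only: credentials and dataset id are placeholders, and the method
        # names (ds_get / ds_query) are recalled from memory (verify in the SDK docs).
        from pydomo import Domo

        domo = Domo("client_id", "client_secret", api_host="api.domo.com")

        # Pull a DataSet into a pandas DataFrame, then write it out as CSV.
        df = domo.ds_get("my-dataset-id")
        df.to_csv("extract.csv", index=False)

        # Or run SQL against the DataSet instead of pulling everything down.
        result = domo.ds_query(
            "my-dataset-id",
            "SELECT `Customer`, SUM(`Revenue`) FROM table GROUP BY `Customer`",
        )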


    Re: the ODBC query. That feature was implemented to support a customer who wanted a SQL-esque way of accessing the data, so the driver can be set up and you could theoretically write SQL commands against your dataset from any analytics tool / IDE. (I personally don't know if it's worth the cost... but that's just me.)



    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

  • bdavis (Contributor)

    The formatting of the SQL query is a pain in the butt, but I use it frequently. The real limitation to be aware of is that it caps at 1 million rows. I think there is a way to get CSV results; however, it's not the query API call but another one. I can't recall the exact name off the top of my head, but it's the equivalent of "get dataset": it returns the entire set as a CSV object. Not sure if that helps your case.
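
    If it helps, something roughly like this is what I had in mind; the endpoint is recalled from memory, so check it against the docs, and the token and dataset id are placeholders:

        # Sketch of the export-as-CSV call; pulls the whole DataSet back as CSV
        # rather than the row-per-JSON-object shape the query endpoint returns.
        import requests

        ACCESS_TOKEN = "..."            # placeholder OAuth bearer token
        DATASET_ID = "my-dataset-id"    # placeholder

        resp = requests.get(
            f"https://api.domo.com/v1/datasets/{DATASET_ID}/data",
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}", "Accept": "text/csv"},
            params={"includeHeader": "true"},
        )
        resp.raise_for_status()
        with open("dataset.csv", "w", encoding="utf-8") as f:
            f.write(resp.text)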

  • DataSquirrel (Contributor)

    @bdavis, thanks for the information, much appreciated. I'm currently waiting to see if we can get the Magic ETL 2.0 beta and the UPSERT-on-SQL-DataFlows beta enabled. From there, I think I'll revisit everything again, including the APIs.

  • @bdavis, with query-data in the CLI you can send properly formatted SQL (use MySQL syntax)... not sure it gets much easier than that ;)

    Also, you can apply a LIMIT clause to paginate over large results.
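
    Something like this sketch, where run_query stands in for however you send SQL at the DataSet (CLI query-data, PyDomo, or the raw query endpoint), and assuming MySQL-style LIMIT/OFFSET is honored:

        # Sketch of LIMIT/OFFSET paging. `run_query` is a stand-in for whatever sends
        # SQL at the DataSet; it should return a list of rows (empty when exhausted).
        PAGE_SIZE = 100_000

        def fetch_all(run_query):
            offset, pages = 0, []
            while True:
                rows = run_query(f"SELECT * FROM table LIMIT {PAGE_SIZE} OFFSET {offset}")
                if not rows:
                    break
                pages.append(rows)
                if len(rows) < PAGE_SIZE:
                    break
                offset += PAGE_SIZE
            return pages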


    I will say, if you're expecting more than 1M rows, you should probably consider another egress method (a writeback connector, which is a premium feature in Magic, or export to S3, which is also available in the CLI).


    @DataSquirrel, the feature to UPSERT on SQL DataFlows does not exist. You can enable UPSERT on a dataset that happens to be the output of a SQL DataFlow, but be careful: that's not built into the product and is probably not supported.

    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

  • DataSquirrel (Contributor)

    Thanks @jaeW_at_Onyx, as always. I'm waiting for the dust to settle after Domopalooza, and then I'll circle back and try everything again. (Waiting on Magic 2.0 at the moment.) I've added your information to my list, which is mostly other information from you ;-)

  • AnwarBham (Contributor)

    Two years later and I still fail to see Domo updating the API docs:

    https://developer.domo.com/docs/dataset-api-reference/dataset#Query%20a%20DataSet

  • I know for a fact they are working on a project in that space. But to be fair, the query API DOES work and hasn't changed.

    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻
