Ben_Dunlap Member

Comments

  • Thanks again all. I'm definitely seeing the appeal of Magic 2. I think we will probably end up with some sort of compromise where we still use a fair amount of SQL but do heavier-duty joins in Magic or in the underlying Postgres db. Just discovered the Java CLI thanks to a post elsewhere by @jaeW_at_Onyx, so now we have…
  • Thanks all, we're using indexes in our DFs and we've pushed some of our transforms down into the Postgres layer (our key raw datasets come out of a read-only Postgres db that our vendor provides to us). Both of these optimizations help a lot with performance (see the pushdown sketch after this list). @ST_-Superman-_ we had a few reasons for migrating to SQL:…
  • I'm sure you're right about performance but we have other reasons apart from performance for wanting to stick with SQL dataflows. We're actually just wrapping up a major project of migrating all of our ETLs to SQL dataflows. For the most part the run times aren't a problem, and even in this case we can live with the long…
  • Thanks, I'll check out that breakout session. We're kind of committed to using SQL dataflows for various reasons, at least for transforms of any complexity. These datasets are just going to keep getting bigger in any case so I'm really more interested in specific techniques for breaking large, mostly-static datasets down…
  • I think OAuth 2, since the credential config includes a 'scopes' parameter... I don't think that's part of OAuth 1 (see the scopes sketch after this list).
  • Looking closer, I'm seeing that my first post was a little too general. The basic QB connector does seem to have access to most, if not all, transaction records (Sales Receipts, Bills, Purchases, Journal Entries, etc.), and also Accounts from the chart of accounts. But not Customers or Vendors... anyone know of a…
  • That said, something like the dummy-dataset idea should work for me for now. I can configure my smallest input dataset (less than 100 rows, and unlikely to get much larger) to update hourly and that should take care of it. Thanks!
  • A more sophisticated way to communicate this to the Domo dataflow UI would be incredibly useful. "Run when ALL inputs have been successfully run & updated once on the current day". Yes, a config option like that would help. On the other hand, it does seem like this is actually a bug. I think it's pretty reasonable to…
  • Finally got the redirect URL from Support! It's https://oauth.domo.com/api/data/v1/oauth/providers/json-oauth/exchange. I've tested this and it works (see the redirect-URI sketch after this list). Hope that helps others. Really frustrating that this is not documented anywhere as far as I can tell; the connector does not work at all without this URL.
  • Weirdly, I just got a release-notes pop-up when logging into Domo announcing that the JSON+OAuth connector I've been struggling with will be released on Oct 15. This might explain the lack of documentation. Maybe my instance was somehow included in a soft launch of it.
  • Umm, never mind, my code was just wrong and I wasn't actually setting the "Authorization" header (sketch below).
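
On the pushdown point above: a minimal, generic sketch of what "pushing a transform down into the Postgres layer" means, assuming a direct connection to the vendor database. In Domo this query would typically live in the connector's query configuration rather than in Python, and every connection detail, table, and column name here is hypothetical, not the vendor's actual schema.

```python
# Generic illustration of the "pushdown" idea: run the join and filter
# inside Postgres instead of in the dataflow. All names are hypothetical.
import psycopg2

conn = psycopg2.connect(host="vendor-db", dbname="vendor", user="readonly")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT o.order_id, o.order_date, c.customer_name, o.total
        FROM orders o
        JOIN customers c ON c.customer_id = o.customer_id
        WHERE o.order_date >= CURRENT_DATE - INTERVAL '90 days'
    """)
    rows = cur.fetchall()  # already joined and filtered at the source
```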
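On the scopes point: the reason a 'scopes' parameter suggests OAuth 2 is that OAuth 2 (RFC 6749) defines a scope parameter, while OAuth 1.0a (RFC 5849) has no equivalent concept. The two config shapes below are purely illustrative, not the connector's actual schema.

```python
# Illustrative contrast only; neither dict is a real connector schema.
oauth2_credentials = {
    "client_id": "YOUR_CLIENT_ID",           # placeholder
    "client_secret": "YOUR_CLIENT_SECRET",   # placeholder
    "scopes": ["read", "write"],             # scope exists only in OAuth 2
}

oauth1_credentials = {
    "consumer_key": "YOUR_CONSUMER_KEY",         # placeholder
    "consumer_secret": "YOUR_CONSUMER_SECRET",   # placeholder
    "token": "YOUR_TOKEN",                       # placeholder
    "token_secret": "YOUR_TOKEN_SECRET",         # placeholder
    # no scopes field anywhere in OAuth 1.0a
}
```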
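On the redirect URL: in a standard OAuth 2 authorization-code flow, that Domo URL would be what you register as the callback/redirect URI with the upstream API provider, so the provider hands the authorization code back to Domo for the token exchange. A minimal sketch, assuming a generic provider; everything except the Domo URL is a placeholder.

```python
# Where the Domo exchange URL fits: it is the redirect_uri the upstream
# provider sends the authorization code back to. Only the Domo URL comes
# from the comment above; the endpoint and client_id are placeholders.
from urllib.parse import urlencode

DOMO_REDIRECT = (
    "https://oauth.domo.com/api/data/v1/oauth/providers/json-oauth/exchange"
)

params = {
    "response_type": "code",
    "client_id": "YOUR_CLIENT_ID",   # placeholder
    "redirect_uri": DOMO_REDIRECT,   # the URL obtained from Support
    "scope": "read",                 # placeholder
    "state": "opaque-csrf-token",    # placeholder
}
print("https://provider.example.com/oauth/authorize?" + urlencode(params))
```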
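And on the last comment's fix, for anyone who hits the same thing: a minimal sketch of actually sending the Authorization header with each API request. The endpoint and token are placeholders for whatever the OAuth flow returned.

```python
# The fix described above: actually set the Authorization header.
import requests

token = "ACCESS_TOKEN_FROM_OAUTH_FLOW"  # placeholder
resp = requests.get(
    "https://api.example.com/v1/records",          # placeholder endpoint
    headers={"Authorization": f"Bearer {token}"},  # the header that was missing
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```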