Jon (Member)

Comments

  • We have a scenario similar to this. I've chosen, for a variety of reasons, to let Domo do the work. My data (fortunately) has a "week number" embedded. So when I do the updates to historical data, I look for week numbers in my new dataset and eliminate those from my historical data, so that I never get a duplicate week in my… (see the first sketch after this list)
  • First, yes, it's possible to do more than 2000 rows; I built a connector that does THOUSANDS of rows. The API that I'm calling supports pagination, so my loop runs in batches of 1000. My datagrid.addCell() calls are within the loop (see the pagination sketch after this list). Maybe that will get you on the right path. As for nested objects, well, I had to run my…
  • My challenge was that I actually have values that change in the "old" data from time to time (corrections, etc.), and the full dataset is too large to retrieve every time. I have an intermediate step (which is not relevant to the question here) that removes any ID that's in the updated dataset before the union (see the last sketch after this list). But, in my…
  • Well, as I learned, when you consume a datasource in a dataflow, you have to make sure you're actually consuming the one that's the output of the dataflow. I think Domo is just particular about that. I'll try to reconstruct what I did: 1. I created a new dataflow using my existing data; I'll call it…
  • One thing I learned about building recursive datasets is that you MUST first create the source dataset from within the dataflow. That way you're consuming the dataset from the dataflow itself. Also, your step 5 may be a bit redundant; you should be able to drop the results of that union right into your final dataset. - Jon
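
A minimal sketch of the week-number dedup described in the first comment. This is plain JavaScript illustrating the logic only, not actual Domo dataflow code; the `historicalRows`/`newRows` arrays and the `week` field are hypothetical names:

```js
// Hypothetical inputs: historicalRows is the accumulated output so far,
// newRows is the latest update, each row shaped like { week, value }.
var newWeeks = new Set(newRows.map(function (r) { return r.week; }));

// Drop every historical row whose week is being re-delivered...
var kept = historicalRows.filter(function (r) { return !newWeeks.has(r.week); });

// ...then append the update, so each week appears exactly once.
var merged = kept.concat(newRows);
```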
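For the pagination comment, here is the rough shape of a batched connector loop. `datagrid.addCell()` is quoted from the comment itself; `datagrid.endRow()` and `httprequest.get()` follow common Domo custom-connector patterns, but the endpoint, paging parameters, response shape, and column names are all assumptions for illustration:

```js
// Hypothetical paginated fetch: pull 1000 rows per request until the API
// returns a short page, writing each row into the datagrid as we go.
var page = 1;
var limit = 1000; // batch size per request
var more = true;

while (more) {
    // Assumed endpoint and paging parameters -- substitute your API's own.
    var body = httprequest.get('https://api.example.com/rows?page=' + page + '&limit=' + limit);
    var rows = JSON.parse(body).data || [];

    for (var i = 0; i < rows.length; i++) {
        datagrid.addCell(rows[i].id);    // column 1
        datagrid.addCell(rows[i].week);  // column 2
        datagrid.addCell(rows[i].value); // column 3
        datagrid.endRow();               // close out the row
    }

    more = rows.length === limit; // a short page means we're done
    page++;
}
```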
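And the intermediate step from the third comment, which handles corrections: filter the old data on row ID before the union, so the update's version of a changed row wins. Again a plain-JavaScript sketch with hypothetical names, standing in for what would be dataflow logic in Domo:

```js
// IDs present in the update, including corrected rows.
var updatedIds = new Set(updateRows.map(function (r) { return r.id; }));

// Remove any old row that the update re-delivers, corrected or not...
var cleanedHistory = historyRows.filter(function (r) { return !updatedIds.has(r.id); });

// ...then union: the update's version of each changed row survives.
var result = cleanedHistory.concat(updateRows);
```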