RobB Domo Employee

Comments

  • I'm reading the recommended workaround. I just created two of these connectors today and neither had scheduling options.
  • @GrantSmith It's good to know it's just a warning that doesn't impact the functionality. If your output has a large number of columns, though, it's still a bothersome thing. This is especially true if you're running a loop that outputs its iteration results to a dataset; then it becomes a burden. As a temporary fix, you can use the…
  • @Carlos_Yanez Have you tried providing a datetime format for the pd.to_datetime() method? I've found it best to dictate to this method exactly how to recognize the datetime: df['column'] = pd.to_datetime(df['column'], format='%Y-%m-%d') This example tells the method to recognize a date in a specific format. In this case… (a short pandas sketch follows this list)
  • I realize this is late to the game, but this is a good discussion that has never been answered. What I noticed in @thinh_dao's OP is the datatype of the examples that failed to convert. They're not all lower-case: Int64 and Float64 are used instead of the lower-case equivalents. The best practice for data types is to… (see the dtype sketch after this list)
  • Thank you, @DavidChurchman, for your efforts. Also, thanks to all who help keep this community active; I hope to be more a part of that in the future. Thank you, @GrantStowell, for the shout-out. I have been absent from this community as of late. It feels good to be back and to see not only the level of engagement but also the…
  • @Jessica I don't think it's used in the URL or in any other dataflow identifiers you may have access to. The dataflow ID that we're all accustomed to is what you will use in the URL: https://yourinstance/datacenter/dataflows/42/details I've never had an occasion that required the use of the Dataflow DAP ID in the web…
    in DAP Dataflow Comment by RobB August 2023
  • I tested accessing a dataflow by the DAP ID and it works, so it appears to be a valid identifier of a dataflow. DAP is an old acronym I recognize as Directory Access Protocol, from the X.500 standard (LDAP is the more common implementation these days). Here is my best guess: it's identifying a dataflow using a value that…
    in DAP Dataflow Comment by RobB August 2023
  • @user029082, You can also start by trying some things in Postman. Postman has a GraphQL client that can return your schema to you. The demo in their documentation shows that, and when I tried it with my GraphQL API it returned the schema to me. It's a good way to get familiar. But that's not all: Postman's documentation shows how… (a Python introspection sketch follows this list)
  • I'm going to see if there's a way to do this with the CLI. I don't think it can do this, but I want to try some things and see. If not, I'm confident that using either developer tools or the Jupyter notebook integration could get us the detail. Would you like to go down that path if possible?
  • @MajorReportingSir , To achieve this without the aid of DDX Bricks or Developer Tools, you will need to "stack" the data by the location types. If you have separate columns for Country, Region, City, etc., those will need to be collapsed into a single column holding those values, with another column giving the type of location for… (see the stacking sketch after this list)
  • Although it's been a while since this was addressed, for the sake of others searching for an answer to the same question, there is a way to do this. A detailed thread about this can be found at the Dojo post below: User-defined variables in SQL queries/transformation — Dojo Community (domo.com)
  • @MarkSnodgrass , Thank you for the corroboration. This was my expected direction after bringing the dimensions online. I think that @GrantSmith was also going this way. It's a lot of records, but to get a 30-day history record by record, it's what we have to do. Thanks to @jaeW_at_Onyx as well; the conversation was very…
  • @jaeW_at_Onyx I fear I may be misunderstanding this. My lag isn't from the current date; my lag needs to use the date on each record as its fixed point. For each SKU, for each date, on each transaction, I need to go back 30 days from that point to get my numbers. To me, that means a cartesian join, as ugly as that will be. A left… (see the rolling-window sketch after this list)
  • @GrantSmith I'm looking for 30 days from the date of the row.
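
On the pd.to_datetime() comment above: a minimal sketch of passing an explicit format string, assuming a hypothetical 'order_date' column of ISO-style date strings.

    import pandas as pd

    df = pd.DataFrame({'order_date': ['2023-08-01', '2023-08-15', '2023-08-31']})
    # An explicit format stops pandas from guessing (and possibly mis-parsing) the layout
    df['order_date'] = pd.to_datetime(df['order_date'], format='%Y-%m-%d')
    print(df.dtypes)  # order_date is now datetime64[ns]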
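
On the Int64/Float64 comment: the capitalized names are pandas nullable extension dtypes, while the lower-case names are NumPy dtypes. A minimal sketch of the practical difference, using made-up values:

    import pandas as pd

    # 'Int64' (capital I) is the pandas nullable extension dtype; it can hold <NA>
    nullable = pd.Series([1, 2, None], dtype='Int64')

    # The plain NumPy dtypes have no integer missing value, so the same
    # data silently becomes float64 with NaN instead
    plain = pd.Series([1, 2, None])

    print(nullable.dtype, plain.dtype)  # Int64 float64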
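
On the GraphQL schema comment: Postman is one way to run an introspection query, but the same query can be sent from Python. A sketch assuming a hypothetical endpoint URL and the requests library:

    import requests

    # Hypothetical endpoint; substitute your own GraphQL API
    url = 'https://example.com/graphql'

    # Minimal introspection query that lists the schema's type names
    query = '{ __schema { types { name } } }'

    response = requests.post(url, json={'query': query})
    response.raise_for_status()
    for t in response.json()['data']['__schema']['types']:
        print(t['name'])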
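
On the "stacking" comment: in pandas this unpivot is a melt. A sketch with hypothetical location columns:

    import pandas as pd

    df = pd.DataFrame({
        'store_id': [1, 2],
        'Country':  ['US', 'CA'],
        'Region':   ['West', 'East'],
        'City':     ['Seattle', 'Toronto'],
    })

    # Collapse Country/Region/City into one value column plus a location-type column
    stacked = df.melt(
        id_vars='store_id',
        value_vars=['Country', 'Region', 'City'],
        var_name='location_type',
        value_name='location',
    )
    print(stacked)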
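
On the 30-days-from-each-row question: if the data fits in pandas, a per-SKU time-based rolling window gives each row a trailing 30-day total without a cartesian join. A sketch with hypothetical columns:

    import pandas as pd

    df = pd.DataFrame({
        'sku':  ['A', 'A', 'A', 'B'],
        'date': pd.to_datetime(['2023-07-01', '2023-07-20', '2023-08-05', '2023-08-05']),
        'qty':  [5, 3, 2, 7],
    }).sort_values('date')

    # For each row, sum qty over the 30 days ending on that row's date, per SKU
    df['qty_trailing_30d'] = (
        df.groupby('sku')
          .rolling('30D', on='date')['qty']
          .sum()
          .reset_index(level=0, drop=True)
    )
    print(df)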