imelendez Member

Comments

  • @trafalger Qb = poopoo 😉
  • I have in the past and it leads to a bunch of finger pointing and no actual solution. :(
  • So it sounds like this is a bug on Domo's end?
  • Thanks @GrantSmith I thought that would be the case. Glad I asked. I appreciate you!
  • The awkward moment when you figure it out and have to post the answer to your own question. What I figured out: if you use the Python SDK, leverage the datasets.create() method to create an API-driven dataset (see the sketch after this list). You will need to instantiate the DataSetRequest() object and pass in your schema. Build out your schema for your dataset using the DSR…
  • Boo. Yeah I had to reboot the service like 5 times and also kill the service via command prompt in order to get it to work again. So odd.
  • Gents, just to circle back on this: we ended up going with the append method and it worked fine for our needs. Much faster than Stacker (awkward look). :)
  • Well look at who will be helping you now, Alexis :) @DataMaven You just need an API key, and make sure the user accessing the Zendesk account has admin-level access. We used a service account. Go to your profile (top right), click on Admin Center, click on Apps and Integrations > APIs, Generate/Add API Token, then copy the key and paste it in…
  • We tested what you recommended (removing duplicates) by adding in a new value to the run and adding in more date constraints to the remove dupes object and it worked like a charm. Thank you so much @MarkSnodgrass ! You are the GOAT!
  • Thanks for that Mark! I am currently removing duplicates using a DOB column. My question to you is, let’s say John Smith comes in a month from now through the daily pool, won’t it delete the original John Smith and leave the latest? How does Remove Duplicates decide? If it does, would this probably…
  • Would something like this still take a long time, or do you think performance would improve? Sorry if it sounds like I'm shooting in the dark here. Just trying to think of a way to solve this quandary, @GrantSmith . :D Proposed design:
  • So if I am understanding correctly: in ETLs, regardless of the method you use, the ETL will scan/read the entire dataset (input source) no matter how efficient you try to be, correct? If so, that is a bummer.
  • @faisalnjit have you had a chance to check out this KB article, brother? https://knowledge.domo.com/Administer/Implementing_Single_Sign-On/Enabling_SSO_with_Okta
  • Hi, @GrantSmith ! Thank you for the suggestion. I was pointed in the general direction of the SQL ETL dataflow because I was running into issues using the formula tile in Magic ETL to do the convert_tz() conversion. It kept throwing an error. I tried it here using this method and it worked. But, generally…
  • Wow, very insightful. Now, a follow-up question: I assume I need to account for when the time changes with Daylight Saving Time, or should the offset be enough? (See the CONVERT_TZ sketch after this list.)
  • Wow, this is great feedback, gentlemen @GrantSmith and @jaeW_at_Onyx . I appreciate the wisdom. I apologize to @jaeW_at_Onyx if I wasn't as clear as I should have been. Let me elaborate. So we are currently doing this original calculation, but it does not seem to be working for one of our metrics (when doing the duration math…
  • @SeanBand @jaeW_at_Onyx thanks for the feedback. So, if I am understanding correctly, you are saying that if MySQL is sending the date data to me via the MySQL connector as UTC, and I change my company settings to my timezone instead of the default UTC, that should solve the problem? My only concern here is that since the…
  • @GrantSmith I get that, but I think that has more to do with the company settings. I am just concerned about data coming from a data source that is in UTC format, where I need to keep the datetime data type but in local time.
  • I am sorry, I might just be confusing myself. What I am after is appending the historical data (A) and the new data changes from the live dataset (B) into the final output.
  • Yeah @GrantSmith so, dataset A should not have future dates as it is static. What I am trying to do is loop in a smaller subset of data from dataset B and check against dataset A whether it already exists. If it does, filter it out (to avoid dupes); if it doesn't, append it (see the anti-join sketch after this list). I did not create this data flow. Does that provide some…
  • @GrantSmith , thanks for the confirmation. Is this a more realistic approach in terms of the ETL dataflow? The image is attached. A million thanks! Isaiah
  • Hi, @GrantSmith , Thank you for taking the time out of your day to reply and help me out. I am still pretty new to the whole ETL process and get confused by some of the do's and don'ts regarding recursion. To answer your question in detail, Dataset A (the historical dataset) will be a one-time pull - it is static and will…
  • Here is the code - sorry for the delay. SUM(CASE WHEN `rpt_job.authorized` = 1 THEN (`rpt_job.subtotal_sales`) / 100 END) Thanks for your help in advance @ST_-Superman-_ 
  • @GrantSmith I have never attempted to do that. How would you do it? Can you provide an example of how you would write a beast mode for the nulls, as you mentioned? (A rough sketch is included after this list.)
  • @MarkSnodgrass when you say the structure is the same, are you saying the columns are supposed to be named the same? Or are you saying that the data types are supposed to match? I have attached a screenshot of my current ETL.
  • Yes, I am part of the beta team. I thought they would have rolled it out later, but that does not seem to have been the case. All that to say, we lucked out, I guess. @guitarhero23 : assuming this was not a pivot table, and let's say a regular table in the Analyzer view, is there a way to do what I am asking for? Have you…
  • Thanks @Godiepi , the second article worked like a charm!
  • Cool story! Loved reading it and then putting the words into an image. I can relate; I, too, am a workaholic. If you are interested, I could start a Zoom meeting. Would love to learn from a pro. I appreciate your willingness to help.
  • I see, thank you for explaining that. I am working with a support manager at Domo to help us figure this out and he too is scratching his head like we are. I will bring up the NULL value issue to see if it sheds light on something. Let me know if the attachments provided a bit more clarity as to what I was saying.
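
Referring back to the Python SDK comment above: a minimal sketch of creating an API-driven dataset with pydomo, assuming placeholder client credentials; the dataset name and column names here are illustrative, not taken from the original thread.

```python
# Minimal sketch, assuming the pydomo SDK (pip install pydomo).
# CLIENT_ID / CLIENT_SECRET, the dataset name, and the columns are placeholders.
from pydomo import Domo
from pydomo.datasets import DataSetRequest, Schema, Column, ColumnType

domo = Domo('CLIENT_ID', 'CLIENT_SECRET', api_host='api.domo.com')

# Instantiate the DataSetRequest (DSR) and build out the schema
dsr = DataSetRequest()
dsr.name = 'API Driven Dataset'
dsr.description = 'Created via the Python SDK'
dsr.schema = Schema([
    Column(ColumnType.STRING, 'customer_name'),
    Column(ColumnType.DATETIME, 'created_at'),
    Column(ColumnType.DECIMAL, 'subtotal_sales'),
])

# Create the dataset, then push rows to it as a CSV string
dataset = domo.datasets.create(dsr)
csv_rows = '"Acme Co","2021-01-01 00:00:00","199.99"\n'
domo.datasets.data_import(dataset['id'], csv_rows)
```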
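On the convert_tz() / Daylight Saving question above: a short illustrative query, assuming a MySQL-style SQL dataflow and hypothetical `my_table` / `created_at` names. With named time zones the UTC offset shifts automatically at DST boundaries (provided the time zone tables are available); a fixed numeric offset does not adjust and will be an hour off for part of the year.

```sql
-- Sketch only; my_table, created_at, and the target zone are placeholders.
SELECT
    `created_at`                                            AS created_at_utc,
    -- Named zones: offset changes automatically at DST boundaries
    CONVERT_TZ(`created_at`, 'UTC', 'America/Denver')       AS created_at_local,
    -- Fixed offset: always -07:00, so no DST adjustment
    CONVERT_TZ(`created_at`, '+00:00', '-07:00')            AS created_at_fixed_offset
FROM my_table
```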
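For the historical-plus-live append discussed above, one way to express the "keep dataset A, only add rows from dataset B that A doesn't already have" logic is an anti-join followed by a union. This is only a sketch with hypothetical table and key names; the same pattern can be assembled in Magic ETL with Join, Filter, and Append tiles.

```sql
-- Sketch: historical_a / live_b and the key columns are placeholders.
SELECT * FROM historical_a

UNION ALL

SELECT b.*
FROM live_b b
LEFT JOIN historical_a a
       ON a.customer_id = b.customer_id
      AND a.record_date = b.record_date
WHERE a.customer_id IS NULL   -- keep only rows A does not already contain
```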
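And for the question about handling nulls in the beast mode quoted above: one common approach is to wrap the field in COALESCE (and add an ELSE) so missing or unauthorized rows contribute 0 instead of NULL. Treat this as a sketch built on the posted formula, not a confirmed answer from that thread.

```sql
-- Variation of the posted beast mode; COALESCE and ELSE 0 are my additions.
SUM(
    CASE
        WHEN `rpt_job.authorized` = 1
            THEN COALESCE(`rpt_job.subtotal_sales`, 0) / 100
        ELSE 0
    END
)
```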