Comments
-
Works perfectly, thank you for your help!
-
-
The SQL worked but that field just came back blank...
-
Using Redshift, this is my code:
SELECT `Campaign`, `Date`, `Revenue`,
       CASE WHEN `DateLen` < 19000101 THEN NULL
            ELSE date(to_char(TRIM(`DateLen`), "99999999"))
       END AS DateEXTRACTDTE
FROM `df_ga_int_emarsys_campaign_attribution_sql_prep`
And I'm getting the error: The database reported a syntax error: [Amazon](500310)…
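For reference, here's the direction I'm planning to try next - from what I can tell, Redshift doesn't accept backticks around names, wants single quotes around the format mask (double quotes mean an identifier there), and has a TO_DATE that takes a string plus a format. This is just a sketch, not tested, and assumes DateLen really is in YYYYMMDD form:
SELECT "Campaign", "Date", "Revenue",
       CASE WHEN "DateLen" < 19000101 THEN NULL
            ELSE TO_DATE(TRIM(CAST("DateLen" AS VARCHAR)), 'YYYYMMDD')
       END AS DateEXTRACTDTE
FROM df_ga_int_emarsys_campaign_attribution_sql_prep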
-
Thanks - I'm trying that and running into an error: The database reported a syntax error: FUNCTION to_char does not exist. See attached screenshot.
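Could it be that this transform is actually running on MySQL rather than Redshift? That would explain to_char not existing. If so, this is the MySQL-style version I'm going to try - purely a guess on my part, and the YYYYMMDD assumption comes from my data:
SELECT `Campaign`, `Date`, `Revenue`,
       CASE WHEN `DateLen` < 19000101 THEN NULL
            ELSE STR_TO_DATE(TRIM(`DateLen`), '%Y%m%d')
       END AS DateEXTRACTDTE
FROM `df_ga_int_emarsys_campaign_attribution_sql_prep`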
-
Thanks! I knew that it had that function, but I wasn't sure how to write the code so that it'd put out a new column containing the length. So there's no print function? Just "length = " ?
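For my own notes, this is how I ended up getting the length into its own column - no print needed, the SELECT just outputs it as a new column. Assuming LENGTH() is the right function here, and DateLenChars is just my made-up name for the new column:
SELECT `DateLen`,
       LENGTH(TRIM(`DateLen`)) AS DateLenChars
FROM `df_ga_int_emarsys_campaign_attribution_sql_prep`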
-
I'd prefer to use Python so I can keep it all part of one ETL flow rather than chunk it out into another dataset in MySQL, and a Beastmode won't work because I need to aggregate up within the ETL.
-
Got it, thank you. Also - turns out I didn't need the timezone conversion in there... some wonkiness going on in the DOMO ETL stuff there. Thanks for your help!
-
Actually, sorry - I was a little hasty in my joy. Looks like some of them are rounding up - any idea why that'd happen? It looks like if it goes over the 15- or 45-minute mark it rounds up - I still need those to round down.
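In case it helps, here's the flooring approach I'm planning to test - the 30-minute interval is my assumption, and the field/table names (order_datetime, my_dataset) are just placeholders. The idea is to truncate to the previous half hour instead of rounding to the nearest one:
SELECT `order_datetime`,
       FROM_UNIXTIME(FLOOR(UNIX_TIMESTAMP(`order_datetime`) / 1800) * 1800) AS FlooredHalfHour
FROM `my_dataset`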
-
Got everything to work as desired, thank you for your help!
-
Thanks Chris. Sorry to ask for such a simple solution here but I can't seem to find it on my own - I'm actually in the Pacific time zone - can you tell me the code for that?
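For anyone else looking: this is what I'm trying for the Pacific conversion - assuming a MySQL dataflow where CONVERT_TZ and the named time zones are available, and that the source times are stored in UTC. The column and table names here are placeholders:
SELECT `order_datetime`,
       CONVERT_TZ(`order_datetime`, 'UTC', 'US/Pacific') AS OrderDatetimePacific
FROM `my_dataset`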
-
Thanks - almost there but the hours don't seem to be correct.
-
Thanks Chris. It's actually stored in DOMO as a Date-Time field - will this still work?
-
Thanks Aaron, this worked perfectly. I didn't need the last statement in there (it actually threw an error), but it worked well enough without it. Thanks again!
-
So the total query would look like this?
SELECT `date`
      ,`ordernumber`
      ,SUM(`price`) as 'Total Retail'
      ,SUM(`qtyordered`) as 'Total Units'
      ,GROUP_CONCAT(DISTINCT `pricecatweb` ORDER BY `pricecatweb`) as 'Mix'
FROM `df_int_sql_prep_sale_type`
GROUP BY `date`
        ,`ordernumber`
Thanks so much for your help by the way - I really…
-
If I do the ORDER BY before the GROUP_CONCAT, will it first organize the values then remove the duplicates? I'm just trying to eliminate the possible combinations, as when this dataset ran it came back with some 10k+ unique values due to all of the possible ways items were purchased.
-
Will using that ORDER BY statement organize the contents of each value, or just organize the dataset itself?
-
Yes! Perfect! Thank you so much for your help. Edit: Sorry, I was a little hasty in my response. This is close, but now I have another issue. It's treating the order as significant; for instance, "Regular price, sale price" comes back as different from "Sale price, regular price." Is there a way to account for this?
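Following up on my own edit - if I'm understanding it right, putting the ORDER BY inside the GROUP_CONCAT sorts the values within each concatenated string, so "Regular price, sale price" and "Sale price, regular price" both come out in the same order and collapse to a single Mix value. Roughly:
SELECT `ordernumber`,
       GROUP_CONCAT(DISTINCT `pricecatweb` ORDER BY `pricecatweb` SEPARATOR ', ') AS Mix
FROM `df_int_sql_prep_sale_type`
GROUP BY `ordernumber`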
-
No, I'm using an ETL to join together several datasets to reference off an order-line level dataset (very granular). After joining those together, I use the group by function in ETL for the actions which I've attached.
-
Hi Aaron, Yes, that's correct. I used the 'Combine Strings by ,' operator. Would I need to apply this SQL statement to my original input dataset instead of the one in ETL? I tried running a SQL transform on the ETL output, but ran into an error (my SQL is pretty limited). Thanks for your help! I feel like this is 98% to a…
-
BUMP. Anybody have any ideas for the above?
-
Thanks Darius. In the image below, I highlighted the ultimate value I'm trying to get to, which is "score." I also highlighted the hierarchy it sits within, but I'm not sure how (or which levels) to reference in order to exclude everything else (e.g. runWarnings, audits, etc., which all sit at a higher level).
-
Thanks Dani. I've reviewed the DOMO connector guide already and that doesn't really help, and I'm not technically savvy enough to figure out what I need from the API guide (it's a little too technical). Specifically what I'm looking for is just the syntax for these fields:
-
I'm having the same issue - hoping there's a solution to this!