AS Coach

Comments

  • Also, a privacy setting exists to only allow the owner and administrators to edit the dataflow. This is very helpful and is set on each individual dataflow.
  • We have several TVs connected to computers just for this purpose. @n8isjack-ret do you know if there are plans for something like a Chromecast or Alexa or Roku channel?
  • One thing we've done to help in this regard is to add business day columns to the affected datasets. We have a separate calendar dataset that records, for each date, which business day of the month, quarter, and week it is. We combine that data into our other, transactional data (there's a rough sketch of that join after this comment list). Then it's a simple math operation to find…
  • Have you checked out the Magic ETL collapse/uncollapse functions? They're pretty useful for pivoting a data grid.
  • Yes. In the Properties menu, under "Value Scale (X)", scroll down to the bottom to find the Log Scale box and check it.
  • Hm. I mean, if the transforms run in edit mode but not in full runs, then based on the message, maybe there's a data size issue. Are you using the proper indexes, too? I hope Support has something to share. Let us know!
  • I'm guessing you successfully ran the transforms while in edit mode, and then this error came up? Does the same error appear with subsequent attempts to manually run the dataflow? This message seems like an AWS system problem more than anything, not a configuration or syntax issue on your part.
  • Those two connector names are either the same (SMTP is an email transfer protocol), or different versions of the same. I wonder if they changed the name on the back end but kept everything else mostly the same.
  • That seems like a strange requirement. I'd push back on it if you can. If that doesn't work, my next suggestion is to break up the file, load the pieces individually to Domo, and then union them back together with a dataflow or data fusion (there's a quick union sketch after this comment list).
  • Agreed with the responses above. Even if you can, I'd question whether that's something you want to maintain. Maybe let us know why you have 100 sources to begin with; there may be solutions to pre-combine that data before loading it into Domo.
  • Grouped bars and the like have different aggregation dimensionality than single bars, but they both use the single row limit feature. You're trying to limit to aggregated groups, like top 5 reps but showing their products sold, compared to just top 5 reps in the single bars. Domo's row limit is trying to give you 5 highest…
  • As far as I know, yes. I've never seen Workbench care about the size of a file. As I mentioned before, one of our large CSVs is placed in an SFTP folder and is uploaded every day by Workbench.
  • Such a good question and a big need to fulfill. Domo's out-of-the-box cards aren't very good at nested, hierarchical financial statements. And comparing row-level values against aggregated values can be difficult as well. Frankly, I've used Domo app store apps or custom apps for the best financial reporting in Domo. In case…
  • That's a good call-out. We also only allow admins access to Workbench (3 or 4 of us), and we usually have separation of duties over different jobs, so we don't often have those collisions. If we potentially do, we'll contact the other admins to make sure nobody else is making changes. We, too, have overwritten recent changes…
  • Upload your spreadsheets individually and then create a dataflow. Using Magic ETL or a SQL dataflow, you can link your files on common columns, like customer ID or employee ID, just as you would in a vlookup (there's a sketch of this after the comment list). In the same Magic ETL or SQL dataflow, you can select which columns you want as part of your final output (a fourth,…
  • If you haven't yet, you could try using Domo's Workbench application. I'm unaware of any size limitations. Every day we use Workbench to upload a CSV that's over 300MB.
  • I'm not familiar with that connector. Perhaps another community member or Domo support has an answer to that.
  • Are you still having that problem? I was, too, until recently. Now the side panel only opens sometimes.
  • I usually get this error message if someone else is logged in to my server. It's never stopped existing jobs from running, but I haven't tried saving anything while I have this error. You might consider speaking with the Workbench experts at Domo.
  • To generate a token you have to be an administrator in Domo. On the Admin page, under the Security menu option, the first submenu option is Access Tokens. This is where you can provision (and revoke) tokens.
  • How are you loading that data to Domo? Does the dataset not update the first time it gets to zero, or the first time it goes from zero to zero? Domo Workbench has a configuration setting to "upload even if data hasn't changed." Also, there's another setting to be aware of. From the knowledge base: Clear Domo DataSet if…
  • Domo doesn't handle delta data all that well yet, but they recently released into beta a dataflow option to help speed up dataflows. Instead of processing the entire inputs from scratch, Domo has introduced a setting that lets you process just the appended portion. On the back end, Domo has each append batch assigned its own…
  • Hi Oliver. We have a situation at our company where we have sales goals for products, but sometimes no actual sales are made (per month, or per rep, or per whatever), and we still want to know. So in a dataflow, I create a matrix of all the dimension possibilities and fill in the gaps with the actual data where it exists (there's a rough sketch of this after the comment list).…
  • Not that I'm aware of, but beastmode administration is something they're working on, and the "shared" flag is probably an attribute available in it. Your current workaround would be to just go to a dataset and open a card and evaluate each beast mode individually to find whether the checkbox has been set.
  • You probably have a value in the column that's too long, or the datatype is a varying length for each value, so the indexing has trouble identifying unique values. Check out: https://techjourney.net/mysql-error-1170-42000-blobtext-column-used-in-key-specification-without-a-key-length/ (a couple of example statements are sketched after this comment list). Maybe try forcing the column to a certain…
  • We're going on 4 years now and have had both success and struggles with the same. As far as sharing content goes, we've moved almost exclusively to PDPs and away from publication groups. The consistency helps on an organization level, and it's more secure. The challenge is making sure policies are consistent across…
  • I think one of the parentheses is out of order. And we can eliminate the *100 since the card can be formatted to improve readability. CASE WHEN `override_payee_id` = 'MEADBIMT' THEN AVG(.15) ELSE SUM(`total_charge` - `override_pay_amt` - IFNULL(`NewAmount`,0)) / SUM(IFNULL(`total_charge`,0)) END
  • There may be an issue with aggregated outputs versus nonaggregated outputs. Aggregations in case statements can be tricky. We could simplify this a little bit also. CASE WHEN `override_payee_id` = 'MEADBIMT' THEN AVG(.15) ELSE (SUM( `total_charge` - `override_pay_amt` - IFNULL(`NewAmount`,0) /…
  • Try taking that case statement out of the division. This might not work exactly as scripted, but use this for concept: CASE WHEN `override_payee_id` = 'MEADBIMT' THEN .15 ELSE SUM( CASE WHEN IFNULL(`NewAmount`,0) = 0 THEN `total_charge` - `override_pay_amt` ELSE `total_charge` - `override_pay_amt` - `NewAmount` END ) /…
  • I'm not sure how your data is set up, but you're comparing the ID of the individual truck to the date. That doesn't really compute. Counting unique 'true' values will give you either a 0 or a 1, which probably isn't correct either. Maybe you need something more like COUNT(DISTINCT(CASE WHEN [shipment date column] =… (the general shape of that pattern is sketched just after this comment list)
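
A rough sketch of the business-day calendar approach mentioned in the comment above about adding business day columns. This is only a SQL dataflow illustration, not the exact transform: the dataset names (business_calendar, transactions) and column names are made up, so swap in your own.

    -- Hypothetical SQL dataflow transform: copy business-day columns from a
    -- calendar dataset onto every transactional row by matching on the date.
    SELECT
        t.*,
        c.`business_day_of_month`,
        c.`business_day_of_quarter`,
        c.`business_day_of_week`
    FROM `transactions` t
    JOIN `business_calendar` c
        ON DATE(t.`transaction_date`) = c.`calendar_date`

With those columns on each row, questions like "how did we do through business day 5 of this month versus last month" become simple filters and arithmetic.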
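
For the break-up-the-file suggestion, a minimal union sketch, assuming the large file was loaded as three smaller datasets with identical column layouts (the part names are invented; Magic ETL's Append Rows tile or a data fusion accomplishes the same thing):

    -- Hypothetical: stitch the separately loaded pieces back into one output.
    -- UNION ALL keeps every row; the columns must line up across the parts.
    SELECT * FROM `big_file_part_1`
    UNION ALL
    SELECT * FROM `big_file_part_2`
    UNION ALL
    SELECT * FROM `big_file_part_3`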
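
The vlookup-style join described in the comment about uploading spreadsheets individually could look something like this in a SQL dataflow (dataset and column names here are placeholders; the Join Data tile in Magic ETL is the drag-and-drop equivalent):

    -- Hypothetical: two uploaded spreadsheets joined on a shared key, keeping
    -- only the columns wanted in the final output.
    SELECT
        e.`employee_id`,
        e.`employee_name`,
        s.`total_sales`
    FROM `employee_roster` e
    LEFT JOIN `sales_by_employee` s
        ON e.`employee_id` = s.`employee_id`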
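
A sketch of the "matrix of all the dimension possibilities" technique from the reply to Oliver, with made-up dataset names: a cross join builds every rep-and-month combination, and a left join fills in actual sales where they exist, so combinations with no sales still show up as zero.

    -- Hypothetical SQL dataflow transform.
    SELECT
        r.`rep_id`,
        m.`month_start`,
        IFNULL(a.`actual_sales`, 0) AS `actual_sales`
    FROM `reps` r
    CROSS JOIN `months` m
    LEFT JOIN `actual_sales_by_rep_month` a
        ON  a.`rep_id` = r.`rep_id`
        AND a.`month_start` = m.`month_start`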
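
For the MySQL error 1170 comment, "forcing the column to a certain length" might look like one of the statements below in a SQL dataflow transform, run before the index is added. The table and column names are assumptions.

    -- Hypothetical option A: give the TEXT column a fixed maximum length so it can be indexed.
    ALTER TABLE `staging_table`
        MODIFY COLUMN `customer_key` VARCHAR(50);

    -- Hypothetical option B: keep the column as-is, but index only a prefix of it.
    ALTER TABLE `staging_table`
        ADD INDEX `idx_customer_key` (`customer_key`(50));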
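
The last comment's formula is cut off, so this isn't a reconstruction of it, just the general shape of counting distinct trucks that shipped on a given date, written as a Beast Mode-style aggregate. `shipment_date` and `truck_id` are placeholder column names, the comparison to CURDATE() is only an example, and the comment lines should be dropped if this is pasted into a Beast Mode editor.

    -- Hypothetical: rows that don't match the condition return NULL, and
    -- COUNT(DISTINCT ...) ignores NULLs, so only matching trucks are counted.
    COUNT(DISTINCT
        CASE WHEN `shipment_date` = CURDATE()
             THEN `truck_id`
        END
    )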