Community Forums

DomoDork Contributor

Overview

Roles
Member
Badges: 12

About Me

Not much happening here, yet.

Analytics

92
509
351

Comments

  • @ARosser - I had something similar to this and the best way I've found to do this was to get my AD group data into a dataset. Then I could use MagicETL to parse out people into my own domo groups and output that to a dataset. I used the Governance Toolkit to generate domo groups based on my dataset. You can read about that…
  • @MayaU_01 - I've seen this behavior before as well and I think it has to do with how views interpret PDP in scenarios where you have two datasets, views on those datasets and then combining said views into a final view. For example, if Dataset 1 has a PDP policy restricting data by 'Department', but dataset 2 has a policy…
  • This is available. We built a view that combines a few domostats datasets together that tells us the view counts of every card on every app along with the total number of views for each app and app page/tab. Just join together the following domostats datasets: App Studio App Pages App Studio Apps Card Loads Card Pages…
  • From what I've heard through the grapevine, this is something being looked at. We have the same issue and we need more granular permissions so that we can give some roles 'admin-like' access to some features without giving them all the permissions an admin would have. If I recall what I heard is correct, work is being done…
  • I agree with this. Domo should have a feature, let's call it 'Magic Data Model', that works like MagicETL from a UI perspective, but is a way of connecting your inputs similar to Microsoft Analysis Services, Snowflake or any other snowflake-style schema typically used in cubes. Then when it runs, it crunches through it,…
  • @WHM - I've requested the same. I'd just like to see 'WHERE' clause features on input tiles so I can filter directly from the source. This would also reduce ETL processing time and load on downstream systems trying to feed all the data to Domo before it passes through a filter tile.
  • @haydentaylor - you're a madman! It's amazing to see an idea come to life so quickly :) Amazing work.
  • Join expressions are fantastic! However, I still agree with @ArborRose. A simple checkbox for case insensitive joins would be very useful even with the new expression features.
  • I would also include an option on a per dataset basis to hide the data tab in Analyzer. We have a scenario right now where we want people to be able to create cards on an underlying dataset but not allow them to see underlying raw data (regardless of PDP) but permissions aren't granular enough. To get around this at the…
  • Hi @sebastian_galindo There is an undocumented API endpoint you can hit to search/query datasets based on various criteria. I don't have a working example, but if you click on Data in the UI, right-click to open the Inspector, go to Network and spy on traffic, you will see that Domo hits the following API whenever you add a…
  • I agree with the idea that there needs to be a new JSON No Code (JWT) connector to supplement the standard connector and the OAuth connector. I'm actually running into this same requirement and making a custom connector or using Jupyter/python seems like overkill for something that is pretty standard from a JSON/API…
  • Hi @PMLeema - it's a private client system. But all they want is for us to send data to them via a webhook that requires the header "Authorization: <apikey>". But the JSON Writeback connector doesn't allow you to supply any custom headers so we get a permissions failure when we try to send them data.
  • Hi @damen, Since beast modes are real-time evaluations of data, lookups aren't really something a beast mode would support. Instead I would use MagicETL to do your 'lookup' logic and output that 2nd mortgage amount as an additional column on your dataset.
  • I wonder if the rows at the end of your range in Excel include a bunch of blank rows. What happens if you open the xlsx file, go to the first empty row at the bottom of your data, highlight all rows below it, right-click → delete rows, then re-save and re-import into Domo?
  • The way I approach this with workbench is to treat any data I bring into domo as a stage via replace instead of append. Then I use a recursive MagicETL with partition processing to take that data from stage and merge it into my final output dataset. @jaeW_at_Onyx has a good series of videos on this topic below. I hope this…
  • I 100% agree with this. If we could group tiles together along with colorize tiles, it would make complex ETLs much easier to work with. I would also extend the colorizing tiles idea such that when you colorize tiles and run a preview to see sample data from each tile it also colors the column headers. It would make it…
  • Hi @pauljames I have had this happen to me twice in the past. The first time it was because the value wasn't actually a NULL but instead a blank. The second time was when I was actually trying to filter out blanks (not NULLs) and there ended up being an invisible character in my blanks that caused the filter to misinterpret…
  • @ArborRose Absolutely. You can target individual datasets, change datatypes, remove columns, rename etc.
  • @ArborRose if you have access to the admin area, there is a schema manager in the governance toolkit to adjust the schema to your liking. We have to do that from time to time when using some connectors because Domo gets it wrong.
  • I've heard that this year there is no livestream but that a link will be sent out for viewing when the event finishes.
  • @ArborRose - I caught that and changed it right after posting. Still getting the same result. Looking at the network inspector, what I seem to be getting back during the preview is an internal server 500 error from Domo when it tries to execute the curl command. Starting to think I need to go back to the client and see if…
  • @ArborRose according to their Python API sample, it should be "Authorization: Basic <base64 encoded id:secret>". Looking at cURL documentation, this is the correct and expected way to specify basic authentication in a header: curl \ --header "Content-Type: application/json" \ --header "Authorization: Basic $AUTH" \ So in…
  • We had this limit in one of our datasets. The only way around it was to ask our CSM to increase the column size limit. They can do it on a case-by-case basis.
  • Agreed, this would simplify PDP automation so much and is very much needed.
  • @Jmoreno - I just did a quick test and I was able to pull in the incident data. Just use the JSON No Code Connector, setup a dummy JSON Account as part of the connector configuration (the URL itself doesn't seem to need any authentication but the connector requires an account to auth with), pop in the URL and it will pull…
  • Not sure about subscribing, but it looks like you could in theory use the No Code JSON connector to pull the incident data into a dataset. From there maybe create an alert when new records are detected.
  • I thought about that, but we were really trying to avoid that kind of credit consumption. :)
  • Hi @TMonty0319, This should do the trick: DATE_FORMAT(STR_TO_DATE(TRIM(SUBSTRING(<your date field>,INSTR(<your date field>,',')+1)), '%d %b %Y %H:%i %p'),'%Y-%m-%d %H:%i:%s') This finds the first occurrence of the comma in your string and removes the day abbreviation since you don't really need that (since the day, month and…
  • I would also add that when we used Inline Editing, if you hide columns as part of the editor configuration, it physically deletes the columns from the backend dataset when I don't think it should. It causes a rewrite of the entire table schema and ruins the upsert keys defined on the datasets tied to the editor.
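The case-insensitive join workaround discussed in these comments boils down to lowercasing the join keys on both sides before matching. A minimal sketch in plain Python (the field names and sample rows are made up for illustration; this is the general trick, not a Domo feature):

```python
def ci_join(left, right, key):
    """Join two lists of dicts on `key`, ignoring case in the key values --
    the manual workaround until a case-insensitive join checkbox exists."""
    # Index the right side by lowercased key for O(1) lookups.
    index = {row[key].lower(): row for row in right}
    return [
        {**l, **index[l[key].lower()]}  # right-side fields win on conflict
        for l in left
        if l[key].lower() in index
    ]

left = [{"id": "ABC", "x": 1}]
right = [{"id": "abc", "y": 2}]
print(ci_join(left, right, "id"))
```

Note that the right side's casing of the key survives the merge; swap the dict-unpacking order if you want the left side's casing instead.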
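For the webhook comment (the client that requires an "Authorization: <apikey>" header, which the JSON Writeback connector can't supply): a stdlib-only sketch of building the POST yourself. The URL, key, and payload shape are all placeholders, not the client's actual API:

```python
import json
from urllib import request

def build_webhook_post(url: str, api_key: str, rows: list) -> request.Request:
    """Build a POST carrying the custom Authorization header.
    Payload shape {'rows': [...]} is an assumption -- match the client's spec."""
    body = json.dumps({"rows": rows}).encode()
    return request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": api_key,  # raw key, per the client's requirement
        },
        method="POST",
    )

req = build_webhook_post("https://example.com/webhook", "my-api-key", [{"id": 1}])
print(req.get_header("Authorization"))
# Send with: request.urlopen(req)
```

From a Domo workflow you would run something like this in a Python script tile or Jupyter workspace, since the connector itself has no custom-header option.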
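The trailing-blank-rows cleanup suggested for the Excel import can also be done programmatically instead of by hand-deleting rows in the sheet. A small sketch (rows modeled as lists of cells; how you read the xlsx is up to you):

```python
def trim_trailing_blanks(rows):
    """Drop trailing rows where every cell is empty or whitespace --
    the same effect as deleting the blank rows at the bottom of the sheet."""
    end = len(rows)
    while end > 0 and all(str(cell or "").strip() == "" for cell in rows[end - 1]):
        end -= 1
    return rows[:end]

rows = [["a", "b"], ["c", ""], ["", ""], [None, " "]]
print(trim_trailing_blanks(rows))
```

Only fully blank trailing rows are removed; a row with any real value, even mid-file blanks, is kept.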
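For the invisible-character problem (values that look blank but fail a blank filter): a quick way to see what is actually in a suspect cell is to dump the code points. A minimal sketch:

```python
def reveal_hidden(value: str) -> str:
    """Show the real contents of a 'blank' value as Unicode code points --
    non-breaking spaces, zero-width spaces, etc. all show up here."""
    return " ".join(f"U+{ord(ch):04X}" for ch in value)

looks_blank = "\u00a0\u200b"   # NBSP + zero-width space, for illustration
print(looks_blank == "")        # not actually empty
print(reveal_hidden(looks_blank))
```

Once you know which character is lurking, you can strip or replace it upstream (e.g. with a replace-text tile in MagicETL) before the blank filter runs.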
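The basic-auth header discussed in the cURL comment ("Authorization: Basic <base64 encoded id:secret>") can be built like this; the id and secret are obviously placeholders:

```python
import base64

def basic_auth_headers(client_id: str, secret: str) -> dict:
    """Build the headers basic authentication expects: 'id:secret',
    base64-encoded, prefixed with 'Basic '."""
    token = base64.b64encode(f"{client_id}:{secret}".encode()).decode()
    return {
        "Content-Type": "application/json",
        "Authorization": f"Basic {token}",
    }

headers = basic_auth_headers("my-id", "my-secret")
print(headers["Authorization"])
```

This matches what curl's `--user id:secret` flag produces under the hood, so comparing the two is a good sanity check when a 500 comes back.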
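The DATE_FORMAT/STR_TO_DATE beast mode above does the same thing as this Python sketch: drop everything up to the first comma (the day abbreviation), parse the remainder, and re-emit it as YYYY-MM-DD HH:MM:SS. The input shape "Mon, 02 Jan 2023 3:45 PM" is an assumption; adjust the strptime pattern to your actual field:

```python
from datetime import datetime

def normalize_timestamp(raw: str) -> str:
    """Strip the leading day abbreviation and reformat the timestamp."""
    _, _, rest = raw.partition(",")  # mirrors SUBSTRING(..., INSTR(...)+1)
    parsed = datetime.strptime(rest.strip(), "%d %b %Y %I:%M %p")
    return parsed.strftime("%Y-%m-%d %H:%M:%S")

print(normalize_timestamp("Mon, 02 Jan 2023 3:45 PM"))
# → 2023-01-02 15:45:00
```

Handy for verifying the beast mode's output against a known value before trusting it across the whole dataset.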


Badges

  • Second Anniversary
  • 25 Up Votes
  • 5 Agrees
  • 5 Answers
  • First Anniversary
  • 5 Likes
  • 5 Up Votes
  • 5 Awesomes
  • 10 Comments
  • First Answer
  • Name Dropper
  • First Comment