Comments
-
Under Chart Types, select Tables from the drop-down. You're currently looking at the Popular charts; it'll be under Tables.
-
You can use a custom date dimension / period dataset to calculate the different metrics for each date, then use a beast mode to create the columns for last month, last year, last quarter, etc. I've done a write-up on this method here:…
-
@louiswatson You can use a Magic ETL with the Alter Columns tile to convert the data type of each of your columns, then build your cards off the converted dataset instead of using beast modes / calculated fields. This way the conversion is only done once instead of creating multiple calculated…
-
Assuming you're using the Phoenix DDX scatter plot example here, you can specify the colors you're graphing in the options dictionary and then pass that into your Chart call when charting your graph. const customColors = [ '#002159', '#03449E', '#0967D2', '#47A3F3', '#BAE3FF' ]; var options = { colors: customColors,…
-
You'll need to reach out to Domo Support as this is likely an issue in the back end storing the data after it's received your data.
-
Did you update the schema via the API, or just add the new field to the dataset you're attempting to upload? You need to make sure the dataset schema includes the new field, otherwise the upload will fail. Domo has an example of this in their pydomo documentation. What program are you using to access the APIs?
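To illustrate the point about registering the new column before uploading: a minimal sketch of building the updated schema payload for Domo's public DataSet API (PUT /v1/datasets/{id}). The dataset id, column names, and `add_column` helper here are made up for illustration; only the payload shape follows the API docs.

```python
DATASET_ID = "00000000-0000-0000-0000-000000000000"  # placeholder id

def add_column(schema: dict, name: str, col_type: str) -> dict:
    """Return a copy of the schema payload with one extra column appended."""
    cols = list(schema["columns"]) + [{"type": col_type, "name": name}]
    return {"columns": cols}

# Schema as the dataset currently knows it (sample columns).
current = {"columns": [{"type": "STRING", "name": "customer"},
                       {"type": "DOUBLE", "name": "amount"}]}

# Add the new field to the schema definition before uploading rows that use it.
updated = add_column(current, "region", "STRING")

# With an OAuth token in hand you would then send it, e.g. with requests:
# requests.put(f"https://api.domo.com/v1/datasets/{DATASET_ID}",
#              json={"schema": updated},
#              headers={"Authorization": f"Bearer {token}"})
```

Once the schema update succeeds, the subsequent data upload with the extra column should no longer fail.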
-
I've had an idea in the Idea Exchange for a while to have Domo allow admins to just go directly to the page without having to request access to the page. Feel free to up vote it here: https://dojo.domo.com/main/discussion/53435/add-grant-access-button-to-pages-for-admins
-
Alternatively just allow admins to automatically load into the page without having to request access or have it added to the page list.
-
\r and/or \n should work, but it appears they don't. For the time being you can explicitly pass an enter character in your formula tile: REGEXP_REPLACE(`address`, '~', ' ')
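The literal newline in the formula above may have been flattened to a space by the forum; the same substitution in Python makes the intent clearer (the '~' placeholder and sample address are made up):

```python
import re

# Replace a '~' placeholder with a real newline -- the same substitution the
# REGEXP_REPLACE formula performs inside the Magic ETL formula tile.
address = "123 Main St~Springfield"
result = re.sub("~", "\n", address)
```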
-
Is it the querying that's taking a long time, or the data upload to Domo? Are you joining to another table that's causing the null values? If so, when you removed the null column, did you only exclude the field from the query itself, uncheck the column to upload in the schema, or remove the table join as…
-
In your Magic ETL, group by the Employee and get the Max Sales, then join that back to your original dataset on employee and sales amount. That will get you the date without having to use an aggregate on the date field.
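The two ETL steps above, sketched in plain Python (the sample rows are made up): group by employee for the max amount, then "join" back on employee + amount to recover the date of each top sale.

```python
sales = [
    {"employee": "Ann", "amount": 120, "date": "2022-01-05"},
    {"employee": "Ann", "amount": 340, "date": "2022-02-11"},
    {"employee": "Bob", "amount": 200, "date": "2022-01-20"},
]

# Step 1: Group By tile -- max sale amount per employee.
max_by_employee = {}
for row in sales:
    prev = max_by_employee.get(row["employee"])
    if prev is None or row["amount"] > prev:
        max_by_employee[row["employee"]] = row["amount"]

# Step 2: Join tile -- match original rows on employee + amount,
# which carries the date along without aggregating it.
top_sales = [r for r in sales
             if r["amount"] == max_by_employee[r["employee"]]]
```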
-
Can you direct users to the same page / dashboard with a dataset containing all of the different users data but apply PDP to the dataset to filter it so that they only see their data and not anyone else's?
-
You can use a beast mode in analyzer or a formula tile in Magic ETL 2 and the COALESCE function to change NULL values to another value: COALESCE(`dt`, '2022-02-02')
-
@Jmoreno You can't do a mass update via the UI. If you wanted to write a script to interface with the APIs it would be an option but would require a very technical resource.
-
Because it's using a client ID and secret against the public API endpoints, you'll likely need to set the host domain to api.domo.com
-
Hi @ravikirand You can sync an AppDB to a dataset in Domo. You need to set the syncEnabled value to true and define a dataset ID and schema. You can read up more on it here: https://developer.domo.com/docs/dev-studio-references/appdb
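As a rough sketch of what that sync configuration involves: a collection definition with sync enabled, a target dataset id, and a schema. The field values and schema below are placeholders; check the exact shape against the AppDB docs linked above.

```python
# Hypothetical AppDB collection definition with dataset sync turned on.
# The dataset id and column list are placeholders for illustration.
collection = {
    "name": "my_collection",           # placeholder collection name
    "syncEnabled": True,               # enable AppDB -> DataSet sync
    "datasetId": "REPLACE-WITH-ID",    # target dataset id (placeholder)
    "datasetSchema": {                 # columns the sync should write
        "columns": [
            {"name": "id", "type": "STRING"},
            {"name": "value", "type": "DOUBLE"},
        ]
    },
}
```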
-
@willrnt The BATCH_LAST_RUN field is built into the API connector code to populate when the data is imported. You'd have to replicate that in your API program. An alternative is to use an ETL that runs whenever your dataset is updated to automatically add the timestamp.
-
What browser are you using? Have you tried using a different browser?
-
The upload-s3 command doesn't support headers and will import the entire file. If you don't want the headers, you could either remove them from the initial file or use an ETL to strip out the header rows, update the column data types to the correct types, and then append the result to the main dataset.
-
You can use the create-dataset command in the CLI along with the -i parameter, specifying the existing dataset's ID, to update the schema of an existing dataset. Doing this will drop the column from your definition, and it will be unavailable moving forward. You could make a copy of your original dataset as a backup first…
-
You can't change the dataset schema using the front end. What should happen to the existing values in the two columns you're wanting to drop? Do you still need access to that data, or do you want it permanently removed?
-
@andres I'd recommend logging a bug with Domo Support. That message shouldn't appear if there's no data. The No Data message should appear instead.
-
Hi @Sheeraz89 You'll need to join your two datasets together with a Magic ETL dataflow using the join tile based on the course ID (alternatively you can use a DataSet View and join the two tables together). This will combine the two datasets together and allow you to pull in the course name. Once you have that new dataset…
-
The script would have to be on the same server as the workbench installation for it to work. Alternatively you could have a separate script to ping the status of the SQL job to determine if it's done and then kick off the workbench process to ingest the data.
-
Workbench provides the wb.exe executable, which allows you to run jobs from a command line or a script. You can call it from your script once the processing is finished. You can read up on running Workbench from the command line here:…
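One way to structure that poll-then-run flow in Python: wait for the SQL job to report done, then launch Workbench. The wb.exe path and arguments in the usage note are placeholders (check the Workbench command-line docs for the real flags), and the status check is a stub you'd replace with a query against your job history.

```python
import subprocess
import time

def wait_then_run(is_job_done, command, poll_seconds=60, max_polls=120,
                  runner=subprocess.run):
    """Poll until the SQL job finishes, then kick off Workbench.

    is_job_done: callable returning True once the SQL job has completed
                 (e.g. a query against your job-history table).
    command:     the Workbench invocation to launch when it's done.
    """
    for _ in range(max_polls):
        if is_job_done():
            return runner(command)
        time.sleep(poll_seconds)
    raise TimeoutError("SQL job never finished")

# Usage sketch (path and flags are placeholders -- see the Workbench docs):
# wait_then_run(check_sql_job,
#               [r"C:\Program Files\Domo\Workbench\wb.exe", "run-job", "42"])
```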
-
There isn't a direct connection from Domo into Google Data Studio. You could try and write your own connector in GDS to attempt to communicate with Domo (https://developers.google.com/datastudio/connector/build)
-
Are you attempting to connect directly to the database via Domo? Did you whitelist the IP address on the Database server or your bastion server?
-
Hi @ozarkram I'd suggest using a date offset dimension table so you have the metrics for the prior year based on a specific date, then using a beast mode to bucket your numbers properly. I've done a writeup on this in the past here:…
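The date-offset idea in a nutshell: each report date gets one row per comparison period, so a single date filter pulls in both current and prior-year rows, and a beast mode can then bucket on the period label. A minimal sketch, with made-up column names and dates:

```python
from datetime import date

def offset_rows(d):
    """For one report date, emit a current-period row and a row pointing
    the same report date at its prior-year equivalent."""
    return [
        {"report_date": d, "data_date": d, "period": "current"},
        {"report_date": d, "data_date": d.replace(year=d.year - 1),
         "period": "prior year"},
    ]

# Build the dimension for a couple of sample dates.
dim = [row for d in (date(2022, 3, 1), date(2022, 3, 2))
       for row in offset_rows(d)]

# Joining this dimension to the fact table on data_date lets a beast mode
# like SUM(CASE WHEN `period` = 'prior year' THEN `amount` END)
# bucket the prior-year metrics against the current report date.
```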
-
Snowflake's syntax is a bit different. Try something like: SELECT * FROM "TABLE" WHERE "ORDER_MODIFIED_TIME" > CURRENT_DATE() - INTERVAL '7 day'
-
You'll have to talk with Domo Support to see if there's anything they can do to increase the timeout as it'd be a back end process that would need to be changed.