Comments
-
Is it the querying that's taking a long time, or the data upload to Domo? Are you joining to another table, causing the null values? If so, when you removed the null column, did you only exclude the field from the query itself, uncheck the column to upload in the schema, or remove the table join as…
-
In your Magic ETL - Group by the Employee, get the Max Sales, join that to your original dataset based on employee and sales amount. That will get you the date without having to use the aggregate on the date field.
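The two-tile flow above can be sketched in plain Python. The column names (`employee`, `sales`, `sale_date`) are illustrative assumptions about your dataset, not the actual schema:

```python
# Sketch of the Magic ETL flow: a Group By tile taking the max sales per
# employee, then a Join tile joining back on employee + sales amount,
# which carries the date along without aggregating the date field.
sales = [
    {"employee": "Ann", "sales": 100, "sale_date": "2022-01-05"},
    {"employee": "Ann", "sales": 250, "sale_date": "2022-01-20"},
    {"employee": "Bob", "sales": 300, "sale_date": "2022-01-11"},
]

def max_sale_with_date(rows):
    # "Group By" tile: max sales per employee.
    max_by_emp = {}
    for row in rows:
        emp = row["employee"]
        if emp not in max_by_emp or row["sales"] > max_by_emp[emp]:
            max_by_emp[emp] = row["sales"]
    # "Join" tile: keep only the rows matching each employee's max.
    return [r for r in rows if r["sales"] == max_by_emp[r["employee"]]]

print(max_sale_with_date(sales))
```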
-
Can you direct users to the same page / dashboard with a dataset containing all of the different users data but apply PDP to the dataset to filter it so that they only see their data and not anyone else's?
-
You can use a beast mode in analyzer or a formula tile in Magic ETL 2 and the COALESCE function to change NULL values to another value: COALESCE(`dt`, '2022-02-02')
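For reference, the COALESCE pattern does the same thing as "first non-null value wins" — a quick Python sketch of that logic, using the same fallback date as the beast mode example:

```python
# COALESCE returns the first non-null argument, or null if all are null.
def coalesce(*values):
    for v in values:
        if v is not None:
            return v
    return None

print(coalesce(None, "2022-02-02"))         # NULL replaced by the default
print(coalesce("2022-05-01", "2022-02-02"))  # original value kept
```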
-
@Jmoreno You can't do a mass update via the UI. If you wanted to write a script to interface with the APIs it would be an option but would require a very technical resource.
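A rough sketch of what such a script might look like: loop over the target IDs and build one API request per item. The endpoint shape and payload fields here are assumptions to verify against the current Domo API docs — nothing is actually sent in this sketch:

```python
# Hedged sketch of a mass-update script: build one PUT request per dataset.
# The URL pattern and payload are illustrative assumptions, not a verified
# contract; check developer.domo.com before wiring this to a real HTTP client.
def build_update_requests(dataset_ids, new_description):
    requests = []
    for ds_id in dataset_ids:
        requests.append({
            "method": "PUT",
            "url": f"https://api.domo.com/v1/datasets/{ds_id}",
            "json": {"description": new_description},
        })
    return requests

reqs = build_update_requests(["abc-123", "def-456"], "Reviewed")
print(len(reqs))  # one request per dataset
```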
-
Because it's using a client ID and secret with the public API endpoints, you'll likely need to set the host domain to api.domo.com
-
Hi @ravikirand You can sync an AppDB to a dataset in Domo. You need to set the syncEnabled value to true and define a dataset ID and schema. You can read up more on it here: https://developer.domo.com/docs/dev-studio-references/appdb
-
@willrnt The BATCH_LAST_RUN field is built into the API connector code and is populated when the data is imported. You'd have to replicate that in your API program. An alternative is to utilize an ETL that runs whenever your dataset is updated to automatically add the timestamp.
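Replicating BATCH_LAST_RUN yourself amounts to stamping every row with the batch run time. A minimal sketch, assuming your rows are dicts and the column name matches the connector's:

```python
# Stamp each row with the time the batch ran, mimicking what the
# connector's BATCH_LAST_RUN column does on import.
from datetime import datetime, timezone

def stamp_batch_last_run(rows, run_time=None):
    run_time = run_time or datetime.now(timezone.utc)
    stamp = run_time.isoformat()
    return [{**row, "BATCH_LAST_RUN": stamp} for row in rows]

rows = [{"id": 1}, {"id": 2}]
print(stamp_batch_last_run(rows))
```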
-
What browser are you using? Have you tried using a different browser?
-
The upload-s3 command doesn't support headers and will import the entire file. If you don't want the headers you could either remove the headers from the initial file or utilize an ETL to strip out the header information rows, update the column data types to the correct types and then append it to the main dataset.
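The pre-processing step described above — strip the header row and cast the columns before appending — can be sketched like this. The two-column layout (`id`, `amount`) is an illustrative assumption about your file:

```python
# Drop the header row from CSV text and cast columns to the correct types,
# the same cleanup an ETL would do before appending to the main dataset.
import csv
import io

raw = "id,amount\n1,10.5\n2,20.0\n"

def strip_header_and_cast(text):
    reader = csv.reader(io.StringIO(text))
    next(reader)  # discard the header row
    return [(int(row[0]), float(row[1])) for row in reader]

print(strip_header_and_cast(raw))  # [(1, 10.5), (2, 20.0)]
```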
-
You can utilize the create-dataset command in the CLI along with the -i parameter to specify the existing dataset's ID to update the schema of an existing dataset. Doing this will drop the column from your definition, and the column would be unavailable moving forward. You could make a copy of your original dataset as a backup first…
-
You can't change the dataset schema using the front end. What should happen to the existing values in those two columns you're wanting to drop? Do you still need access to that data, or are you wanting it to be completely removed permanently?
-
@andres I'd recommend logging a bug with Domo Support. That message shouldn't appear if there's no data. The No Data message should appear instead.
-
Hi @Sheeraz89 You'll need to join your two datasets together with a Magic ETL dataflow using the join tile based on the course ID (alternatively you can use a DataSet View and join the two tables together). This will combine the two datasets together and allow you to pull in the course name. Once you have that new dataset…
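The join tile's behavior can be sketched in plain Python: enrich each enrollment row with the course name by looking it up on the course ID. The field names here are illustrative assumptions about your two datasets:

```python
# Join-on-course-ID sketch: one lookup dataset (courses) joined into the
# main dataset (enrollments) to pull in the course name.
courses = {"C1": "Intro to Domo", "C2": "Advanced ETL"}
enrollments = [
    {"student": "Ann", "course_id": "C1"},
    {"student": "Bob", "course_id": "C2"},
]

joined = [
    {**e, "course_name": courses.get(e["course_id"])}
    for e in enrollments
]
print(joined)
```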
-
The script would have to be on the same server as the workbench installation for it to work. Alternatively you could have a separate script to ping the status of the SQL job to determine if it's done and then kick off the workbench process to ingest the data.
-
Workbench provides the wb.exe executable, which allows you to run jobs via a command line or a script. You can call it from your script once your processing finishes to kick off the Workbench job. You can read up on running Workbench from the command line here:…
-
There isn't a direct connection from Domo into Google Data Studio. You could try and write your own connector in GDS to attempt to communicate with Domo (https://developers.google.com/datastudio/connector/build)
-
Are you attempting to connect directly to the database via Domo? Did you whitelist the IP address on the Database server or your bastion server?
-
Hi @ozarkram I'd suggest utilizing a date offset dimension table to allow you to have the metrics for the prior year based on a specific date and then utilize a beast mode to bucket your numbers properly. I've done a writeup in the past on this here:…
-
Snowflake has a bit different syntax. Try something like: SELECT * FROM "TABLE" WHERE "ORDER_MODIFIED_TIME" > CURRENT_DATE() - INTERVAL '7 day'
-
You'll have to talk with Domo Support to see if there's anything they can do to increase the timeout as it'd be a back end process that would need to be changed.
-
@JasonAltenburg It's a Trellis Chart. You can read more on those here: https://domohelp.domo.com/hc/en-us/articles/360043428713-Applying-DataSet-Columns-to-Your-Chart#6.
-
Can you just use additional segments in your group by to include the month and collection type?
-
Does the former owner still work there or do you just need to change ownership? If you want to change ownership of the user's cards to yourself you can use the Java CLI and the swap-owner command and have it swap just the cards owned by the original owner to yourself.
-
I've written up a very simplistic example of adding business days which doesn't take into account holidays or when the business may be closed. @Gordon_Pont's is a more correct solution that accounts for those holidays and weekends, but I wanted to share how you can roughly do it with a formula:…
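Since the formula itself is truncated above, here's a rough Python equivalent of the same simplistic approach — it skips Saturdays and Sundays but, as noted, ignores holidays entirely:

```python
# Simplistic business-day addition: walk forward one calendar day at a
# time, only counting weekdays. Holidays are deliberately not handled.
from datetime import date, timedelta

def add_business_days(start, days):
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Mon=0 .. Fri=4; weekends don't count
            days -= 1
    return current

print(add_business_days(date(2022, 2, 4), 3))  # Friday + 3 -> 2022-02-09
```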
-
Hi @bradw Due to how filters function within Domo you can't have a single filter column and multiple values applying to different cards. You could copy the date value into a different column and then have the other cards use the new column instead of the first column so they're separate values to filter on. Because you're…
-
Typically I'll do the same: pull in the raw tables with the actual field names, then consolidate the business logic and column renaming to a standard value in the ETL (Magic or a View). This way the same name is used across all cards, and I keep my raw tables unmodified for easier debugging.
-
You could utilize the Domo Data Governance Connector and pull in the Card Fields and Beast Modes dataset which will list all the fields used on a card and has identifiers if the field is a beast mode or not.
-
Are you wanting to manually set this column or should it be getting pulled from your pendo connection?
-
Hi @akeating I'd recommend you reformat your data in a friendlier way to allow you to more easily calculate Period over Period difference. I've done several write ups on this in the past with step by step instructions on how to accomplish this:…
