-
CASE WHEN for 0-30, 31-60 aging buckets
I'm building an accounts receivable visualization with a vertical bar graph, and I want to show buckets for the invoice dates: 0-30 days, 31-60, etc. But I can't get the graph working; it only shows 91+ days: ( CASE WHEN DATEDIFF('day',CURRENT_DATE(),INVOICE_DATE) >= 0 AND DATEDIFF('day',CURRENT_DATE(),INVOICE_DATE) <= 30…
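A likely culprit: in dialects with Snowflake-style semantics, DATEDIFF('day', start, end) returns end minus start, so DATEDIFF('day', CURRENT_DATE(), INVOICE_DATE) is negative for every past invoice and nothing matches the >= 0 buckets, leaving only a 91+/ELSE branch. Swapping the argument order should fix it. Here is a minimal pandas sketch of the intended bucketing, with made-up invoice dates, computing the age with the positive sign:

```python
import pandas as pd

# Hypothetical invoice dates; the real report would pull these from the AR dataset.
df = pd.DataFrame({"INVOICE_DATE": pd.to_datetime(["2024-01-05", "2024-03-20", "2024-04-28"])})

today = pd.Timestamp.today().normalize()
# Age must be today - invoice_date so that past invoices come out positive;
# DATEDIFF('day', CURRENT_DATE(), INVOICE_DATE) computes the opposite sign
# under Snowflake-style semantics, which is what pushes every row to 91+.
age_days = (today - df["INVOICE_DATE"]).dt.days

df["BUCKET"] = pd.cut(
    age_days,
    bins=[-1, 30, 60, 90, float("inf")],  # (0-30], (31-60], (61-90], 91+
    labels=["0-30", "31-60", "61-90", "91+"],
)
print(df)
```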
-
pydomo export_dataset - exporting Domo dataset with millions of rows failing
I have the code below, which exports a dataset to a string. I'm trying to export a dataset with 257 million rows, but I don't see any logs as to why it fails; it just fails! When I try the same code with a smaller dataset (250k rows), I'm able to export. Is there a way to programmatically break the dataset out into chunks? domo =…
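One way to chunk programmatically is to page through the dataset with pydomo's ds_query helper instead of exporting in one shot. A sketch, assuming the installed pydomo has ds_query (which returns a pandas DataFrame by default) and that the query API honors LIMIT/OFFSET; the credentials, dataset id, and chunk size are placeholders:

```python
import logging
from pydomo import Domo

# Placeholder credentials and dataset id.
domo = Domo('client-id', 'client-secret', api_host='api.domo.com')
dataset_id = 'dataset-id'

CHUNK = 1_000_000  # rows per request; tune to stay under payload limits
offset = 0
while True:
    # Domo's query API addresses the dataset as the literal name "table".
    df = domo.ds_query(dataset_id, f'SELECT * FROM table LIMIT {CHUNK} OFFSET {offset}')
    if df.empty:
        break
    logging.info('fetched %d rows at offset %d', len(df), offset)
    # ...write the chunk out here (CSV, parquet, a database, etc.)...
    offset += CHUNK
```

If LIMIT/OFFSET turns out not to be honored, the same loop can partition on a column instead, e.g. a date range in a WHERE clause.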
-
Using pydomo DataSetClient data_export() to export a dataset to a string
dataset = DataSetClient() include_csv_header = True datasets_parent = dataset.data_export(dataset_id, include_csv_header) # datasets_parent = self.pydomo.datasets.data_export(dataset_id, include_csv_header) logging.info(f'datasets_parent: {datasets_parent}') return datasets_parent but I keep seeing missing 2 required…
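The "missing 2 required…" error reads like DataSetClient being instantiated bare; in pydomo it appears to be constructed with a transport and a logger, which the top-level Domo client wires up for you (note the commented-out line above already points at self.pydomo.datasets). A sketch of the usual path, with placeholder credentials and dataset id:

```python
import logging
from pydomo import Domo

# Placeholder credentials. Domo() builds the transport and hands it to
# DataSetClient internally, so you never construct DataSetClient() yourself;
# doing so directly is what triggers the missing-arguments error.
domo = Domo('client-id', 'client-secret',
            logger_name='foo', log_level=logging.INFO,
            api_host='api.domo.com')

datasets_parent = domo.datasets.data_export('dataset-id', include_csv_header=True)
logging.info('datasets_parent: %.200s', datasets_parent)  # log only a prefix
```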
-
Go to next page in List of Users?
I'm using Python, and I'm able to get the first 500 users (the max per request, according to the docs), but how do I get the following 500, and so on? We have thousands! This is what I have so far: import requests import json url = 'https://api.domo.com/v1/users?limit=500' headers = {'Content-type': 'application/json', 'Accept':…
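The users endpoint pages with an offset query parameter alongside limit, so the next 500 come from offset=500, then 1000, and so on until a short page comes back. A sketch continuing the snippet above; the bearer token is a placeholder:

```python
import requests

headers = {
    'Content-type': 'application/json',
    'Accept': 'application/json',
    'Authorization': 'Bearer <access-token>',  # placeholder OAuth token
}

users, offset, limit = [], 0, 500
while True:
    resp = requests.get(
        'https://api.domo.com/v1/users',
        params={'limit': limit, 'offset': offset},
        headers=headers,
    )
    resp.raise_for_status()
    page = resp.json()
    users.extend(page)
    if len(page) < limit:  # a short page means we've reached the end
        break
    offset += limit

print(f'fetched {len(users)} users')
```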
-
pydomo ds_query and query unusable: Domo and Datasets objects have no such attribute
I have tried the following, with the same object-not-found issue both times: domo = Domo(client_id, client_secret, logger_name='foo', log_level=logging.INFO, api_host=apihost) datasets_parent = domo.ds_query(...) # tried with domo.query() too; same "'Domo' object has no attribute 'ds_query'" error. And I've tried: domo = Domo(client_id,…
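An AttributeError like this usually points at the installed package rather than the call itself; my guess is the environment has an older pydomo that predates the ds_query helper, so it's worth ruling that out first. A sketch, with placeholder credentials and dataset id:

```python
import logging
from importlib.metadata import version

# If this prints an old release, upgrade with: pip install --upgrade pydomo
print(version('pydomo'))

from pydomo import Domo  # re-import in a fresh session after upgrading

domo = Domo('client-id', 'client-secret', logger_name='foo',
            log_level=logging.INFO, api_host='api.domo.com')
# Once the helper resolves, Domo's query API addresses the dataset as "table".
df = domo.ds_query('dataset-id', 'SELECT COUNT(*) FROM table')
print(df)
```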
-
pydomo ds_query - API rate limit when a large dataset is broken out into many queries
Is there documentation on how many queries can be made and what the API rate limit is? I have an extremely large dataset that I have to break out into around 100 queries, but after the 10th query runs, it breaks.
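Failing consistently after the 10th query is the usual signature of a rate limit, assuming the API signals throttling with HTTP 429. Without the documented numbers at hand, the defensive fix is to retry with backoff; a sketch against the dataset query endpoint, with a placeholder token and dataset id:

```python
import time
import requests

def post_with_backoff(url, headers, body, max_retries=5):
    """Retry a POST when the server throttles with HTTP 429 (assumed here)."""
    for attempt in range(max_retries):
        resp = requests.post(url, headers=headers, json=body)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        # Honor Retry-After when the server sends it; otherwise back off 1s, 2s, 4s...
        wait = float(resp.headers.get('Retry-After', 2 ** attempt))
        time.sleep(wait)
    raise RuntimeError(f'still rate-limited after {max_retries} retries')

# Illustrative usage; token and dataset id are placeholders.
result = post_with_backoff(
    'https://api.domo.com/v1/datasets/query/execute/dataset-id',
    headers={'Authorization': 'Bearer <access-token>', 'Accept': 'application/json'},
    body={'sql': 'SELECT * FROM table LIMIT 100000 OFFSET 0'},
)
```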