Comments
-
Thanks for the help, guys. But after looking into Domo extensively, I have come to the personal conclusion that the ETL is very limited. You have to create several inefficient pipelines to accomplish a simple task that could be carried out in SQL, Qlik, or DAX. Having 4 types of ETL tools in one application limits your ability to improve…
-
Well, this is the solution Domo has provided us (I am referring to the different ETLs), via Domo consultants, so recreating it all in one big ETL is not an option; we have 1.9 billion rows in the final table. The timestamp is a good idea, but as Domo only lets you run the ETL once a day in the scheduler, and we do a daily load, it will not be…
-
So I tried the Python API; it only allows me to modify and update the datasets created via the Python API, which I find limiting. I also think there needs to be more documentation on the Python side. I have seen several posts with similar requests but no solution.
-
We have a limit on the number of rows in Domo, so these temp tables are eating into our quota. It is an inefficient use of tables; they hold no value to the business.
-
@jaeW_at_Onyx I don't want to delete the tables, I only want to delete the rows. That way I maintain the dataflow and the pipelines. It is inefficient because these are only temporary datasets and are not used in any cards; they just take up space. In SQL I can create the same tables virtually in memory and dispose of them the…
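For anyone landing here later, a rough sketch of the kind of workaround I mean: Domo's public REST API has a `PUT /v1/datasets/{id}/data` endpoint that replaces a dataset's contents, so pushing an empty CSV body should leave the dataset (and the dataflows pointing at it) in place with zero rows. This is a sketch, not an official Domo recipe; the client credentials and dataset ID are placeholders, I am assuming the endpoint accepts an empty body as "zero rows", and you should test it on a throwaway dataset first.

```python
import base64
import csv
import io
import json
import urllib.request

API_HOST = "https://api.domo.com"  # public Domo API host


def rows_to_csv(rows):
    """Serialize rows to the header-less CSV the data-import endpoint expects.

    An empty list yields an empty body, i.e. zero rows.
    """
    buf = io.StringIO()
    csv.writer(buf, lineterminator="\n").writerows(rows)
    return buf.getvalue()


def get_token(client_id, client_secret):
    """OAuth client-credentials flow against the public Domo API."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    req = urllib.request.Request(
        f"{API_HOST}/oauth/token?grant_type=client_credentials&scope=data",
        method="POST",
        headers={"Authorization": f"Basic {creds}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]


def clear_dataset_rows(token, dataset_id):
    """Replace the dataset's contents with zero rows.

    The dataset itself, its ID, and any dataflows that reference it
    are left untouched; only the stored rows go away.
    """
    req = urllib.request.Request(
        f"{API_HOST}/v1/datasets/{dataset_id}/data",
        data=rows_to_csv([]).encode(),
        method="PUT",
        headers={"Authorization": f"Bearer {token}", "Content-Type": "text/csv"},
    )
    urllib.request.urlopen(req).close()


# Usage (placeholder credentials and dataset ID, substitute your own):
# token = get_token("my-client-id", "my-client-secret")
# clear_dataset_rows(token, "0123abcd-4567-89ef-0123-456789abcdef")
```

If this works against your instance, it could be scheduled after the daily load so the temp datasets stop counting toward the row quota without touching the pipelines.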