If possible, how can I call a Stored Procedure or Function?
I am converting some dashboards from Tableau to Domo and have run into some obvious differences between the two. I'm trying to figure out how to pass a user-selected field from Domo to a stored procedure or function as an argument. Right now I am reading in the entire table of xx million rows, and I would like to be able to constrain the data the way the SP was designed to. I see a parameter option field, but I can't find any real documentation on how to use it, or whether this is what it's for.
Thank you
Best Answer
@ricsanders and @Jose_Rojas Domo does not really have the concept of stored procedures or functions available for execution in Magic dataflows.
You can create stored procedures in MySQL dataflows, but I strongly recommend you avoid them, because MySQL (v5.6) is significantly more effort to optimize and maintain if you're trying to hand dataflow pipeline design to other users.
For @ricsanders: we typically recommend building your dataset with all the users in it, then just using a Filter Card to select the user you want to display information for in your dashboard. I can appreciate that xx million rows might feel like a lot of wasted disk space if you have to read the entire table, but for Domo, millions of rows in a card will typically still yield acceptable end-user performance.
@Jose_Rojas, you may need to handle your scheduling external to Domo if you have a Snowflake procedure that needs to finish successfully BEFORE you ingest data into Domo. You might look at using Python to automate the check: "did my Snowflake procedure finish? If yes, trigger the Domo dataset execution."
Totally different use case, but here's an example of how you might build the LOOP structure.
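Here's a minimal sketch of what that loop could look like, assuming the procedure logs a SUCCESS row to a status table when it finishes. The table names (PIPELINE_STATUS, MY_RESULT_TABLE), the dataset id, and all credentials are placeholders, and the Domo push uses pydomo's data_import; adapt to however your environment tracks completion.

```python
# Hypothetical sketch: poll Snowflake until the procedure logs success,
# then push the result into an existing Domo dataset with pydomo.
# PIPELINE_STATUS, MY_RESULT_TABLE, and all credentials are placeholders.
import time
import snowflake.connector
from pydomo import Domo

sf = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="my_wh", database="my_db", schema="my_schema",
)
domo = Domo("client_id", "client_secret", api_host="api.domo.com")

def procedure_finished() -> bool:
    """True if today's run of the procedure has logged a SUCCESS row."""
    with sf.cursor() as cur:
        cur.execute(
            "SELECT COUNT(*) FROM PIPELINE_STATUS "
            "WHERE JOB_NAME = 'MY_PROC' AND STATUS = 'SUCCESS' "
            "AND RUN_DATE = CURRENT_DATE()"
        )
        return cur.fetchone()[0] > 0

# The LOOP: check every 5 minutes, give up after an hour.
for _ in range(12):
    if procedure_finished():
        with sf.cursor() as cur:
            cur.execute("SELECT * FROM MY_RESULT_TABLE")
            rows = cur.fetchall()
        # Naive CSV build for the sketch; use the csv module for real data.
        csv_body = "\n".join(",".join(map(str, r)) for r in rows)
        domo.datasets.data_import("your-dataset-id", csv_body)
        break
    time.sleep(300)
```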
Jae Wilson
Answers
I have the same issue, but I'm coming from SSIS to Domo. SSIS had the Execute SQL task, which runs SQL without needing to retrieve data. I need to run a Snowflake stored procedure from Domo ETL, and after that I'll use another connector to pull data from the table the stored procedure pushed the data into.
@jaeW_at_Onyx wow, thanks, those videos on creating Jupyter notebooks inside Domo seem to be the solution. We were thinking about having a separate ETL process in Airflow and calling Domo from there, but that would mean maintaining both Domo and Airflow, and we'd prefer to keep it to just one if possible.
So it will be:
use the Domo JSON connector to pull the dataset from the third-party REST API ->
use the Snowflake writeback connector to push the extracted data to our data lake ->
use a Jupyter notebook in Domo to connect to Snowflake and run the stored procedure (see the sketch below) ->
use the normal Snowflake connector to pull the resulting table created by the stored procedure into Domo
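For that third step, here's a minimal sketch of what the notebook cell might look like. The procedure name LOAD_DATALAKE_TABLE and the connection values are made up; note that CALL runs synchronously in the Snowflake Python connector, so the cell only finishes when the procedure does.

```python
# Hypothetical sketch for step 3, run inside a Domo Jupyter workspace.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="my_wh", database="my_db", schema="my_schema",
)
try:
    with conn.cursor() as cur:
        # CALL blocks until the procedure returns, so no polling is needed.
        cur.execute("CALL LOAD_DATALAKE_TABLE()")
        print(cur.fetchone())  # the procedure's return value / status message
finally:
    conn.close()
```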
Do we need to ask for the Domo Data Science package in order to be able to create a Jupyter notebook?
@Jose_Rojas yes, you would need to work with your AE to get the Jupyter Notebook add-on. If it were me, I would be VERY explicit and adamant that you do NOT need the Data Science Package.
IMHO, with the improvements in the Jupyter Data Science Notebook, there is no reason to use the Scripting tiles.
But I also think you're building a couple of unnecessary hops into your pipeline:
Instead of using Domo connectors to pull data from your REST API, just do it in Python (unless it's a very large dataset).
Instead of using the writeback connector to push the data to your data lake, just use the REST API in Python... unless you REALLY like the extra hops.
Instead of running the stored procedure in Snowflake... yup, you guessed it :D run it in Python.
Then use the dataset in Domo.
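A minimal sketch of that collapsed pipeline in a single notebook. The API URL, table, and procedure names are hypothetical; write_pandas ships with snowflake-connector-python's pandas extras, and auto_create_table needs a reasonably recent connector version.

```python
# Hypothetical sketch: the whole pipeline in one Domo Jupyter notebook.
import requests
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# 1. Pull from the third-party REST API directly in Python.
resp = requests.get("https://api.example.com/v1/records", timeout=60)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# 2. Push the extracted data straight to the data lake (Snowflake).
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="my_wh", database="my_db", schema="my_schema",
)
write_pandas(conn, df, "RAW_RECORDS", auto_create_table=True)

# 3. Run the stored procedure in the same session.
with conn.cursor() as cur:
    cur.execute("CALL TRANSFORM_RAW_RECORDS()")
conn.close()

# 4. The regular Snowflake connector in Domo then ingests the result table.
```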
Jae Wilson
@jaeW_at_Onyx Thanks a lot for your advice. I'm in an internal race competing against Airflow, and if I can prove Domo can do Jupyter, it will win the race: with Airflow we need to support the VM, the Linux install, and the Airflow setup, while with Domo we don't lose that many resources doing sysadmin tasks.
@jaeW_at_Onyx I appreciate the response. We are unfortunately just the consumers of the data and have nothing to do with that side of the data train! Our current solution has been to hoover in ALL of the data with no constraints, which is something like 27M records of historical data and takes 30+ minutes to refresh. I think we will soon implement a date range of nothing older than 90 days, which should solve all of our problems!
Again, thank you.
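For reference, that kind of rolling window usually amounts to a single WHERE clause on the extract query. A hypothetical sketch (Snowflake-style date syntax; table and column names are made up, so adjust for your database):

```python
# Hypothetical sketch: limit the extract to a rolling 90-day window instead
# of all 27M historical rows.
query = """
    SELECT *
    FROM my_source_table
    WHERE record_date >= DATEADD(day, -90, CURRENT_DATE())
"""
```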