2 Issues: MySQL SSH Connector - Query Parameter and Stream API
Sorry for the combo post, but I have two problems.
1. I'm using the MySQL SSH Connector with the QUERY option to pull data in from a table. Most of my pulls are just daily imports, but some tables are 1B rows, so doing a 'SELECT * FROM table;' is VERY inefficient. What I'd like is to use a parameter to pull in only the most recent data (using a date field if the source table has one, or a PK ID otherwise). The query parameter option looks like it was intended for exactly this, but I don't see how it would store a variable so the next import could reference it. I know the other option here is a recursive dataflow, but that will slow me down, so I'm trying to reduce waste where I can (a sketch of the kind of logic I mean is at the end of this post).
2. Am I right in thinking this may be a good case for the Stream API? The documentation makes it sound like it only works for very large flat files, which doesn't make sense to me, so I'm hoping I'm wrong about that. I'm not a programmer either, so any helpful advice here is VERY much appreciated.
Thanks everyone
Matt
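To make question 1 concrete, here's roughly the logic I'm hoping the connector could handle for me, sketched in Python with made-up table, column, and credential names (I know the connector just takes SQL; this only shows where the stored variable would fit):

```python
import pymysql

# Hypothetical connection details; table and column names are made up.
conn = pymysql.connect(host="127.0.0.1", port=3306, user="report_user",
                       password="secret", database="warehouse")

# The watermark (highest ID already imported) has to live somewhere
# between runs -- a plain file here, since the connector has nowhere
# to store a variable for me.
try:
    with open("last_id.txt") as f:
        last_id = int(f.read().strip())
except FileNotFoundError:
    last_id = 0  # first run: pull everything once

# Pull only rows newer than the watermark instead of re-reading the
# whole 1B-row table with SELECT *.
with conn.cursor() as cur:
    cur.execute("SELECT id, created_at, amount FROM big_table "
                "WHERE id > %s ORDER BY id", (last_id,))
    rows = cur.fetchall()
conn.close()

# Advance the watermark for the next run.
if rows:
    with open("last_id.txt", "w") as f:
        f.write(str(rows[-1][0]))  # id is the first selected column
```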
Best Answer
Answers:
1. No stored system variable - you have to track it yourself using a date/ID field and go from there, then build a recursive dataflow to just filter out the dupes.
2. Maybe - but you have to build code to export the source data to a flat file and then import it in your stream, which sucks (see the first sketch below).
Alternative: there is a Data Assembler option for big datasets - going to try it out (the 2B-row table may need it). Also, I got Workbench working through an SSH tunnel (PuTTY/ODBC/port forwarding), and that works OK sometimes, but it seems to fail at the most inopportune times without good error messages (the second sketch below shows the same port-forwarding idea in code). Closing this up.
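For point 2, here's a minimal sketch of the export-then-upload flow, assuming Domo's documented Stream API endpoints (create an execution, upload a CSV part, commit) and a stream that already exists for the target dataset; the credentials, stream ID, and query below are all placeholders:

```python
import csv, io, requests, pymysql

# Placeholders: Domo client credentials and the ID of a stream that was
# already created against the target dataset.
CLIENT_ID, CLIENT_SECRET = "my-client-id", "my-client-secret"
STREAM_ID = 42
API = "https://api.domo.com"

# 1. Get an OAuth token (client-credentials grant with the 'data' scope).
token = requests.post(
    f"{API}/oauth/token",
    params={"grant_type": "client_credentials", "scope": "data"},
    auth=(CLIENT_ID, CLIENT_SECRET),
).json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}

# 2. Export the source rows to CSV (in memory here; a temp file works too).
conn = pymysql.connect(host="127.0.0.1", user="report_user",
                       password="secret", database="warehouse")
buf = io.StringIO()
writer = csv.writer(buf)
with conn.cursor() as cur:
    cur.execute("SELECT id, created_at, amount FROM big_table")
    for row in cur:
        writer.writerow(row)
conn.close()

# 3. Start an execution, upload the CSV as part 1, then commit.
execution = requests.post(f"{API}/v1/streams/{STREAM_ID}/executions",
                          headers=headers).json()
requests.put(
    f"{API}/v1/streams/{STREAM_ID}/executions/{execution['id']}/part/1",
    headers={**headers, "Content-Type": "text/csv"},
    data=buf.getvalue().encode("utf-8"),
)
requests.put(
    f"{API}/v1/streams/{STREAM_ID}/executions/{execution['id']}/commit",
    headers=headers,
)
```

The payoff for huge tables is that the CSV can be split into many parts and uploaded in parallel before the single commit, which is why the docs pitch the Stream API at very large loads.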
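And since the PuTTY/ODBC alternative is just SSH port forwarding, here's the same idea scripted with the sshtunnel package, which can help isolate whether the flaky failures are in the tunnel or in Workbench itself; hosts, keys, and names are placeholders:

```python
import pymysql
from sshtunnel import SSHTunnelForwarder

# Hypothetical hosts/credentials. The tunnel forwards a local port to the
# MySQL port on the far side of the SSH host, same as the PuTTY setup.
with SSHTunnelForwarder(
    ("ssh.example.com", 22),
    ssh_username="tunnel_user",
    ssh_pkey="/path/to/key.pem",
    remote_bind_address=("127.0.0.1", 3306),  # MySQL as seen from the SSH host
    local_bind_address=("127.0.0.1", 3307),   # what Workbench/ODBC would point at
) as tunnel:
    conn = pymysql.connect(host="127.0.0.1", port=tunnel.local_bind_port,
                           user="report_user", password="secret",
                           database="warehouse")
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM big_table")
        print(cur.fetchone())  # if this works, the tunnel itself is fine
    conn.close()
```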