Datafusion Experiences and Limitations?
I'm trying to find ways to optimize the performance of my data flows, and one thing I've been thinking about is leveraging DataFusion to connect data.
Basically I have a base data set from our admin that adds a new day of data each day, about 65k rows per day. After 15 months of data (since the beginning of 2017) we're at around 30 million rows. In an attempt to improve performance I want to limit what has to be processed each day, and my thought is to remove all of our Salesforce data from this base data set and then use a DataFusion to combine the two. The Salesforce data is about 2,000 rows of unique information that is mapped to the larger 30 million rows, and that mapping currently takes a few hours using ETL.
Has anyone used DataFusion to combine a large data set with a smaller one in this fashion, and how was the overall performance? If it works for a data set this large, I may reconfigure all of our data sets this way, so it would be great to get insights into other people's experiences. I figure removing 15 columns from the base data set will improve performance, and adding those columns back via DataFusion would reduce overall processing time.
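Conceptually, the fusion would just be a left join of the small Salesforce set onto the large base set on a shared key. A minimal pandas sketch of that idea (the file names and the account_id key are placeholders, not our actual schema):

```python
import pandas as pd

# Placeholder inputs: the stripped-down base set (join key kept, Salesforce
# columns removed) and the ~2,000-row Salesforce descriptive set.
base = pd.read_csv("base_daily_append.csv")               # ~30M rows total
salesforce = pd.read_csv("salesforce_descriptives.csv")   # ~2,000 rows

# What the fusion would do conceptually: attach the small descriptive
# set to every matching row of the large base set.
fused = base.merge(salesforce, how="left", on="account_id")
```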
Also, I assume DataFusion data sets cannot be used in other ETLs or Redshift dataflows, but confirmation on that would be helpful as well.
Thirdly, why can't we get a DataFusion that appends data to an existing data set but can run on a loop? This would be for adding rows as opposed to columns, obviously.
Comments
Hi,
Is anyone able to help out with this request?
Thanks,
Anyone have experience using Datafusion?
I have not had experience with DataFusion. However, have you considered splitting up your main data flow? Rather than rewriting all 30 million rows, have you considered an append? I'm assuming you are adding ~65k rows each day. Do the historical rows change as well? If not, you could run the data flow only for the most recent day and then, in a separate dataflow, append the output to your complete data set.
I'm already appending the daily base data sets on top of each other. That data will never change. Using DataFusion is what I think would make sense for attaching the descriptives that can change over time. Contractual terms, ownership of the entity, and who is working the account are all things that can change from day to day.
So I guess the question now becomes: do you want to know what those descriptives are currently, or what they were at the time in question?
My suggestion is to take your 65k rows each day and run them through the data flow that attaches these "descriptives" from Salesforce. You would then have a complete data set for that day, which you would append to the full data set. By doing this you would be able to see who was working an account on the day in question, rather than just who is currently working the account. Also, because you would only be running the joins on a single day's worth of data at a time, it should not take as long to process.
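To make the pattern concrete, here is a rough pandas sketch of that daily routine (the file names and the account_id key are made up for illustration; in practice this would be a Domo dataflow rather than a script):

```python
import pandas as pd

# Placeholder inputs for one day's run.
today = pd.read_csv("base_2018_04_01.csv")                # ~65k new rows for the day
salesforce = pd.read_csv("salesforce_descriptives.csv")   # ~2,000 descriptive rows

# Join only the single day's rows against the small Salesforce set,
# so the expensive join never touches the 30M-row history.
today_enriched = today.merge(salesforce, how="left", on="account_id")

# Append the enriched day to the full historical data set.
history = pd.read_csv("base_history.csv")
history = pd.concat([history, today_enriched], ignore_index=True)
history.to_csv("base_history.csv", index=False)
```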
Just current. Historicals don't really matter to the current analysis. I have logged both historical and current in the past, but I'm moving to just current since no one has a need for the past book of business.
Have you attempted this with a DataFusion yet? As I mentioned earlier, I don't have much experience with that tool. However, for data sets of this size, I would tend to rely on a Redshift data flow over the ETL. The limitations of DataFusion (the output cannot be used in another dataflow, and you cannot apply PDP rules to a DataFusion) have prevented me from using it much.
Totally understand those limitations.
1. We don't use PDP.
2. Since the base data set will be its own dataflow that can be repurposed, I'm indifferent to the other limitation you mention.
I would say go for it then.
Create a new DataFusion to join the two data sets and compare; I'm curious how the two approaches would perform against one another. If you do get it working, please post how the DataFusion performs versus the ETL data flow.