Exporting Large Datasets by date?

Hi, I have a specific use case and I'm afraid I've hit a limitation of Domo. I need to export a dataset and have no access to the cloud. The dataset has too many rows for Domo to handle, and it times out every time I run even a simple query like `select * from dataset where date = 'sample date'`. I'm looking to break this huge dataset up into multiple cards (a one-time operation).

 

I only have my computer with 16 GB of RAM. My question: without running some kind of scaled cloud environment, how can I get this data out and into another card, given that it fails in the platform?

Comments

  • Hello,

     

    Have you tried using Redshift queries in Domo to produce your datasets? How much data are we talking about here?

     

    Thanks,

     

    Brian


    **Please mark "Accept as Solution" if this post solves your problem
    **Say "Thanks" by clicking the "heart" in the post that helped you.
  • Hi, I tried simple queries but nothing works; everything times out. Do you have a suggestion for querying pieces of the dataset? Something similar to querying partitions in BigQuery would be nice.

     

    I have a few billion rows

     

    Thanks and much appreciated

  • Without seeing the actual history of your dataflow, I am assuming the timeout is happening on the actual loading of the dataset, not on the query itself, correct?

     

    If this is the case, you will need to break up the raw dataset before uploading it to Domo. You could also try creating a Domo Support ticket to see if they have another alternative.
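    For example, if the raw file is a CSV, something like this minimal Python sketch could split it by date before upload. (This is an illustration, not a Domo tool: the `date` column name and the output file naming are placeholders you would adapt.) It streams row by row, so memory use stays flat even for very large files:

```python
import csv

def split_csv_by_date(src_path, date_column="date", prefix="chunk"):
    """Split a large CSV into one smaller CSV per distinct value
    in date_column. Returns the list of chunk file paths written."""
    writers = {}  # date value -> (file handle, csv writer)
    paths = []
    with open(src_path, newline="") as src:
        reader = csv.DictReader(src)
        for row in reader:
            key = row[date_column]
            if key not in writers:
                path = f"{prefix}_{key}.csv"
                fh = open(path, "w", newline="")
                w = csv.DictWriter(fh, fieldnames=reader.fieldnames)
                w.writeheader()  # repeat the header in every chunk
                writers[key] = (fh, w)
                paths.append(path)
            writers[key][1].writerow(row)
    for fh, _ in writers.values():
        fh.close()
    return paths
```

    Each chunk then uploads as its own (much smaller) dataset.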

     

    Thanks,

     

    Brian


  • How can I break it up? That's exactly what I am trying to do, but it doesn't work. I tried separating it into cards with no luck.


    @Property_Ninja wrote:

    Without seeing the actual history of your dataflow, I am assuming the timeout is happening on the actual loading of the dataset, not on the query itself, correct?

     

    If this is the case you will need to break up the raw dataset before uploading it to Domo. You could also try creating a Domo support ticket and see if they have some other alternative. 

     

    Thanks,

     

    Brian




  • How are you importing the data into Domo? You will need to break up the data before you even import it into Domo. 

     

    Thanks,

     

    Brian


  • So yes, that is the problem. Going forward it is no issue to break it up. The problem is that the data is already in there and nowhere else, so I need to get the data out of Domo. I can fix it easily on the way in, but the potential data loss is the issue I am trying to find a solution to. Any ideas?

     

    Thanks very much

  • Hmm ... the only thing I can think of that might work is downloading the file from Domo as a CSV and then using Domo Workbench to load the data back into Domo via CSV. You can upload it in chunks using the "Ignore Starting Rows" and "Ignore Ending Rows" settings under "Source."
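    To plan those settings, a small sketch like this can compute the (ignore-starting, ignore-ending) pairs that carve the file into fixed-size chunks. (Assumption: "Ignore Starting Rows" skips that many rows from the top and "Ignore Ending Rows" that many from the bottom; check how your Workbench version counts the header row before relying on it.)

```python
def chunk_ignore_settings(total_rows, chunk_size):
    """Return (ignore_starting_rows, ignore_ending_rows) pairs that
    split a file of total_rows data rows into consecutive chunks of
    at most chunk_size rows each."""
    settings = []
    start = 0
    while start < total_rows:
        end = min(start + chunk_size, total_rows)
        # skip everything before the chunk, and everything after it
        settings.append((start, total_rows - end))
        start = end
    return settings
```

    For instance, 10 data rows in chunks of 4 gives (0, 6), (4, 2), (8, 0): three Workbench jobs that together cover every row exactly once.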

     

     

    Hopefully this helps,

     

    Brian


This discussion has been closed.