Append not finding new rows after initial import

bdavis
bdavis Contributor

Ok, so I'm using a date field in my replacement variable. I initially set its value to a date just before the start of my range in the data. I include the date field in my SELECT, WHERE, and ORDER BY clauses. In my WHERE clause I added an upper date cap to help troubleshoot the job not updating (... AND resolve_time < '2019-04-01'). When I run the job, it works. The proper data is imported into Domo, the replacement variable updates its value - life is good. I then remove the date cap from my WHERE clause, change the job to APPEND, and re-run it. This should find the rows that exist with April dates. However, no rows are processed this time, and the replacement value doesn't change.


Any ideas on this? Has anyone run into this issue and managed to get around it? I'm stumped.

Best Answer

  • bdavis
    bdavis Contributor
    Answer ✓

    @cwolman wrote:

    Could you share your sql?


    Sure!


    SELECT i.id, i.resolve_time, u.department_id
    FROM issues i
    LEFT JOIN users u ON i.uid = u.id
    WHERE i.resolve_time > '!{lastvalue:resolve_time}!'
    ORDER BY i.resolve_time ASC


    EDIT: I feel like it's not interpreting the replacement value properly: it compares it to the resolve_time datetime and says "Yup, you're greater than this undetermined replacement value." The replacement value is stored as "1/1/2010 4:00:00 PM". I tried using STR_TO_DATE with no luck (the replacement value doesn't update), and I've tried reformatting my resolve_time field to match the date format the replacement variable uses... After trying many different things, I'm now wondering what I've overlooked in my haste to resolve this.
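    A quick sanity check outside of Domo (Python here purely for illustration; this is my assumption about what's going wrong, not anything Workbench documents) suggests the stored value isn't in the YYYY-MM-DD HH:MM:SS form a DATETIME comparison understands, so the bare replacement variable never matches anything:

    ```python
    from datetime import datetime

    # The value Workbench stores for the replacement variable (from above)
    stored = "1/1/2010 4:00:00 PM"

    # Try to read it in the YYYY-MM-DD HH:MM:SS form a DATETIME comparison expects
    try:
        datetime.strptime(stored, "%Y-%m-%d %H:%M:%S")
        print("parsed cleanly")
    except ValueError:
        # This branch fires: the stored format doesn't match
        print("stored value is not in YYYY-MM-DD HH:MM:SS form")
    ```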


    2ND EDIT: I found a solution (I believe). SQL is below.

    SELECT i.id, i.resolve_time, u.department_id
    FROM issues i
    LEFT JOIN users u ON i.uid = u.id
    WHERE i.resolve_time > DATE_FORMAT(STR_TO_DATE('!{lastvalue:resolve_time}!', '%c/%e/%Y %l:%i:%s %p'), '%Y-%m-%d %H:%i:%s')
    ORDER BY i.resolve_time ASC
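    If it helps anyone, those MySQL format tokens (%c month, %e day, %l 12-hour, %i minute, %s second, %p AM/PM) roughly map to Python's %m/%d/%I/%M/%S/%p, so you can sanity-check the STR_TO_DATE/DATE_FORMAT round trip outside of SQL. Just an illustrative sketch, not part of the job itself:

    ```python
    from datetime import datetime

    # The value Workbench stores for the replacement variable
    stored = "1/1/2010 4:00:00 PM"

    # Parse the stored format (MySQL '%c/%e/%Y %l:%i:%s %p' ~ Python '%m/%d/%Y %I:%M:%S %p'),
    # then re-emit it in the form a DATETIME comparison understands
    parsed = datetime.strptime(stored, "%m/%d/%Y %I:%M:%S %p")
    print(parsed.strftime("%Y-%m-%d %H:%M:%S"))  # 2010-01-01 16:00:00
    ```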

Answers

  • I feel like there is an echo here; I think this is the same issue that @cwolman brought up earlier today. If the input datasets haven't changed when a dataflow is scheduled to update, it won't run. I would recommend adding an input dataset that does update at the interval you need. You don't even need to include any fields from the new dataset, and that should fix your issue.


    “There is a superhero in all of us, we just need the courage to put on the cape.” -Superman
  • bdavis
    bdavis Contributor

    This is a Workbench job, and I noticed the issue because the source data did change, but the Workbench import did not pick up the new rows.

  • cwolman
    cwolman Contributor

    Could you share your sql?


    -----------------
    Chris

This discussion has been closed.