How to create a backlog trend card?

I'm trying to create a card that shows the trend of case completion against a backlog each month (based on open/close dates).  Ideally, it would be a multi-line chart with 'month-yr' on the X-axis and COUNT of cases on the Y-axis, for the following series:

  • New =  COUNT of cases opened during month
  • Closed = COUNT of cases closed during month
  • Backlog = COUNT of total remaining open (no close date) at end of the month

Is this possible?  Not really sure how to tackle the aggregation of the backlog each month with beast mode.
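
For concreteness, here's a rough sketch of the numbers I'm after, written in pandas with made-up column names just to show the math (the backlog at end of month is effectively cumulative opens minus cumulative closes):

```python
import pandas as pd

# Toy data - column names here are placeholders for my real open/close date columns
cases = pd.DataFrame({
    "open_date":  pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-03", "2024-02-15"]),
    "close_date": pd.to_datetime(["2024-01-25", "2024-02-10", None, "2024-02-20"]),
})

opened = cases["open_date"].dt.to_period("M").value_counts().sort_index().rename("New")
closed = cases["close_date"].dropna().dt.to_period("M").value_counts().sort_index().rename("Closed")

trend = pd.concat([opened, closed], axis=1).fillna(0)
# Backlog at end of month = everything opened so far minus everything closed so far
trend["Backlog"] = (trend["New"] - trend["Closed"]).cumsum()
print(trend)
```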

Best Answer

  • Billobi
    Billobi Contributor
    Answer ✓

    I’d do some additional ETL before doing this visualization.  There are two issues that would make date-based beast gymnastics in the card a pain:

     

    • Open.Date and Closed.Date are likely two different columns in your dataset (assuming each row represents a Case).  The x-axis can only look at one column's worth of dates, which we have to specify either in a beast mode or in ETL.  I'd also suspect it's possible for a Case to open and close in the same month, and you'd want the count of that case reflected in both the Opened and Closed counts.
    • The count of cases that are open at the end of each month is a point-in-time value that should remain static from end of month onward.  I've been much happier handling point-in-time info by taking a snapshot (monthly scheduled dataset refresh) on the last day of each month and appending it into the primary dataset.  Otherwise you get into a lot of logic that continually re-assesses whether each case was open at a point in time...it's ugly.  Plus, I often get questions like "what was the backlog at the end of Month X," and it's easier to go back and look at prior snapshots than to try to recreate history.

     

    Here’s a review of the ETL approach I’d take to tackle these two issues.

    1. Create new ETL.
    2. Start with your cases dataset that runs on regular intervals as the input dataset.
    3. Branch it right away into two parallel paths with the Filter Rows transform.  One of them will filter on Open.Date “is not null”, the other will filter on Closed.Date “is not null”.
    4. Use the Add Constants transform in each path to create a new, identically named column called “Row Status”.  Set the type to Text and the value to “Opened” in the open-filtered path and “Closed” in the closed-filtered path.  Also add a constant of 1 and call that column “Key Count”.
    5. Next use Value Mapper transform in each path to create a new column named “Key Date”.  In the opened path, it makes that new column based on the Open.Date.  In the closed path, it makes the new column based on Closed.Date.  (note – you could also use Select Columns to just change the names of these columns, but I prefer Value Mapper so you can keep the original data intact for better drill-through capability later on)
    6. At this point you’ve got an opened path in your ETL with a row count (and sum of Key Count) equal to all the cases that have been opened, and a closed path with a row count equal to all the cases that have been closed.  Append the two paths back together and output the dataset (a rough sketch of this reshape follows just below the list).
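
    Outside of the GUI, the reshape in steps 3–6 boils down to filter / tag / stack.  Here's a rough pandas sketch purely to illustrate the logic — the column names Open.Date and Closed.Date are placeholders for whatever your dataset actually uses:

```python
import pandas as pd

def reshape_cases(cases: pd.DataFrame) -> pd.DataFrame:
    # "Opened" path: one row per case that has an open date
    opened = cases[cases["Open.Date"].notna()].copy()
    opened["Row Status"] = "Opened"
    opened["Key Count"] = 1
    opened["Key Date"] = opened["Open.Date"]

    # "Closed" path: one row per case that has a close date
    closed = cases[cases["Closed.Date"].notna()].copy()
    closed["Row Status"] = "Closed"
    closed["Key Count"] = 1
    closed["Key Date"] = closed["Closed.Date"]

    # Append the two paths; a case opened and closed in the same month
    # lands in both series, which is what you want
    return pd.concat([opened, closed], ignore_index=True)
```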

    You could now use “Key Date” as your x-axis, Sum of Key Count as your y-axis, and “Row Status” as your series.  Make sure this is treating you right, then proceed to the next piece which adds the historical snapshot.

     

    1. Make another dataset, “Case Snapshot”, that’s identical to the Case dataset you used as the input in the ETL above, and set it to run late at night on the last day of every month.  Be sure its scheduling is set up to use the Append update method instead of Replace.
    2. Add "Case Snapshot" as a new input to the ETL above. 
    3. Apply a Filter transform to only include rows where Closed.Date “is null”. 
    4. Add Constants for a text column called "Row Status" set to “EOM Open Cases”, and another column called "Key Count" with a value of 1.
    5. Then use Value Mapper to create a “Key Date” column based on BATCH_DATE. 
    6. Lastly, take this path and add it as a third input to your append.  (If the append isn’t lining up the dates properly, you may also need a Column Type transform to ensure BATCH_DATE has the same date format as Open.Date and Closed.Date.)  A sketch of this snapshot branch follows below.
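
    And the snapshot branch in the same illustrative style, assuming the appended "Case Snapshot" rows carry a BATCH_DATE stamped at each month-end run:

```python
import pandas as pd

def snapshot_backlog_rows(snapshot: pd.DataFrame) -> pd.DataFrame:
    # Keep only cases still open when the snapshot was taken
    eom_open = snapshot[snapshot["Closed.Date"].isna()].copy()
    eom_open["Row Status"] = "EOM Open Cases"
    eom_open["Key Count"] = 1
    # Key Date becomes the batch date; cast it so it appends cleanly
    # alongside Open.Date / Closed.Date
    eom_open["Key Date"] = pd.to_datetime(eom_open["BATCH_DATE"])
    return eom_open

# combined = pd.concat([reshape_cases(cases), snapshot_backlog_rows(snapshot)],
#                      ignore_index=True)
```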

     

    There are other ways to do this using Group By to assemble counts for the various conditions, but those wouldn’t let a user drill into any details.

     

    If you need to recreate past EOM Open Cases counts and aren’t worried about drills, you could add a simple webform as another input to the ETL and append it, with each row representing an EOM period.  You’d need to make sure the column types match so the append works properly, but each row would just be a Row Status of “EOM Open Cases”, a Key Count of whatever value you had logged as the open case count at that EOM, and a Key Date of the last day of that month.
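
    The backfill rows would look something like this (dates and counts below are made up, just to show the shape the append expects):

```python
import pandas as pd

backfill = pd.DataFrame({
    "Row Status": ["EOM Open Cases", "EOM Open Cases", "EOM Open Cases"],
    "Key Count":  [42, 51, 47],   # whatever you had logged as open at each EOM
    "Key Date":   pd.to_datetime(["2023-01-31", "2023-02-28", "2023-03-31"]),
})
```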

     

    Good luck!

    Bill

Answers


  • Thank you so much for the very detailed solution.  I was finally able to spend the time working through it, and for the most part everything works.  Three things to mention:

    1. Value Mapper doesn't work for making the Key Date columns.  Originally I was able to do this with Combine Columns plus a Convert Type to Date/Time.  However, I ended up switching to a Date Operation (since I needed it for the next item) that just adds 0 days - which gives the same result without the extra type-conversion step.
    2. Using the batch date for the backlog makes sense but poses some additional challenges depending on timezones.  I was able to use a date operation to address this (rough illustration after this list).
    3. I do care about the historical counts, so I'll need to figure out the best way to get that data together.
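
    For anyone hitting the same timezone issue: the fix amounted to shifting the UTC batch timestamp into our local timezone before taking the date, along these lines (the timezone name is just an example):

```python
import pandas as pd

batch_utc = pd.to_datetime("2024-02-01 06:30:00").tz_localize("UTC")  # late-night EOM run, stored in UTC
key_date = batch_utc.tz_convert("US/Pacific").normalize().tz_localize(None)
print(key_date)  # 2024-01-31 00:00:00 -> lands back on the intended month-end
```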
This discussion has been closed.