How do I identify the second lowest value? I.e. MIN('COLUMN') +1?

I'm trying to sort my API calls by time and label the order in which they took place, but I'm not sure how to identify anything beyond the MIN time. Is there a way to identify the MIN and then look at the second, third, or fourth value thereafter?

 

Ideally, my Beast Mode would then label each call with the order they came in by date and user.

 

Example:

MIN('TIME') THEN '1'

MIN('TIME')+1 THEN '2'

MIN('TIME')+2 THEN '3'

ETC.

 

Thank you!

Best Answer

  • Unknown
    Answer ✓

    In order to get that numbering of ordered rows you're looking for, you'll need to use a dataflow. You can accomplish that result using any of the dataflow methods: Magic ETL, SQL (MySQL), or SQL (Redshift).

     

    If you prefer MySQL or Redshift, check out this doc: http://knowledge.domo.com?cid=rank. However, I recommend Magic ETL because it is the easiest to set up. 

     

    To create a column of ordered row numbers in Magic ETL:

    • Create a new Magic ETL dataflow
    • Drag an "Input Dataset" tile onto the canvas and configure it by selecting your source dataset
    • Drag a "Rank & Window" tile onto the canvas and configure it
      • Click the "Add Function" button
        • In box 1, name the output column (the column that will contain the row numbers)
        • In box 2, select "Row Number"
        • Click "Apply"
      • In box 2, select the column that should determine the row ordering
      • In box 3, select Ascending or Descending. Since you want to start with the earliest time, you'll need to use "Ascending"
      • Finally, box 4 is optional. If you wanted to partition the row numbering by one or more columns, this is where you would do it. For example, perhaps you want to row-number records for each API name independently, so you could identify the earliest call to API 1 separately from the earliest call to API 2. You could accomplish that result by selecting the API name field in box 4
    • Drag an "Output Dataset" tile onto the canvas and configure it by specifying the output dataset name.
    • Name, save, and run your dataflow, and you're done
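    For reference, the "Row Number" function in the Rank & Window tile behaves like SQL's ROW_NUMBER() window function. Here's a minimal sketch of what the tile produces, using Python's built-in sqlite3 (SQLite 3.25+ supports window functions) with made-up users and call times:

```python
import sqlite3

# In-memory stand-in for the source dataset (hypothetical users and times).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE calls (user TEXT, time TEXT)")
con.executemany(
    "INSERT INTO calls VALUES (?, ?)",
    [("amy", "09:15"), ("amy", "08:02"), ("amy", "11:40")],
)

# Equivalent of the Rank & Window tile: number rows by ascending time.
rows = con.execute(
    """
    SELECT user, time,
           ROW_NUMBER() OVER (ORDER BY time ASC) AS call_order
    FROM calls
    ORDER BY call_order
    """
).fetchall()

print(rows)
# The earliest time (08:02) gets call_order 1, the next 2, and so on.
```

    Once that column exists, "second lowest" is simply the row where call_order = 2.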

     

    Now, you can build your visualization on your new output dataset, using your row number field as a filter or any other way you'd use any other column.

     

    Hope that gets you what you're looking for.

     

    Cheers

Answers

  • BlueRooster
    BlueRooster Domo Employee

    I'm fairly certain you'll need to use a SQL transform to make this happen. 

     

    Here's how I would approach it:

    If you're finding the min time by ID, then I would do something like this:

    SELECT `ID`, DATE(`Time`) AS 'Day',  MIN(`Time`) AS 'MinTime'
    FROM dataset
    GROUP BY `ID`, DATE(`Time`)

    Then use this to create a flag for your original dataset

    SELECT a.*, CASE WHEN b.`MinTime` = a.`Time` THEN 1 ELSE 0 END AS 'MinFlag'
    FROM dataset AS a LEFT JOIN transform1 AS b ON b.`ID` = a.`ID` AND DATE(a.`Time`) = b.`Day`

     

    Now using this dataset in your card you can create a BeastMode that goes:

    MIN(CASE WHEN `MinFlag` = 0 THEN `Time` END)

    So this creates a flag, per ID and day, for the minimum time found; the BeastMode then excludes the flagged minimums and takes the minimum of the remaining times, which gives you the second-lowest value.
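    To make that two-step logic concrete, here's a toy Python sketch of the same idea (plain Python standing in for the SQL transform and the BeastMode; the IDs and times are made up):

```python
# Toy rows: (id, time) pairs for one day.
rows = [("a", "08:02"), ("a", "09:15"), ("a", "11:40"),
        ("b", "07:30"), ("b", "10:00")]

# Step 1: find the minimum time per id (the transform's GROUP BY).
min_time = {}
for rid, t in rows:
    if rid not in min_time or t < min_time[rid]:
        min_time[rid] = t

# Step 2: flag rows sitting at the minimum (the CASE WHEN join output).
flagged = [(rid, t, 1 if t == min_time[rid] else 0) for rid, t in rows]

# BeastMode: MIN(CASE WHEN MinFlag = 0 THEN Time END) per id --
# ignore the flagged minimums and take the minimum of what's left.
second_lowest = {}
for rid, t, flag in flagged:
    if flag == 0 and (rid not in second_lowest or t < second_lowest[rid]):
        second_lowest[rid] = t

print(second_lowest)  # {'a': '09:15', 'b': '10:00'}
```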

     

    Hope that helps get you what you need,

    Valiant

     

    **Please mark "Accept as Solution" if this post solves your problem
    **Say "Thanks" by clicking the "heart" in the post that helped you.

  • If this is a dataset you are analyzing, it seems like you could use the ROW_NUMBER function. When you use the function to order by date in ascending order, your earliest date gets a row number of 1, counting up from there. 

     

    Additionally, if there are batches of data you are pulling in, you could use the PARTITION BY argument to have the row numbers start over at each day, batch number, etc., which would let you analyze something like "at what time are the second calls made?" over time.
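    As a quick sketch of that PARTITION BY behavior (Python's built-in sqlite3 here rather than Redshift, with made-up days and times):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE calls (day TEXT, time TEXT)")
con.executemany(
    "INSERT INTO calls VALUES (?, ?)",
    [("Mon", "09:00"), ("Mon", "08:00"), ("Tue", "07:30"), ("Tue", "10:15")],
)

# ROW_NUMBER restarts at 1 within each day thanks to PARTITION BY.
rows = con.execute(
    """
    SELECT day, time,
           ROW_NUMBER() OVER (PARTITION BY day ORDER BY time ASC) AS n
    FROM calls
    ORDER BY day, n
    """
).fetchall()

# "At what time are the second calls made?" is simply the rows where n = 2.
second_calls = [(d, t) for d, t, n in rows if n == 2]
print(second_calls)  # [('Mon', '09:00'), ('Tue', '10:15')]
```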

     

    Check out Amazon Redshift documentation here:

     

    https://docs.aws.amazon.com/redshift/latest/dg/r_WF_ROW_NUMBER.html

     

    And be sure to check out the example:

     

    https://docs.aws.amazon.com/redshift/latest/dg/r_Examples_of_WF_ROW_NUMBER_WF.html

     

     


  • Thank you all! Many of these will work but the solution with the ETL was the easiest. Thank you for the help!