Sum Values based on duplicate unique values in another column

Afternoon all, 

 

Having a hard time figuring this one out. I have a tabular report I'm connecting to in Salesforce, a "Campaigns with Influenced Opportunities" report. The report shows a line item for every campaign that contains a member with an opportunity. A member can be associated with one opportunity but belong to three campaigns, which means that opportunity is listed in three line items (Member A, Opportunity 1, Campaigns Alpha/Beta/Zeta). The Opportunity ID is unique, so for every line item with the same Opportunity ID I'd like to return the Opportunity Amount only once (instead of three times).

 

My answer should be, for each unique Opportunity ID:

Opportunity Total = Sum over each unique "Opportunity ID" of ( Sum("Opportunity Amount") / Count("Opportunity ID") )

However, it must be done at the individual oppty level. If you sum all the oppty amounts and then divide by the count of IDs, the result is wrong, because each opportunity can have a different amount, and each amount needs to be divided by its own count.

 

Example:

4000 = (Sum of Amount for John's Oppty ID / Count of John's Oppty ID) + (Sum of Amount for Jane's Oppty ID / Count of Jane's Oppty ID) + etc.

 

| Member | Campaign | Oppty ID | Amount | Actual |
| --- | --- | --- | --- | --- |
| John | Camp1 | 123a | $1,000.00 | $1,000.00 |
| John | Camp2 | 123a | $1,000.00 | |
| Jane | Camp3 | 74k1 | $1,000.00 | $1,000.00 |
| Josie | Camp4 | 84kd | $1,000.00 | $1,000.00 |
| Sam | Camp5 | 923j | $1,000.00 | $1,000.00 |
| Total | | | $5,000.00 | $4,000.00 |
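
In other words, each opportunity's amount is divided by its own row count before it joins the grand total. A minimal Python sketch of that arithmetic, using the sample rows from the table above:

```python
from collections import defaultdict

# One row per campaign membership; the Amount repeats for every
# campaign an opportunity's member belongs to (data from the table above).
rows = [
    ("John", "Camp1", "123a", 1000.00),
    ("John", "Camp2", "123a", 1000.00),
    ("Jane", "Camp3", "74k1", 1000.00),
    ("Josie", "Camp4", "84kd", 1000.00),
    ("Sam", "Camp5", "923j", 1000.00),
]

totals = defaultdict(float)  # sum of Amount per Oppty ID
counts = defaultdict(int)    # number of line items per Oppty ID

for member, campaign, oppty_id, amount in rows:
    totals[oppty_id] += amount
    counts[oppty_id] += 1

# Each opportunity contributes its own sum divided by its own row count,
# so 123a counts once ($1,000) instead of twice.
actual = sum(totals[o] / counts[o] for o in totals)
print(actual)  # 4000.0
```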

 

Hope this makes sense, any help is appreciated, thank you!

Comments

  • Added note: when I create my bar graph as a stack with the unique Opportunity ID as a series across created dates for the quarter, the correct totals per date range are there.

  • For ease of use, I would restructure the data

    Such that I have

    data1 = one row per campaign.

    data2 = one row per opportunity

     

    then APPEND them instead of JOINing them. By JOINing the data you are currently spreading the Opportunity Amount across each associated campaign (hence why you're trying to do 'crazy math'). If you APPEND the data, you don't have to worry about crazy math and can instead take an easy sum or division.

     

    In this video https://www.youtube.com/watch?v=I3y-LSch-hM&list=PLUy_qbtzH0S4CkHBUvpOVpLNJluk6upOn&index=2&t=5s I describe how to build a STACKed dataset (my name for datasets that have been UNIONed). Instead of forecasts and actuals, you'd just append opptys and campaigns.
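
The JOIN-vs-APPEND trade-off above can be sketched in plain Python (hypothetical mini-datasets; the `oppty_id`/`amount` field names are illustrative, not the actual report columns):

```python
# data2: one row per opportunity (illustrative values)
opportunities = [
    {"oppty_id": "123a", "amount": 1000.00},
    {"oppty_id": "74k1", "amount": 1000.00},
]
# data1: one row per campaign membership
campaigns = [
    {"oppty_id": "123a", "campaign": "Camp1"},
    {"oppty_id": "123a", "campaign": "Camp2"},
    {"oppty_id": "74k1", "campaign": "Camp3"},
]

# JOIN: each opportunity row repeats once per matching campaign,
# so Amount is duplicated and a naive sum overcounts.
joined = [
    {**o, **c}
    for o in opportunities
    for c in campaigns
    if o["oppty_id"] == c["oppty_id"]
]
joined_total = sum(r["amount"] for r in joined)
print(joined_total)  # 3000.0 -- overcounted

# APPEND (UNION): campaign rows carry no Amount, so a plain sum stays right.
appended = opportunities + [{**c, "amount": 0.0} for c in campaigns]
appended_total = sum(r["amount"] for r in appended)
print(appended_total)  # 2000.0 -- correct
```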

    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

    **Say "Thanks" by clicking the ❤️ in the post that helped you.
    **Please mark the post that solves your problem by clicking on "Accept as Solution"
  • I'm trying to do this same thing. I understand the appended rows and how they keep the data from blowing up, but I don't understand how that would help in this case.

    You would then have:

    | Oppty ID | Stage | Pipeline $ | Campaign | Influence |
    | --- | --- | --- | --- | --- |
    | 123a | Discover | $1,000 | | |
    | 123a | | | Camp1 | 100% |
    | 74k1 | Negotiation | $1,000 | | |
    | 74k1 | | | Camp2 | 50% |
    | 923j | Won | $10,000 | | |
    | 923j | | | Camp3 | 100% |

    So now when I try to build a report that looks at the Pipeline $ (or any $ amounts) associated with the influenced campaign, it would show $0.

  • ST_Superman
    ST_Superman Domo Employee

    From your original data table, you could try something like this:

    SUM(SUM(DISTINCT Amount) FIXED (BY Member, OpptyID))

    That gives me the expected result (screenshot not reproduced here).
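
For readers outside Domo, the FIXED-style calculation above roughly amounts to "sum the distinct Amount values within each (Member, Oppty ID) group, then sum across groups." A minimal Python sketch of that logic, with sample data assumed from the thread's table:

```python
# Duplicate rows for the same (Member, Oppty ID) come from extra campaigns.
rows = [
    ("John", "123a", 1000.00),
    ("John", "123a", 1000.00),  # same oppty via a second campaign
    ("Jane", "74k1", 1000.00),
    ("Josie", "84kd", 1000.00),
    ("Sam", "923j", 1000.00),
]

# Inner SUM(DISTINCT Amount) within each (Member, Oppty ID) group:
# duplicate amounts inside a group collapse to one.
per_group = {}
for member, oppty_id, amount in rows:
    per_group.setdefault((member, oppty_id), set()).add(amount)

# Outer SUM across the groups.
total = sum(sum(amounts) for amounts in per_group.values())
print(total)  # 4000.0
```

One caveat worth noting: DISTINCT collapses repeated amounts within a group, which is exactly what's wanted here, but it would also collapse two genuinely different line items that happened to share the same amount and group.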

  • I guess it all depends on whether you consider the value of an opp to be spread evenly between the campaigns or not. What I would do is create a calculated field in the ETL to spread the Opp Amount, so that the distribution will not be affected by any filters at the view level.

    I'd add a Group By tile to perform a Count of Campaigns per Opportunity, join this back, and then use a Formula tile to make Distributed Amount equal to Amount / Count.

    If the spread is not even, then you'd need to know how you determine the split, whether by campaign weight, participation %, or anything along those lines, and do a weighted distribution instead.
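
The Group By / join-back / Formula steps described above can be sketched in Python (illustrative field names and sample data, not the actual ETL tiles):

```python
from collections import Counter

# One row per (opportunity, campaign); the Amount repeats per campaign.
rows = [
    {"oppty_id": "123a", "campaign": "Camp1", "amount": 1000.00},
    {"oppty_id": "123a", "campaign": "Camp2", "amount": 1000.00},
    {"oppty_id": "74k1", "campaign": "Camp3", "amount": 1000.00},
]

# "Group By tile": count of campaigns per opportunity.
campaign_counts = Counter(r["oppty_id"] for r in rows)

# "Join back" + "Formula tile": Distributed Amount = Amount / Count.
for r in rows:
    r["distributed_amount"] = r["amount"] / campaign_counts[r["oppty_id"]]

# Any sum of distributed_amount is now correct regardless of filters,
# because each opportunity's value is pre-split across its campaigns.
print(sum(r["distributed_amount"] for r in rows))  # 2000.0
```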