Certainty Effect 
Choose one option from each game:

Game D:
(D1) Winning $50 with probability .5
(D2) Winning $30 with probability .7

Game E:
(E1) Winning $50 with probability .8
(E2) Winning $30 with probability 1.0
Consider a jar with 50 green balls, 30 yellow balls, and 20 red balls (100 balls in total). Each option can be implemented as a single draw from the jar, with the payoff determined by the ball's color:

Game D:

      Green (50)   Yellow (30)   Red (20)
D1    $50          $0            $0
D2    $30          $0            $30

Game E:

      Green (50)   Yellow (30)   Red (20)
E1    $50          $50           $0
E2    $30          $30           $30
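
As a quick check (a small Python sketch, not part of the original text), the jar composition reproduces the stated win probabilities for each option:

from fractions import Fraction

# Jar composition given above: 50 green, 30 yellow, 20 red balls.
jar = {"green": 50, "yellow": 30, "red": 20}
total = sum(jar.values())

# Payoff (in dollars) for each option, by ball color, from the tables above.
payoffs = {
    "D1": {"green": 50, "yellow": 0,  "red": 0},
    "D2": {"green": 30, "yellow": 0,  "red": 30},
    "E1": {"green": 50, "yellow": 50, "red": 0},
    "E2": {"green": 30, "yellow": 30, "red": 30},
}

for option, row in payoffs.items():
    # Probability of winning = share of colors that pay a nonzero amount.
    p_win = sum(Fraction(jar[c], total) for c, pay in row.items() if pay > 0)
    prize = max(row.values())
    print(f"{option}: ${prize} with probability {float(p_win)}")

# Prints: D1: $50 with probability 0.5, D2: $30 with probability 0.7,
#         E1: $50 with probability 0.8, E2: $30 with probability 1.0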

Most people choose D1 and E2.
But the two games differ only in the yellow column: in Game D, yellow pays $0 under either option, while in Game E the yellow payoff favors E1 ($50) over E2 ($30). Going from Game D to Game E therefore only makes option 1 more favorable, so a preference for D1 should carry over to a preference for E1.


In terms of utility, writing U(x) for the utility of winning $x (with U(0) = 0):
If you prefer D1 to D2, then .5 U(50) > .7 U(30), i.e., U(30) < (5/7) U(50), roughly U(30) < .71 U(50).
If you prefer E2 to E1, then 1.0 U(30) > .8 U(50), i.e., U(30) > .8 U(50).
Either preference on its own can be rational, depending on your risk aversion, but they can't both be rational for the same person: no single utility function satisfies both inequalities.
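
To see this concretely, here is a minimal sketch (assuming only U(50) > U(30) > 0) that scans over the ratio r = U(30)/U(50) and records which pair of choices an expected-utility maximizer would make; the popular pattern D1 & E2 never appears:

# Scan the ratio r = U(30)/U(50) over (0, 1) and record the implied choices.
def choices(r):
    d = "D1" if 0.5 > 0.7 * r else "D2"   # D1 preferred iff .5 U(50) > .7 U(30)
    e = "E1" if 0.8 > 1.0 * r else "E2"   # E1 preferred iff .8 U(50) > 1.0 U(30)
    return d, e

patterns = {choices(r / 1000) for r in range(1, 1000)}
print(sorted(patterns))
# [('D1', 'E1'), ('D2', 'E1'), ('D2', 'E2')]; ('D1', 'E2') never appears.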


The explanation is that there is a preference for certainty separate from the utility of the payoff.  D1 & E1, D2 & E2, and D2 & E1 can all be explained with different utility functions, but the most popular choice, D1 & E2, cannot.
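
One way to make that explanation concrete is a toy model (an illustration only; the bonus parameter b and the values below are hypothetical, not taken from the text) in which a sure payoff earns an extra "certainty bonus" on top of its expected utility. With a modest bonus, the popular D1 & E2 pattern becomes the predicted choice:

# Toy model: value = p * U(prize), plus a hypothetical bonus b when p == 1.
def value(p, prize, r, b):
    u = 1.0 if prize == 50 else r   # scale utilities so U(50) = 1, U(30) = r
    return p * u + (b if p == 1.0 else 0.0)

r, b = 0.7, 0.15   # illustrative values only
d = "D1" if value(0.5, 50, r, b) > value(0.7, 30, r, b) else "D2"
e = "E1" if value(0.8, 50, r, b) > value(1.0, 30, r, b) else "E2"
print(d, e)        # D1 E2: the bonus flips Game E but leaves Game D unchanged

Setting b = 0 recovers the plain expected-utility prediction at this r, namely D1 & E1.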