I’ve just finished reading a book I got for Christmas (thanks, Phil) by Michael Lewis called The Undoing Project. If you’ve read any of Michael Lewis’s other books, such as Moneyball, Liar’s Poker, or The Big Short, you’ll know he’s a terrific writer. In his new one, his writing is every bit as good and his research is every bit as thorough. It’s a subject that some readers may find a little bit “out there” compared to baseball or the world of finance, but it’s right up my alley. This story chronicles the extraordinary research collaboration between two psychologists who went on to win the Nobel Prize in Economics, Amos Tversky and Daniel Kahneman. The work they were recognized for is in fact applicable in many, many domains, but it upended the long-held assumption of economists that human beings are rational in their decision making, hence the Nobel Prize in that subject.
The Undoing Project is fascinating on three levels: it gives a captivating look into the intense and remarkably productive collaboration of these two researchers over decades; it gives some insight into how research works, which is a complete mystery to many people; and it provides compelling explanations of the theories developed by these two men in studying how people make decisions based on their knowledge and their intuition. It seems that we aren’t so rational after all, and we definitely don’t always make decisions that are in our own best interests.
[I should issue a disclaimer here for those of you who remain sceptical that this sounds like a book you may want to read; I had studied some of the work of these men long ago and was already a fan. So I am biased for sure, but I still recommend it as a fascinating read for newbies to the topic. Especially for recreational gamblers!]
Kahneman and Tversky’s experimental method consisted of presenting subjects with a range of creative scenarios and then asking what decisions they would make in these situations. Their results shook the established assumptions; it took naysayers whose own work was put in question by these results a long time to come around to accepting their findings. It turns out that their findings explain a lot about what goes on in our world every day, including our day-to-day decisions. Let’s take a look at just a few of them.
- People’s decisions/preferences when faced with the possibility of a loss are quite different from decisions made when the risk involves a win, even when the odds and/or the outcomes are the same. This is called loss aversion. It turns out that people are more upset by losing, say, $1000 than they are pleased by winning the same amount. In other words, the pain of losing is greater than the satisfaction of winning. People will put $5 towards a 50/50 draw because they’re not going to lose much and might gain a big pot; even though the chances of winning are extremely small, the risk is small as well. But when much more is at stake, say $500, even when the likelihood of success is very high – say a 90% chance of turning the $500 into $1000 against a 10% chance of ending up with $0 – the risk of loss overtakes the opportunity for a significant gain. The gamble is worth $900 on average, yet most people take the certain $500. People prefer the certainty. There are lessons to be learned from this behaviour when marketers are designing incentive programs. How you pitch the opportunity makes a difference.
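The arithmetic behind the $500 example above can be checked in a few lines. This is just a sketch of the expected-value calculation, using the dollar amounts and probabilities from the example; the point is that the gamble is worth more on average, and loss aversion is precisely the tendency to refuse it anyway.

```python
# Sure thing: keep the $500.
sure_thing = 500

# Gamble: 90% chance of $1000, 10% chance of $0.
p_win, payoff_win = 0.9, 1000
p_lose, payoff_lose = 0.1, 0

# Expected value of the gamble.
expected_gamble = p_win * payoff_win + p_lose * payoff_lose

print(expected_gamble)                # 900.0
print(expected_gamble > sure_thing)   # True: on average, the gamble pays more
```

Even though the gamble has the higher expected value, most people choose the certain $500 – the asymmetry between how losses and gains feel is what Kahneman and Tversky documented.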
- People’s instinctive sense of randomness is … well, random. In flipping a coin 8 times, people will consider the sequence of flips T H T H H H T H random, whereas the sequence H H H H T T T T looks non-random and therefore like probable cheating – even though the two are equally likely. As well, if the first 7 flips were all Heads, virtually everyone would expect the next flip to be Tails, even though T is no more likely than H on the next flip. It’s always a 50/50 probability, but this is simply not instinctive! Lewis’s book has many examples that illustrate how our “probability deficiency” sets us up for making poor choices, choices that are against our best interests.
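The coin-flip claim above is easy to verify: for a fair coin, every specific sequence of 8 flips has the same probability, (1/2)^8 = 1/256, regardless of how “patterned” it looks. A minimal sketch, using the two sequences from the example:

```python
from fractions import Fraction

HALF = Fraction(1, 2)  # probability of H (or T) on any single fair flip

def sequence_probability(seq: str) -> Fraction:
    """Probability of one specific sequence of fair coin flips."""
    return HALF ** len(seq)

looks_random = "THTHHHTH"      # feels random to most people
looks_rigged = "HHHHTTTT"      # feels like cheating

print(sequence_probability(looks_random))   # 1/256
print(sequence_probability(looks_rigged))   # 1/256 -- exactly the same

# Gambler's fallacy: after 7 Heads in a row, the 8th flip is still 50/50.
# Past flips don't change the coin.
print(HALF)                                 # 1/2
```

The intuition that H H H H T T T T is “less random” confuses a specific sequence with a broad category: sequences that alternate a lot are more numerous, but any one of them is exactly as improbable as any one streaky sequence.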
- People typically worry far more about uncommon (less likely) threats than they worry about common (likely) ones. Statistics just don’t carry the same weight as instinct, even when people rationally understand that their instincts may be faulty. There are all kinds of examples; here is one of current interest:
As the example above shows, public policy – in this case, U.S. policy – is not always based on established facts. If it were, gun laws in the U.S. would be of more concern than refugees. But, as the work of Kahneman and Tversky has shown, what we may know and how we instinctively feel are two different things. And those feelings are real. The responsibility of our leaders is to recognize real and perceived concerns, provide the facts, identify options, work collaboratively towards compromise – and humane – solutions, and then explain those solutions to the public. We may not always agree on solutions, or even on what the problems are, but we want to be able to count on our leaders not to misuse facts or raise false fears (or false hopes) in order to sway public opinion.
Photo credit: Huffington Post