When -2 > +3
Sunday, 25 December 2011 00:00
The fact that human beings are not 'rational' leads to strange equations, but understanding this is halfway to being able to deal with it
Another title for this review could just as well have been 'why reforms fail', but that wouldn't have been quite as catchy. Nor would it have been fair to an author whose main point is that 'framing' is critical to the choices you make.
As a journalist, I understand that perfectly—the headlines and intros you give determine how your reader reads the piece! You can now understand why the Press Council chief, Justice Markandey Katju, feels the way he does!
Here’s a little exercise—my own, not Kahneman’s! How do you spell ‘toast’? That’s simple: toast. Huh, are you sure? Yup, TOAST. Don’t be in a hurry, think about it again—I’m talking about the thing you put butter on and eat; most breakfast menus in classy restaurants spell it correctly, why can’t you? Oh shut up, it is T-O-A-S-T and I’m sure! Okay, bright guy, if that’s right, how do you spell toaster? Toaster. Sure it isn’t toastar? Y-E-S it is T-O-A-S-T-E-R. So what do you put in a toaster? That’s simple: 99 out of 100 persons will say toast, not bread. You get the drift? Conditioning is everything.
Daniel Kahneman's starting point is that the brain comprises an automatic, intuitive System 1 that takes decisions on the basis of what it sees, and a slower, evaluative System 2 which, though it is supposed to take a larger look at issues so as to inform human decision-making, by and large just rationalises System 1's instinctive reaction. While that's reason to worry, the good news, if you're trying to win friends and influence people, is that you can mould System 1 (someone else's System 1, that is). Do you want to go in for this surgery, given that there is a 10% mortality rate in the first month alone? Put that way, the obvious answer is that you don't. Put it another way, that the one-month survival rate is 90%, and Kahneman's volunteers mostly chose to go in for surgery. System 2 can figure out that both are the same, but System 2 tends to be lazy.
Having shown this to be true through some imaginative exercises (even maths professors fail at some of them), Kahneman goes on to show how traditional economic theory, built around perfectly rational 'Econs' rather than actual 'Humans', breaks down. That's why he got the Nobel Prize in economics even though he's a professor of psychology at Princeton.
Standard theory tells you that people weigh outcomes by their probabilities to come to a decision. If you lose R100 when a coin shows tails and win R150 when it shows heads, will you take the gamble? Based on expected value, you should: the expected loss is R50 (R100 x 0.50) while the expected gain is R75 (R150 x 0.50), so on average you come out ahead. But, Kahneman's experiments showed, people don't. Why? Because the joy of possibly winning R150 is not enough to make up for the fear of losing R100. Try it with some friends if you don't believe it.
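For the arithmetically inclined, the numbers above can be checked in a few lines. The loss-aversion factor of roughly 2 is Kahneman and Tversky's ballpark estimate for how much more losses hurt than gains please, used here purely as an illustration:

```python
# Expected value of the 50-50 gamble: lose R100 on tails, win R150 on heads.
p = 0.5
expected_value = p * 150 + p * (-100)
print(expected_value)  # 25.0 -- a "rational" Econ should take the bet

# But Humans weigh losses more heavily than gains. With a loss-aversion
# factor of about 2 (a rough illustrative figure), the "felt" value of
# the same gamble turns negative, so most people refuse it.
loss_aversion = 2.0
felt_value = p * 150 + p * (loss_aversion * -100)
print(felt_value)  # -25.0 -- the gamble now feels like a bad deal
```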
So here's the stuff about reforms, about reorganising anything, actually. The loss that the losers feel generally outweighs the pleasure the gainers feel, so the former are more vocal than the latter. That's why the FDI-in-retail proposal fell through: the kiranas felt hurt enough to generate a lot more support than the Walmarts felt the need to garner. Presumably that's also the reason firms hire lobbyists or fund chambers of commerce; since their lives depend on their clients' proposals going through, they just have to generate the necessary support. This is where standard economic theory, which says a proposal will go through so long as the gainers (+3) outnumber the losers (-2) since it is beneficial to society, fails.
Swiss scientist Daniel Bernoulli tried to resolve this by bringing in the law of diminishing marginal utility, which was a great step forward. Under such a utility function, the pain felt when your wealth falls from R1 crore to R96 lakh is far less than the pleasure when your wealth rises from R1 lakh to R5 lakh; that's why a poorer person is a lot more willing to pay for insurance, or buy a lottery ticket for that matter, while a rich man is happy to sell the insurance or the lottery ticket. But, as Kahneman points out, this had one critical flaw: Bernoulli tied utility to the level of wealth alone, as if the same function applied to everyone regardless of where they started. Harry Markowitz, who later won a Nobel, refined this by attaching weights to changes in wealth rather than to wealth itself.
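Bernoulli's own candidate was logarithmic utility. A small sketch (amounts in R lakh; the log function is an illustration of diminishing marginal utility, not anyone's actual preferences) shows why the rich man's loss stings far less than the poor man's gain delights:

```python
import math

# Bernoulli's idea: utility grows logarithmically with wealth, so each
# extra rupee matters less the richer you already are.
def utility(wealth_lakh):
    return math.log(wealth_lakh)

# Rich person: wealth falls from R1 crore (100 lakh) to R96 lakh.
loss = utility(100) - utility(96)
# Poor person: wealth rises from R1 lakh to R5 lakh.
gain = utility(5) - utility(1)

print(round(loss, 3))  # roughly 0.041
print(round(gain, 3))  # roughly 1.609 -- about 40 times larger
```

The same R4-lakh swing is felt some 40 times more keenly at the bottom of the curve than at the top, which is why the poor buy insurance and the rich sell it.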
Based on a series of experiments, Kahneman comes up with his 'fourfold pattern', which is worth framing for any student of decision-making, from policymakers to advertising/marketing types (see box). A person with a 95% chance of winning R1 crore will tend to be risk-averse; if someone asks him for a bribe to make it a 100% certainty, he'll pay it (hear that, Kaushik Basu?). A person with a 95% chance of losing R1 crore tends to be risk-seeking; he'll go to court and file a suit instead of paying the bribe (presumably this explains Ajit Gulabchand filing a case against the government on Lavasa).
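For readers who want to see the fourfold pattern fall out of the maths, here is a rough sketch using the parameter estimates Tversky and Kahneman published in 1992; the exact figures are illustrative assumptions, not numbers from this book's box:

```python
# Prospect theory in miniature. Amounts are in R lakh, measured from a
# reference point of zero. Parameters (alpha, lambda, gamma) are the
# 1992 Tversky-Kahneman estimates, used here for illustration only.

def value(x, alpha=0.88, lam=2.25):
    # Losses loom about 2.25 times larger than equivalent gains.
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p, gamma):
    # Decision weights: near-certainties are underweighted.
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# A 95% chance of winning R100 lakh is valued below a sure R95 lakh,
# so people pay for certainty: risk-averse, the bribe case.
gamble_gain = weight(0.95, 0.61) * value(100)
sure_gain = value(95)
print(gamble_gain < sure_gain)  # True

# A 95% chance of losing R100 lakh hurts less than a sure loss of
# R95 lakh, so people gamble on the courts: risk-seeking.
gamble_loss = weight(0.95, 0.69) * value(-100)
sure_loss = value(-95)
print(gamble_loss > sure_loss)  # True
```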
There are several more experiments that hold valuable lessons for all of us. Judges tended to hand out death sentences closer to lunchtime and life sentences, for the same crime, a little after lunch; top bureaucrats I know swear their chances of getting policies through depend on which part of the insulin cycle their political bosses are in. If you're making an important pitch, make it at lunch, or just after it!
Get a person to concentrate on a computer screen and count the number of times odd numbers pop up, increase the speed at which the numbers appear and, after a while, let a gorilla flit across the screen: nine out of 10 people don't notice it, so engrossed are they in counting. The moral of the story, as your mother probably drilled into you: start studying early in the morning!
Why do human beings go in for short-term solutions they know won't work (think of Angela Merkel, whose 'solutions' don't seem to convince the markets for more than a week at a time)? Why don't they see tail events (think of how less than half the overseas borrowings of India Inc are hedged)? Duration neglect, duration weighting, denominator neglect, cognitive illusion, anchoring effect: Thinking, Fast and Slow is full of such terms, with simple exercises to explain how they work and how you get around them. Read the book, both fast and slow; it's well worth it.
Last Updated: Wednesday, 28 December 2011 06:21