Predictably Irrational

[Image: Dan Ariely, via CrunchBase]

We love to be told we’re smarter than we thought we were, but a surprise bestseller by an MIT professor has a less happy message: We’re consistently irrational much of the time. While there’s no cure, there’s hope – if we can learn to outsmart ourselves. Dan Ariely is an Israeli professor of behavioral economics. He teaches at Duke University and is head of the eRationality research group at the MIT Media Lab.

Despite our best efforts, bad or inexplicable decisions are as inevitable as death, taxes, and the grocery store running out of your favorite flavor of ice cream. They’re also just as predictable. Why, for instance, are we convinced that “sizing up” at our favorite burger joint is a good idea, even when we’re not that hungry? Why are our phone lists cluttered with numbers we never call? The behavioral economist Dan Ariely has based his career on figuring out the answers to these questions, and in his bestselling book Predictably Irrational (re-released in expanded form in May 2009), he describes the many unorthodox and often downright odd experiments he has used in the quest to answer them.

In his 2009 TED talk, Dan explains why we cheat. The experiments he conducted are simple but elegant. Here is the video:

From the transcript:

So, like we usually do, I decided to do a simple experiment. And here’s how it went. If you were in the experiment, I would pass you a sheet of paper with 20 simple math problems that everybody could solve, but I wouldn’t give you enough time. When the five minutes were over, I would say, “Pass me the sheets of paper, and I’ll pay you a dollar per question.” People did this. I would pay people four dollars for their task — on average people would solve four problems. Other people I would tempt to cheat. I would pass them their sheet of paper. When the five minutes were over, I would say, “Please shred the piece of paper. Put the little pieces in your pocket or in your backpack, and tell me how many questions you got correctly.” People now solved seven questions on average. Now, it wasn’t as if there were a few bad apples — a few people who cheated a lot. Instead, what we saw was a lot of people who each cheated a little bit.

Now, in economic theory, cheating is a very simple cost-benefit analysis. You ask: what’s the probability of being caught? How much do I stand to gain from cheating? And how much punishment would I get if I get caught? You weigh these options — you do the simple cost-benefit analysis, and you decide whether it’s worthwhile to commit the crime or not. So, we tried to test this. For some people, we varied how much money they could get away with — how much money they could steal. We paid them 10 cents per correct question, 50 cents, a dollar, five dollars, 10 dollars per correct question.
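
(An aside, outside the transcript: the “rational crime” cost-benefit model Ariely is testing can be sketched in a few lines. This is only a toy illustration of the logic he describes; the gain, probability of being caught, and penalty below are invented numbers, not values from his experiment.)

    # Toy sketch of the rational cost-benefit model of cheating described above.
    # All numbers are invented for illustration; none come from Ariely's study.

    def expected_value_of_cheating(gain, p_caught, penalty):
        """Expected payoff: keep the gain if not caught, pay the penalty if caught."""
        return (1 - p_caught) * gain - p_caught * penalty

    def should_cheat(gain, p_caught, penalty):
        """A purely 'rational' agent cheats whenever the expected payoff is positive."""
        return expected_value_of_cheating(gain, p_caught, penalty) > 0

    # Example: claiming 3 extra problems at a dollar each, with a 10% chance
    # of being caught and a $20 penalty if caught.
    print(should_cheat(gain=3.0, p_caught=0.10, penalty=20.0))  # True: 0.9*3 - 0.1*20 = 0.7 > 0

The point of the experiments, of course, is that people’s behavior did not track this calculation: changing the gain or the chance of being caught barely changed how much people cheated.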

You would expect that as the amount of money on the table increases, people would cheat more, but in fact it wasn’t the case. We got a lot of people cheating by stealing a little bit. What about the probability of being caught? Some people shredded half the sheet of paper, so there was some evidence left. Some people shredded the whole sheet of paper. Some people shredded everything, went out of the room, and paid themselves from the bowl of money that had over 100 dollars in it. You would expect that as the probability of being caught goes down, people would cheat more, but again, this was not the case. Again, a lot of people cheated by just a little bit, and they were insensitive to these economic incentives.

So we said, “If people are not sensitive to the economic, rational-theory explanations, to these forces, what could be going on?” And we thought maybe what is happening is that there are two forces. On the one hand, we all want to look at ourselves in the mirror and feel good about ourselves, so we don’t want to cheat. On the other hand, we can cheat a little bit and still feel good about ourselves. So, maybe what is happening is that there’s a level of cheating we can’t go over, but we can still benefit from cheating at a low degree, as long as it doesn’t change our impression of ourselves. We call this the personal fudge factor.

Now, how would you test the personal fudge factor? Initially we said, what can we do to shrink the fudge factor? So, we got people to the lab, and we said, “We have two tasks for you today.” First, we asked half the people to recall either 10 books they read in high school, or to recall the Ten Commandments, and then we tempted them with cheating. It turns out that the people who tried to recall the Ten Commandments — and in our sample nobody could recall all of the Ten Commandments — given the opportunity to cheat, did not cheat at all. It wasn’t that the more religious people — the people who remembered more of the Commandments — cheated less, and the less religious people — the people who could remember almost no Commandments — cheated more. The moment people thought about trying to recall the Ten Commandments, they stopped cheating. In fact, even when we gave self-declared atheists the task of swearing on the Bible and gave them a chance to cheat, they didn’t cheat at all. Now, the Ten Commandments are something that is hard to bring into the education system, so we said, “Why don’t we get people to sign the honor code?” So, we got people to sign, “I understand that this short survey falls under the MIT Honor Code.” Then they shredded it. No cheating whatsoever. And this is particularly interesting, because MIT doesn’t have an honor code.

So, all this was about decreasing the fudge factor. What about increasing the fudge factor? In the first experiment, I walked around MIT and distributed six-packs of Cokes in the refrigerators — these were common refrigerators for the undergrads. And I came back to measure what we technically call the half-lifetime of Coke — how long does it last in the refrigerators? As you can expect, it doesn’t last very long. People take it. In contrast, I took a plate with six one-dollar bills, and I left those plates in the same refrigerators. No bill ever disappeared.

Now, this is not a good social science experiment, so to do it better I did the same experiment I described to you before. A third of the people we passed the sheet to, they gave it back to us. A third of the people we passed it to, they shredded it, came to us and said, “Mr. Experimenter, I solved X problems. Give me X dollars.” A third of the people, when they finished shredding the piece of paper, came to us and said, “Mr. Experimenter, I solved X problems. Give me X tokens.” We did not pay them with dollars. We paid them with something else. And then they took the something else, walked 12 feet to the side, and exchanged it for dollars.

Think about the following intuition. How bad would you feel about taking a pencil home from work, compared with how bad you would feel about taking 10 cents from a petty cash box? These things feel very different. Would being one step removed from cash for a few seconds, by being paid in tokens, make a difference?

Over at EG (Entertainment Gathering), Dan spoke about some other facets of the predictable irrationality of human nature.

One more example of this. People believe that when we deal with physical attraction, we see somebody and we know immediately whether we like them or not. Attracted or not. Which is why we have these four-minute dates. So I decided to do this experiment with people. I’ll show you graphic images of people — not real people. I showed some people a picture of Tom and a picture of Jerry. I said, “Who do you want to date? Tom or Jerry?” But for half the people I added an ugly version of Jerry. I took Photoshop and made Jerry slightly less attractive. For the other people, I added an ugly version of Tom. And the question was, will ugly Jerry and ugly Tom help their respective, more attractive brothers? The answer was absolutely yes. When ugly Jerry was around, Jerry was popular. When ugly Tom was around, Tom was popular.

This of course has two very clear implications for life in general. If you ever go bar hopping, who do you want to take with you? You want a slightly uglier version of yourself. Similar. Similar … but slightly uglier. The second point, of course, is that if somebody else invites you, you know how they think about you.


2 thoughts on “Predictably Irrational”

  1. After reading about “Predictably..” in this blog I found the book and read it. Amazing book. It helps one understand the world and, more importantly, oneself. Many times I paused while reading and thought in astonishment, “Okay! So this is why I did what I did.” Thanks to this blog for introducing the book to its readers (hope this encourages the bloggers to post again; they haven’t posted anything for a long time).
    There are two issues regarding the book, though. The first is ethical. In a number of experiments the participants were given incomplete or misleading information about the experiment. What about the system of ‘informed consent’? Dan Ariely himself has mentioned the ethical issue while talking about placebos, but that is his personal conflict. It would be interesting to know the official standpoint of an institute like MIT, where most of his experiments were conducted.
    The second issue is that of sampling. Ariely has performed his novel experiments with very small sample sizes, mostly a handful of young MIT scholars, and yet he tries to generalise his results, extending them to the whole human race. A researcher of his stature should have refrained from this sensationalism.
    Anyway, the book is extremely well written and a good introduction to behavioural economics.
