if the knowledge is obvious or not. They must remember what knowledge was available to them at some prior point before the invention had been explained to them.

Gregory Mandel conducted a study using 247 new law students (none had taken any courses yet) who each got to play jury member in a patent law case. Participants were given background information based on jury instruction material from an actual case involving a new method for teaching how to throw different types of baseball pitches (e.g., a fast ball, curve ball, or slider). The materials described an inventor who had been asked to develop a method of teaching that allowed students to learn by holding a real baseball in their hand but did not require one-on-one instruction. Previous technologies included plastic replicas of baseballs with finger-shaped indentations where the fingers were to sit properly for each possible pitch, instructional videos, or cards illustrating the correct finger placement. The inventor proposed simply putting finger-shaped ink marks on real baseballs to illustrate the correct finger placement. In this way, the student could get his hands on a real baseball and make sure he had the proper hold for the pitch. This seems like a completely obvious idea. The technology required has existed as long as there have been baseballs and ink.

Students given this scenario were asked whether prior to the invention a solution to the problem (finding a method using real baseballs) would be obvious. In this case, the solution does seem fairly obvious. In fact, 76 percent of participants given the details of this patent case believed that the solution was entirely obvious. A second group of participants were given the same description of the request that was made of the inventor (produce a method to teach pitches with a real baseball), but were not told the solution. When asked if someone with average knowledge would see an obvious solution, only 24 percent believed they would. Why such a disparity? Once you know that there is such a simple and low-tech solution, it is hard to divorce yourself from that knowledge. In hindsight, the innovation is completely obvious. In foresight, it is a tricky puzzle that may be very difficult to solve. In this case, a jury given the entire case might throw out the patent even if it was not an obvious innovation simply because they are already familiar with the innovation.

Hindsight Bias and the Curse of Knowledge

People have extreme difficulty not letting recent information bias their assessment of prior decisions. This inability to disregard hindsight information is called hindsight bias. The phenomenon of believing one had more knowledge than one truly did can lead to dubious claims. After the fact, a surprising number claim that they knew their team should have prepared for the other team to call a trick play, though very few openly predict the trick play in advance. Economically, hindsight bias can play a significant role in staffing decisions. For example, an employee might propose a well-prepared and well-thought-out strategy that maximizes the expected returns of the strategy subject to some limit on the risks of negative returns given all the information that is available at the time. However, if the scenario that is subsequently realized involves substantial negative returns, a manager suffering from hindsight bias could claim that the outcome was obvious and he always knew that it was a bad idea. Such claims can be stifling in a work setting. Employees might begin to fear proposing anything innovative for fear they will be held responsible for information that is not available at the time a decision must be made. Similarly, courts often find accountants responsible for not anticipating poor outcomes that lead to businesses becoming insolvent. Much of the evidence suggests there is a heavy dose of hindsight bias in these court proceedings.

Hindsight bias is somewhat related to projection bias in that people are unable to project what their decision would be in a different state. However, hindsight bias does not deal with projecting preferences but beliefs. Thus, it should truly fall under the biases discussed in Chapter 7. However, projection bias may be a cause of hindsight bias. For example, a decision maker in a hot state might make decisions with bad consequences. If these consequences were foreseeable in the cold state, the decision maker might suppose the decision was poorly made. However, if the cold-state decision maker were placed in a hot state, he might readily make the decision again.

A close cousin of hindsight bias is the curse of knowledge. The curse of knowledge refers to the phenomenon of believing that others possess the same knowledge you do. The curse of knowledge is key to a whole class of economic problems often referred to as games of asymmetric information, in which one player has access to information that the other player or players cannot observe.

A classic example of asymmetric information is the purchase of a used car. The used-car seller usually has much better information about the condition and reliability of the car than does the buyer. In modeling such games, economists usually assume that the person with private information can accurately assess how much information the other players have. In the case of the used car, a rational seller should be able to recognize that consumers don't know the reliability of the car, and thus the consumer will not be willing to pay very much for the vehicle. Because there is no way to independently verify that the car is reliable, the buyer would necessarily offer less money owing to the risk involved. In this case, if the car is reliable, and therefore valuable, the seller will be better off not selling the car because he could not recover the value. Alternatively, if the car is unreliable, and therefore worthless, the seller will sell the car and receive a low, but fair, price for it.

Suppose instead that the seller suffered from the curse of knowledge. In this case, he would assume the buyer can tell a reliable car from an unreliable car, increasing the price of the reliable car and decreasing the price of the unreliable car. If the buyer continues to be so uncertain of the quality that he would not buy a high-priced car, then the seller will only sell low-quality cars but will sell at a lower price than if he did not suffer from the curse of knowledge.
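
To make this comparison concrete, here is a minimal Python sketch of the two pricing rules. The valuations and the 5,000 "unverified" offer are made-up numbers, and the rational_seller and cursed_seller functions are purely illustrative labels, not anything from the text; the sketch only traces the reasoning in the paragraphs above.

# A minimal sketch of the used-car story above. All numbers are invented
# for illustration; only the comparison between the two sellers matters.

buyer_value = {"reliable": 10_000, "unreliable": 3_000}   # price a fully informed buyer would pay
seller_value = {"reliable": 8_000, "unreliable": 2_000}   # value to the seller of keeping the car
unverified_offer = 5_000  # assumed most the buyer will pay for a car she cannot verify

def rational_seller(quality):
    # Knows the buyer cannot verify quality, so expects only the discounted offer.
    if unverified_offer >= seller_value[quality]:
        return f"sells at {unverified_offer:,}"
    return "keeps the car"

def cursed_seller(quality):
    # Acts as if the buyer can tell quality and asks the informed-buyer price.
    ask = buyer_value[quality]
    # The buyer still cannot verify quality, so she refuses anything above her discounted offer.
    return f"asks {ask:,} and " + ("sells" if ask <= unverified_offer else "makes no sale")

for quality in ("reliable", "unreliable"):
    print(f"{quality:>10} car: rational seller {rational_seller(quality)}; "
          f"cursed seller {cursed_seller(quality)}")

# With these numbers the reliable car never trades, and the cursed seller lets
# the unreliable car go for 3,000 rather than the 5,000 a rational seller gets.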

Colin Camerer, George Loewenstein, and Martin Weber found experimental evidence of hindsight bias in a series of stock-trading experiments. Some participants were asked to predict stock market performance for several companies. Later, others were shown the actual performance of the companies over the predicted time and allowed to study it. Then, while the information was available to them, these participants were given the opportunity to buy or sell shares that would pay dividends based on the predictions made previously by uninformed participants. Trades substantially favored those stocks that performed unusually well in actuality rather than those that had been predicted to perform well. Such problems can lead insiders to conduct illegal trades based on private information. Believing that outsiders have access to the same information can lead them to ignore the potential consequences of trading on insider information, including substantial jail time.

 

 

 

 


 

EXAMPLE 11.7 War in Hindsight

War evokes strong feelings from all parties involved (and often those uninvolved). This is perhaps natural. Consider a British campaign in 1814 against a group of Nepalese. One text¹ describes the conflict this way:

For some years after the arrival of Hastings as governor-general of India, the consolidation of British power involved serious war. The first of these wars took place on the northern frontier of Bengal where the British were faced by the plundering raids of the Gurkhas of Nepal. Attempts had been made to stop the raids by an exchange of lands, but the Gurkhas would not give up their claims to country under British control, and Hastings decided to deal with them once and for all. The campaign began in November, 1814. It was not glorious. The Gurkhas were only some 12,000 strong; but they were brave fighters, fighting in territory well-suited to their raiding tactics. The older British commanders were used to war in the plains where the enemy ran away from a resolute attack. In the mountains of Nepal it was not easy even to find the enemy. The troops and transport animals suffered from the extremes of heat and cold, and the officers learned caution only after sharp reverses. Major-General Sir D. Octerlony was the one commander to escape from these minor defeats.

Given this history, would you guess the conflict resulted in

a. British victory?

b. Gurkha victory?

c. Military stalemate with no peace settlement?

d. Military stalemate with a peace settlement?

Baruch Fischhoff used this historical example in a psychology experiment involving 100 students at Hebrew University in Jerusalem. After the students read the passage, they were asked to assess the probability of each of the four possible outcomes before the beginning of the campaign. One fifth of the subjects were given no information about the outcome of the conflict before the probability-assessment exercise. The others were randomly told that one of the four outcomes had in reality happened. Table 11.2 displays the results of Fischhoff’s experiment. Note that without the simple statement as to what actually occurred, people assessed the probabilities of the events to be fairly even, with British victory and a stalemate with no peace settlement being slightly more probable. Alternatively, when participants were told one of the outcomes had actually occurred, they tended to assess that outcome to be more probable before the beginning of the conflict (this was the case for all but those told the outcome was a stalemate resulting in a peace settlement).
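
As a reading aid, the short Python snippet below takes the assessed probabilities from Table 11.2 (shown below) and, for each informed group, compares its assessment of the outcome it was told about with the no-information group's assessment; it also reports which outcome each group rated most probable, which highlights the exception for the group told of a stalemate with a settlement. The data layout is my own; the numbers are exactly those in the table.

# Probabilities from Table 11.2, ordered as
# [no information, told British victory, told Gurkha victory,
#  told stalemate with no settlement, told stalemate with settlement]
outcomes = ["British victory", "Gurkha victory",
            "Stalemate, no settlement", "Stalemate with settlement"]
assessed = {
    "British victory":           [0.338, 0.572, 0.303, 0.257, 0.330],
    "Gurkha victory":            [0.213, 0.143, 0.384, 0.170, 0.158],
    "Stalemate, no settlement":  [0.323, 0.153, 0.204, 0.480, 0.243],
    "Stalemate with settlement": [0.123, 0.134, 0.105, 0.099, 0.270],
}

for col, told in enumerate(outcomes, start=1):
    baseline = assessed[told][0]    # no-information group's assessment of this outcome
    reported = assessed[told][col]  # assessment by the group told this outcome occurred
    modal = max(outcomes, key=lambda o: assessed[o][col])  # outcome this group rated most likely
    print(f"Told '{told}': assessed it at {reported:.3f} "
          f"(no-information group: {baseline:.3f}); rated most probable: {modal}")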

Table 11.2 Assessments of the Probability of Outcomes of the British–Gurkha Struggle

                                              Average Assessed Probability Given Participant Told
Potential Outcome            No Information   British Victory   Gurkha Victory   Stalemate, No Settlement   Stalemate with Settlement
British victory                   0.338            0.572             0.303                0.257                      0.330
Gurkha victory                    0.213            0.143             0.384                0.170                      0.158
Stalemate, no settlement          0.323            0.153             0.204                0.480                      0.243
Stalemate with settlement         0.123            0.134             0.105                0.099                      0.270

Source: Fischhoff, B. "Hindsight ≠ Foresight: The Effect of Outcome Knowledge on Judgment Under Uncertainty." Journal of Experimental Psychology: Human Perception and Performance 1 (1975): 288–299.

Military leaders fear setbacks in a war not only for the losses entailed directly; they also worry about public opinion. With substantial casualties or other setbacks comes the crowd that claims they should have known better. This has starkly been the case in nearly all modern conflicts. Even in the American Civil War, much of the antiwar movement in the North was galvanized by a string of Union losses in 1861 as Abraham Lincoln struggled to find a general he could work with.

¹ Woodward, E.L. Age of Reform. London: Oxford University Press, 1938, pp. 383–384.

History and Notes

The notion of utility originally had its foundation in the concept of emotion. Jeremy Bentham first proposed the concept of utility in the theory of decision making in the late 18th century. He classified emotions into 26 different categories: 12 that are painful and 14 that are pleasurable. He then considered that the best decision could be determined by calculating the net pleasure (pleasurable emotions minus painful emotions). This was the basis for his proposed cardinal measures of utility. By creating a cardinal (or intrinsic) measure of utility he hoped to find a way to make public policy decisions by a method of utility accounting. A cardinal measure of utility would allow us to know how many utils a particular policy would take from one person in order to make another person better off by so many utils. His particular method of accounting was problematic: People can value different emotions or objects differently. Thus, this emotion-based notion of utility was abandoned for the more abstract notion of revealed preference. Revealed preference supposes that if a person chooses A when he could have chosen B, then he must obtain more utility from A than B. Revealed preference is an ordinal measure of utility and thus abandons the possibility of comparing utility tradeoffs across people. Revealed preference is the primary foundation for rational models of decision making. Ultimately, Bentham's notion of cardinal utility led to modern welfare economics. Modern welfare economics sometimes assumes a social welfare function, a function that represents the aggregate well-being of all actors in an economy. More often, welfare analysis is conducted using a revealed preference approach and theories such as Pareto efficiency that do not rely on finding a cardinal measure of utility.
