Hindsight Bias and the Curse of Knowledge | 303
if the knowledge is obvious or not. They must remember what knowledge was available to them at some prior point before the invention had been explained to them.
Gregory Mandel conducted a study using 247 new law students (none had yet taken any courses), each of whom played a jury member in a patent law case. Participants were given background information based on jury instruction material from an actual case involving a new method for teaching how to throw different types of baseball pitches (e.g., a fastball, curveball, or slider). The materials described an inventor who had been asked to develop a teaching method that allowed students to learn while holding a real baseball in their hand but did not require one-on-one instruction. Previous technologies included plastic replicas of baseballs with finger-shaped indentations showing where the fingers should sit for each possible pitch, instructional videos, and cards illustrating the correct finger placement. The inventor proposed simply putting finger-shaped ink marks on real baseballs to illustrate the correct finger placement. In this way, the student could get his hands on a real baseball and make sure he had the proper hold for the pitch. This seems like a completely obvious idea: the technology required has existed as long as there have been baseballs and ink.
Students given this scenario were asked whether, prior to the invention, a solution to the problem (finding a method using real baseballs) would have been obvious. In this case, the solution does seem fairly obvious. In fact, 76 percent of participants given the details of this patent case believed that the solution was entirely obvious. A second group of participants was given the same description of the request made of the inventor (produce a method to teach pitches with a real baseball) but was not told the solution. When asked whether someone with average knowledge would see an obvious solution, only 24 percent believed so. Why such a disparity? Once you know that there is such a simple and low-tech solution, it is hard to divorce yourself from that knowledge. In hindsight, the innovation is completely obvious. In foresight, it is a tricky puzzle that may be very difficult to solve. Thus, a jury given the entire case might throw out the patent even though the innovation was not obvious beforehand, simply because they are already familiar with it.
Hindsight Bias and the Curse of Knowledge
People have extreme difficulty preventing recent information from biasing their assessment of prior decisions. This inability to disregard hindsight information is called hindsight bias. The phenomenon of believing one had more knowledge than one truly did can lead to dubious claims. After the fact, a surprising number claim that they knew their team should have prepared for the other team to call a trick play, though very few openly predict the trick play in advance. Economically, hindsight bias can play a significant role in staffing decisions. For example, an employee might propose a well-prepared and well-thought-out strategy that maximizes expected returns subject to some limit on the risk of negative returns, given all the information available at the time. However, if the scenario that is subsequently realized involves substantial negative returns, a manager suffering from hindsight bias could claim that the outcome was obvious and that he always knew it was a bad idea. Such claims can be stifling in a work setting. Employees might begin to fear proposing anything innovative for fear they will
be held responsible for information that is not available at the time a decision must be made. Similarly, courts often find accountants responsible for not anticipating poor outcomes that lead to businesses becoming insolvent. Much of the evidence suggests there is a heavy dose of hindsight bias in these court proceedings.
Hindsight bias is somewhat related to projection bias in that people are unable to project what their decision would be in a different state. However, hindsight bias involves projecting not preferences but beliefs; thus, it properly falls under the biases discussed in Chapter 7. Nonetheless, projection bias may be a cause of hindsight bias. For example, a decision maker in a hot state might make decisions with bad consequences. If these consequences were foreseeable in the cold state, the cold-state decision maker might suppose the decision was poorly made. However, if that same decision maker were placed back in a hot state, he might readily make the decision again.
A close cousin of hindsight bias is the curse of knowledge. The curse of knowledge refers to the phenomenon of believing that others possess the same knowledge you do. The curse of knowledge is key to a whole class of economic problems often referred to as games of asymmetric information, in which one player has access to information that the other player or players cannot observe.
A classic example of asymmetric information is the purchase of a used car. The used-car seller usually has much better information about the condition and reliability of the car than does the buyer. In modeling such games, economists usually assume that the person with private information can accurately assess how much information the other players have. In the case of the used car, a rational seller should recognize that the buyer does not know the reliability of the car and thus will not be willing to pay very much for the vehicle. Because there is no way to independently verify that the car is reliable, the buyer necessarily offers less money owing to the risk involved. In this case, if the car is reliable, and therefore valuable, the seller is better off not selling the car, because he cannot recover its value. Alternatively, if the car is unreliable, and therefore worthless, the seller will sell the car and receive a low, but fair, price for it.
Suppose instead that the seller suffered from the curse of knowledge. In this case, he would assume the buyer can tell a reliable car from an unreliable car, increasing the price of the reliable car and decreasing the price of the unreliable car. If the buyer continues to be so uncertain of the quality that he would not buy a high-priced car, then the seller will only sell low-quality cars but will sell at a lower price than if he did not suffer from the curse of knowledge.
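The seller's reasoning in the last two paragraphs can be sketched with a small numerical example. All dollar values and the 50–50 prior below are hypothetical assumptions chosen for illustration; they do not come from the text.

```python
# Illustrative sketch of the used-car example. All values are hypothetical
# assumptions, not figures from the text.

RELIABLE_VALUE = 10_000   # buyer's value for a reliable car (assumed)
UNRELIABLE_VALUE = 2_000  # buyer's value for an unreliable car (assumed)
P_RELIABLE = 0.5          # buyer's prior that a given car is reliable (assumed)

# An uninformed buyer will pay at most the expected value of a random car.
buyer_max_offer = P_RELIABLE * RELIABLE_VALUE + (1 - P_RELIABLE) * UNRELIABLE_VALUE

def rational_seller_sells(car_value, offer=buyer_max_offer):
    """A rational seller recognizes the buyer's uncertainty and sells
    only when the uninformed offer covers the car's true value."""
    return offer >= car_value

def cursed_seller_price(car_value):
    """A seller with the curse of knowledge assumes the buyer shares his
    information, so he asks the car's full-information value."""
    return car_value

print(buyer_max_offer)                           # 6000.0
print(rational_seller_sells(RELIABLE_VALUE))     # False: reliable cars are withheld
print(rational_seller_sells(UNRELIABLE_VALUE))   # True: only "lemons" trade
# The cursed seller asks 10,000 for a reliable car, which exceeds what the
# uncertain buyer will pay, so again only low-quality cars change hands.
print(cursed_seller_price(RELIABLE_VALUE) > buyer_max_offer)  # True
```

The sketch shows why the two sellers end up in the same place: whether the reliable car is withheld (rational seller) or priced above the uninformed buyer's maximum offer (cursed seller), only the unreliable cars trade.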
Colin Camerer, George Loewenstein, and Martin Weber found experimental evidence of the curse of knowledge in a series of stock-trading experiments. Some participants were asked to predict stock market performance for several companies. Later, other participants were shown the actual performance of the companies over the prediction period and allowed to study it. Then, while this information was still available to them, these participants were given the opportunity to buy or sell shares that would pay dividends based on the predictions made previously by the uninformed participants. Trades substantially favored the stocks that had actually performed unusually well rather than those that had been predicted to perform well. Similar problems can lead insiders to conduct illegal trades based on private information: believing that outsiders have access to the same information can lead them to ignore the potential consequences of trading on insider information, including substantial jail time.
EXAMPLE 11.7 War in Hindsight
War evokes strong feelings from all parties involved (and often from those uninvolved). This is perhaps natural. Consider a British campaign in 1814 against a group of Nepalese. One text¹ describes the conflict this way:
For some years after the arrival of Hastings as governor-general of India, the consolidation of British power involved serious war. The first of these wars took place on the northern frontier of Bengal where the British were faced by the plundering raids of the Gurkhas of Nepal. Attempts had been made to stop the raids by an exchange of lands, but the Gurkhas would not give up their claims to country under British control, and Hastings decided to deal with them once and for all. The campaign began in November, 1814. It was not glorious. The Gurkhas were only some 12,000 strong; but they were brave fighters, fighting in territory well-suited to their raiding tactics. The older British commanders were used to war in the plains where the enemy ran away from a resolute attack. In the mountains of Nepal it was not easy even to find the enemy. The troops and transport animals suffered from the extremes of heat and cold, and the officers learned caution only after sharp reverses. Major-General Sir D. Octerlony was the one commander to escape from these minor defeats.
Given this history, would you guess the conflict resulted in
a. British victory?
b. Gurkha victory?
c. Military stalemate with no peace settlement?
d. Military stalemate with a peace settlement?
Baruch Fischhoff used this historical example in a psychology experiment involving 100 students at Hebrew University in Jerusalem. After the students read the passage, they were asked to assess the probability, as it would have stood before the beginning of the campaign, of each of the four possible outcomes. One fifth of the subjects were given no information about the outcome of the conflict before the probability-assessment exercise. The others were randomly told that one of the four outcomes had in fact happened. Table 11.2 displays the results of Fischhoff's experiment. Note that without a simple statement of what actually occurred, people assessed the probabilities of the events to be fairly even, with a British victory and a stalemate with no peace settlement judged slightly more probable. When participants were instead told that one of the outcomes had actually occurred, they tended to assess that outcome as having been more probable before the beginning of the conflict (the exception was the group told the outcome was a stalemate resulting in a peace settlement, which still rated a British victory as the most likely outcome).
Military leaders fear setbacks in a war not only for the losses entailed directly; they also worry about public opinion. With substantial casualties or other setbacks comes the
1 Woodward, E.L. Age of Reform. London: Oxford University Press, 1938, pp. 383–384.
Table 11.2 Assessments of the Probability of Outcomes of the British–Gurkha Struggle

The last four columns give the average assessed probability among participants told that the named outcome had occurred.

| Potential Outcome | No Information | British Victory | Gurkha Victory | Stalemate, No Settlement | Stalemate with Settlement |
|---|---|---|---|---|---|
| British victory | 0.338 | 0.572 | 0.303 | 0.257 | 0.330 |
| Gurkha victory | 0.213 | 0.143 | 0.384 | 0.170 | 0.158 |
| Stalemate, no settlement | 0.323 | 0.153 | 0.204 | 0.480 | 0.243 |
| Stalemate with settlement | 0.123 | 0.134 | 0.105 | 0.099 | 0.270 |

Source: Fischhoff, B. "Hindsight ≠ Foresight: The Effect of Outcome Knowledge on Judgment Under Uncertainty." Journal of Experimental Psychology: Human Perception and Performance 1 (1975): 288–299.
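The hindsight pattern can be read directly off the numbers in Table 11.2: for each group told an outcome, compare its assessed probability of that outcome with the no-information baseline. A minimal sketch in Python using the table's values:

```python
# Probabilities from Table 11.2 (Fischhoff, 1975).
outcomes = ["British victory", "Gurkha victory",
            "Stalemate, no settlement", "Stalemate with settlement"]

# Assessments by the uninformed (no-information) group.
no_info = {"British victory": 0.338, "Gurkha victory": 0.213,
           "Stalemate, no settlement": 0.323, "Stalemate with settlement": 0.123}

# told[o] = assessments by the group told that outcome o had occurred.
told = {
    "British victory":           {"British victory": 0.572, "Gurkha victory": 0.143,
                                  "Stalemate, no settlement": 0.153, "Stalemate with settlement": 0.134},
    "Gurkha victory":            {"British victory": 0.303, "Gurkha victory": 0.384,
                                  "Stalemate, no settlement": 0.204, "Stalemate with settlement": 0.105},
    "Stalemate, no settlement":  {"British victory": 0.257, "Gurkha victory": 0.170,
                                  "Stalemate, no settlement": 0.480, "Stalemate with settlement": 0.099},
    "Stalemate with settlement": {"British victory": 0.330, "Gurkha victory": 0.158,
                                  "Stalemate, no settlement": 0.243, "Stalemate with settlement": 0.270},
}

for o in outcomes:
    boost = told[o][o] - no_info[o]            # hindsight inflation vs. baseline
    modal = max(told[o], key=told[o].get)      # outcome this group rates most likely
    print(f"{o}: told-group assessment {told[o][o]:.3f} "
          f"(baseline {no_info[o]:.3f}, boost {boost:+.3f}); modal outcome: {modal}")
```

Running this shows every told group inflating the probability of its reported outcome relative to the baseline, although the group told of a stalemate with a settlement still ranks a British victory as the most likely outcome.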
crowd that claims they should have known better. This has starkly been the case in nearly all modern conflicts. Even in the American Civil War, much of the antiwar movement in the North was galvanized by a string of Union losses in 1861 as Abraham Lincoln struggled to find a general he could work with.
History and Notes
The notion of utility originally had its foundation in the concept of emotion. Jeremy Bentham first proposed the concept of utility in the theory of decision making in the late 18th century. He classified emotions into 26 different categories: 12 that are painful and 14 that are pleasurable. He then proposed that the best decision could be determined by calculating the net pleasure (pleasurable emotions minus painful emotions). This was the basis for his proposed cardinal measure of utility. By creating a cardinal (or intrinsic) measure of utility he hoped to find a way to make public policy decisions by a method of utility accounting. A cardinal measure of utility would allow us to know how many utils a particular policy would take from one person in order to make another person better off by so many utils.

His particular method of accounting was problematic: people can value different emotions or objects differently. Thus, this emotion-based notion of utility was abandoned for the more abstract notion of revealed preference. Revealed preference supposes that if a person chooses A when he could have chosen B, then he must obtain more utility from A than from B. Revealed preference is an ordinal measure of utility and thus abandons the possibility of comparing utility tradeoffs across people. It is the primary foundation for rational models of decision making. Ultimately, Bentham's notion of cardinal utility led to modern welfare economics, which sometimes assumes a social welfare function, a function representing the aggregate well-being of all actors in an economy. More often, welfare analysis is conducted using a revealed-preference approach and concepts such as Pareto efficiency that do not rely on finding a cardinal measure of utility.