• Traditionally, people are expected to maximise their utility under several conditions.
– First, they should be well informed about their choice.
– Second, they should be fully aware of the consequences of the choice they have made.
– And finally, in determining such a choice, they should pursue their wishes in a way that is logically consistent (for example, if A is preferred to B, and B is preferred to C, then A should be preferred to C).

Bounded rationality, bounded willpower, and bounded self-interest
• Bounded rationality occurs because people often use heuristics to make their judgements, which, although they help people to make fast decisions, lead to errors in some circumstances.
• Bounded willpower occurs because people often make decisions that they know to be in conflict with their long-term interests.
• People may disregard the probability of future events when they have to make a judgement about such events.
– Hence, in the face of uncertainty, people tend to base their judgement on “rules of thumb”, referred to as “heuristics”: shortcuts that help people to make a decision when information is incomplete or uncertain.
– Although much of the time such shortcuts provide roughly correct answers, they also often lead to systematic errors (biases).

Availability heuristics
• The “retrievability of instances”: familiar and salient instances are easier to retrieve than unfamiliar and less salient ones. Biases occur since instances that are faster and easier to recall from memory will appear to be more numerous than instances that are less retrievable.
• The imaginability of events: the occurrence of an event will appear more likely when it can be easily imagined.
– Therefore, it is argued that activities whose risks can easily be portrayed will appear to be more dangerous than activities whose risks are difficult to imagine.
• The “illusory correlation” factor may contribute to the availability heuristic. Under this bias, one judges that two things occur together, even though they do not, because they are seen as associated.
– When people think two events are closely associated, they expect that one event will take place as a consequence of the occurrence of the other.
Representativeness
• People consider an instance as representative of a population based on the similarity of this instance to the population.
– “Insensitivity to prior probability of outcomes”:
• For example, people tend to assess that a person belongs to a certain occupation, for instance an engineer rather than, say, a lawyer, because this person’s similarity to a stereotypical description of that occupation is taken as representative. In this case, information about probability, namely that there are more lawyers than engineers, does not significantly alter people’s judgement.
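To make the missing step explicit: the probability that the person is an engineer should combine the evidence with the base rate via Bayes’ rule. The numbers below are illustrative (Tversky and Kahneman’s experiment used a similar 30/70 split):

\[
P(E \mid D) = \frac{P(D \mid E)\,P(E)}{P(D \mid E)\,P(E) + P(D \mid L)\,P(L)}
\]

With a base rate of 30 engineers per 100, and a description assumed four times as likely for an engineer as for a lawyer:

\[
P(E \mid D) = \frac{4 \times 0.3}{4 \times 0.3 + 1 \times 0.7} = \frac{1.2}{1.9} \approx 0.63
\]

far from certainty. Subjects instead judged by similarity alone, as if the prior \(P(E)\) were irrelevant.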
– “Gambler’s fallacy”: after observing a long run of one outcome (e.g., red on a roulette wheel), the gambler feels that the fairness of the wheel makes the opposite outcome “due”, as if chance were a self-correcting process.
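Formally, the fallacy contradicts independence: for a fair coin or wheel, past outcomes do not change future probabilities. For instance:

\[
P(\text{heads} \mid \text{five tails in a row}) = P(\text{heads}) = \tfrac{1}{2}
\]

The process has no memory, so no “correcting” deviation is ever due.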
– “Insensitivity to predictability”:
• People use information that is actually not a good predictor, without questioning the accuracy and relevance of this information to their prediction.
– Tversky and Kahneman give an empirical example where subjects were presented with several paragraphs describing the performance of a teacher during a particular lesson. Subjects were asked to evaluate the teacher’s performance based on the provided description; they were also asked to predict the standing of the teacher’s career five years later. The predictions turned out to be as extreme as the evaluations, even though a single lesson is a poor basis for long-term prediction.
Adjustment and anchoring
• If people are given a value and then are asked to give their judgement, they often adjust their judgement from this initial value (referred to as the “anchor”), even when this value is irrelevant to their judgement. In short, people’s judgements are biased toward their initial values, which are used as the starting point for their judgement.
– Tversky and Kahneman present an experiment where subjects were asked to estimate the percentage of African countries in the UN. A number between 0 and 100 was determined randomly by spinning a wheel of fortune. Subjects were first asked to say whether the percentage was higher or lower than the randomly chosen number (the “anchor”), and then to estimate the actual value. The arbitrary anchors markedly influenced the estimates: groups that received 10 and 65 as anchors gave median estimates of 25 and 45, respectively.
Prospect Theory, Endowment Effect, and Status Quo Bias
• Respondents were asked to imagine that the U.S. was threatened by an unusual Asian disease, which was expected to kill 600 people. A choice had to be made between two alternative programs with different consequences as follows:
– If Program A is adopted, 200 people will be saved.
– If Program B is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved.
• A second group of respondents was given the same story with the following consequences:
– If Program C is adopted, 400 people will die.
– If Program D is adopted, there is a 1/3 probability that nobody will die, and a 2/3 probability that 600 people will die.
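Note that the two pairs of programs are numerically identical; a quick expected-value check using the figures above shows that only the framing differs:

\[
E[\text{saved} \mid A] = 200, \qquad E[\text{saved} \mid B] = \tfrac{1}{3}(600) + \tfrac{2}{3}(0) = 200
\]
\[
E[\text{deaths} \mid C] = 400, \qquad E[\text{deaths} \mid D] = \tfrac{1}{3}(0) + \tfrac{2}{3}(600) = 400
\]

Since saving 200 of 600 means 400 die, Program A describes the same outcome as Program C, and B the same as D.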
• Kahneman and Tversky conclude that respondents tended to be risk averse in the “lives saved” version and risk-seeking in the “lives lost” version.
– This means that the framing of the outcomes determines people’s preferences.
• If the outcomes are framed in terms of gains (such as the number of lives saved), people will be risk averse and prefer the prospect that offers more certain outcomes.
• Kahneman and Tversky also argue that people’s evaluations are influenced by the way they weight probabilities, referred to as the “certainty effect”: a phenomenon in which people overweight outcomes that are considered certain, relative to outcomes that are merely “probable”.
– 80% of the respondents preferred a certain win of $3,000 to a 0.80 chance to win $4,000, although the latter has a higher expected value.
– However, if the probabilities of winning are “possible but not probable” (e.g. a 0.001 chance to win $6,000 and a 0.002 chance to win $3,000), most people will choose the prospect that offers the larger gain (namely the 0.001 chance to win $6,000).
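The arithmetic behind these two choices, using the figures above:

\[
0.80 \times \$4{,}000 = \$3{,}200 > \$3{,}000
\]

yet most respondents take the certain $3,000, overweighting certainty; while

\[
0.001 \times \$6{,}000 = \$6 = 0.002 \times \$3{,}000
\]

so the two long-shot prospects have equal expected value, and most respondents take the one with the larger prize.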
• Loss aversion: the response to losses is more extreme than the response to gains.
– Changes that make things worse (such as losses) loom larger than improvements or gains.
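This asymmetry is usually captured by the prospect-theory value function, defined over gains and losses rather than final wealth. A commonly cited parameterization, using the median estimates from Tversky and Kahneman’s (1992) cumulative prospect theory (the specific numbers are theirs, not this text’s), is:

\[
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \\
-\lambda(-x)^{\beta} & \text{if } x < 0
\end{cases}
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
\]

The loss-aversion coefficient \(\lambda > 1\) makes the curve steeper for losses: a $100 loss is felt roughly 2.25 times as strongly as a $100 gain.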
• An important implication of loss aversion is the “status quo bias”:
– people’s strong tendency to remain at the status quo, because the disadvantages of leaving it loom larger than its advantages.
• An alternative becomes significantly more popular when it is designated as the status quo.
• People may have an exaggerated preference for the option set as the default choice.
– Examples: organ donors in the USA vs. Europe; insurance schemes.
• “Endowment effect”: people ascribe more value to things merely because they own them.
• In one experiment, Kahneman, Knetsch, and Thaler asked students to trade among themselves. The objects of trade were mugs (in the full study, “induced-value tokens” were also traded as a control); students knew that a similar mug was sold at a price of $6.00. Half of the students were made the owners of mugs, and the other half were not, so that supply of and demand for the mugs were created.
• After several trials, the median owner was unwilling to sell for less than $5.25, while the median buyer was unwilling to pay more than $2.25-$2.75. This gap between willingness to accept (WTA) and willingness to pay (WTP) demonstrates the endowment effect: ownership alone roughly doubled the mugs’ value to their holders.
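As a rough illustration (this comparison is not drawn from the original study), the gap matches the loss-aversion coefficient sketched above:

\[
\frac{\text{WTA}}{\text{WTP}} \approx \frac{\$5.25}{\$2.50} \approx 2.1
\]

close to \(\lambda \approx 2.25\), consistent with owners coding the sale of their mug as a loss.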
Public Perception of Risk v. Experts’ Opinion
• Risk perception or risk attitude
• Biases:
– Status quo
– Availability heuristics
– The mythical benevolence of nature
– Probability neglect
– System neglect (Sunstein)
– Plus representativeness bias, anchoring, etc.
• Breyer: compared to experts, the public’s reactions to risks reflect a different understanding of the underlying risk-related facts, such as:
– Rules of thumb (heuristics)
– Prominence
– Ethics, the strength of which diminishes with distance
– Mathematics:
• Framing
• Slovic: admits some problems in public perception of risk
– Heuristics
– Amplification of risk (the ripple effect)
– Fixed beliefs
• However, there are qualitative attributes to risk (the psychometric paradigm):
– Slovic: risk perception is influenced by:
• Perceived “severity” of risk, consisting of 12 characteristics, namely: not controllable, dread, globally catastrophic, hard to prevent, certain to be fatal, risks and benefits inequitable, catastrophic, threatens future generations, not easily reduced, risks increasing, involuntary, and affects the respondent personally.
• Perceived familiarity: not observable, unknown to those exposed, effects delayed, new (unfamiliar), unknown to science.
• Perceived number of people exposed to the risk.
• Fischhoff et al. show that the level of acceptable risk is not only a function of benefit and voluntariness, but also of other characteristics such as perceived control, familiarity, knowledge, and immediacy.