Week 1
Judgments: attaching a value to something; estimates, evaluations and opinions.
Decisions: choosing between alternatives.
3 types of models:
1. Normative models, how people should make judgments and decisions: a rational process
that uses the available information in an optimal way to maximize the expected net outcome.
Bazerman & Moore describe a six-step rational decision-making process: 1) define the problem,
2) identify the criteria, 3) weigh the criteria, 4) generate alternatives, 5) rate each alternative
on each criterion and 6) compute the optimal decision.
2. Descriptive models, how people actually make judgments and decisions
3. Prescriptive models, practical suggestions on designing judgment and decision-making
processes based on normative and descriptive models.
Predictably irrational: people do not make decisions normatively, but deviate from the normative
framework in a systematic way.
We are well prepared to make fast and simple decisions in a stable, predictable environment with
relatively few homogeneous actors and to spend no more energy than strictly necessary. We are not
well prepared to make complex economic decisions in dynamic, unpredictable environments with
many heterogeneous actors. When we face such decisions, we cannot simply ignore our human
nature and follow a rational process.
Dual process theory:
- System 1, our intuitive, fast, automatic, emotional and effortless decision making. It is
employed when we interpret verbal language, process visual information, talk, etc.
- System 2 is slow, effortful, conscious and explicit. You know you are thinking; you are
deliberately making a decision (consciously monitored, open to doubt).
Both systems are always on, System 2 typically in a comfortable low effort mode. System 1
continuously provides System 2 with impressions, intuitions and feelings. System 2 is only mobilized
when System 1 runs into difficulties. The division of labor is highly efficient.
System 1 isn’t designed for complete decision making and tends to make decisions that are
suboptimal. System 2 will often do a better job, but: (1) System 2 relies on System 1 for input, (2)
System 1 cannot be switched off and (3) System 2 may not be employed frequently enough.
In sum, even if we try to be rational, we are unlikely to succeed. Our judgments and decisions will
deviate from normative benchmarks.
Central notion: accessibility, how easily something comes to mind. Perceptions and intuitions come
to mind effortlessly.
Our brain is easily satisfied, it tends to jump to conclusions. The easily accessible perceptions,
impressions and intuitions from System 1 have a disproportionate weight on our judgments and
decisions. Cognitive ease feels good. Doubt and uncertainty are difficult and often feel bad.
Your brain tends to take immediate action to avoid cognitive dissonance and to restore your familiar
and coherent notion of the world (and of yourself as a person). You might not even become aware of
information that, from a normative perspective, requires you to change your mind.
Determinants of accessibility: (a) Perceptual salience, (b) Surprisingness, (c) Familiarity and (d) When
things are: 1) associated with emotion, 2) associated with potential losses and 3) in line with our
current ‘mindset’.
Instead of answering a difficult question (using System 2), our first impulse is to find a more easily
accessible answer to a related question using System 1. We substitute an easier question for the
difficult one. A.k.a. heuristics.
Heuristics, relatively simple procedures that help find adequate, though often imperfect, answers to
difficult questions. 4 main heuristics:
1. Availability, when things that come to mind more easily are judged to be more likely,
frequent or probable.
2. Representativeness, (probability) judgments are influenced by how typical something is for
its ‘category’. E.g. a business idea is more likely to be accepted if the pitcher looks like Jeff
Bezos, because the association with him links the idea to Bezos’ success with Amazon.
3. Confirmation, judgments are influenced by what we expect. We interpret new information
as confirming our existing beliefs.
4. Affect, judgments are affected by our emotional state or mood.
Normative models, Baron (2004): any normative model needs to start from the simple idea that
some outcomes are better than others. No claim about absolute truth, but ‘truth relative to
assumptions’. Normative models arise through the ‘imposition of an analytic scheme’.
Utility, whatever is maximized (‘good’, ‘goodness’). Normative models don’t tell us what should be
maximized; they tell us what should be done to maximize whatever it is we are trying to maximize.
Humans are assumed to have ‘utility functions’, things contribute positively or negatively to their
utility (e.g. agency theory: positive utility from wealth, negative from effort).
Two core assumptions of utility:
1. Transitivity: If A > B and B > C, then A > C.
2. Connectedness: it’s either the case that A > B, or that A < B, or that A = B. So, it’s possible to
compare A and B.
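These two assumptions can be illustrated with a small sketch. The Python snippet below is not part of the course material; the preference set is a hypothetical example, used only to show what checking transitivity and connectedness for strict preferences would look like.

```python
# Minimal sketch with a hypothetical preference set: pairs (x, y) mean "x > y".
from itertools import permutations

options = {"A", "B", "C"}
preferences = {("A", "B"), ("B", "C"), ("A", "C")}

def prefers(x, y):
    return (x, y) in preferences

# Connectedness: any two distinct options can be compared
# (indifference is ignored here to keep the sketch short).
connected = all(prefers(x, y) or prefers(y, x)
                for x, y in permutations(options, 2))

# Transitivity: whenever x > y and y > z, x > z must also hold.
transitive = all(prefers(x, z)
                 for x, y, z in permutations(options, 3)
                 if prefers(x, y) and prefers(y, z))

print(connected, transitive)  # True True for the set above
```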
Expected-utility theory (EUT), individuals should choose the option with the highest expected utility.
In the absence of uncertainty this is straightforward, but when there is uncertainty, you should
calculate which option has the highest expected utility:
Expected utility = probability of outcome × utility of outcome, summed over all possible outcomes
In a firm, employees should always take the action that yields the highest expected net present
value of all future cash flows. However, you can’t be certain which cash flows will result from an
action. Decisions must be based on probabilities.
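As a rough illustration of that idea, the sketch below uses made-up numbers (not from the notes) to compute the expected utility of two options under uncertainty and pick the one with the highest value.

```python
# Minimal sketch with hypothetical numbers: the expected utility of an option is
# the probability-weighted sum of the utilities of its possible outcomes.
options = {
    "safe project":  [(1.0, 50)],                  # (probability, utility of outcome)
    "risky project": [(0.6, 100), (0.4, -20)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

for name, outcomes in options.items():
    print(name, expected_utility(outcomes))        # safe project 50.0, risky project 52.0

best = max(options, key=lambda name: expected_utility(options[name]))
print("choose:", best)                             # choose: risky project
```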
A probability, according to Savage (1954), can be:
- Logically necessary: a fair die must land on one of its six sides.
- Objective (relative frequencies): a fair die thrown a million times will land equally frequently
on each of its sides.
- Personal: what will happen in my specific, unique situation.
For most real-world decisions we need to rely on personal probability. Objective probability is helpful
in determining personal probability, but not more than that: Your specific situation is unique in ways
that are impossible to know. You face a one-off decision, not an infinite number of decisions.
If it’s coherent, personal probability can still play an important role in normative models of judgment
and decision making.
Coherent personal probabilities:
- Additivity:
▪ If A and B are mutually exclusive, then p(A) + p(B) = p(A or B) (e.g. the probability of
a die landing on 5 or 6 = 1/6 + 1/6 = 1/3).
▪ The probability of a statement being true and the probability of that same statement
being false are mutually exclusive and add up to one: p(A) + p(not A) = 1.
- Multiplication:
▪ The probability that A and B are both true is p(A & B) = p(A|B) * p(B).
▪ Two statements A and B are independent if knowing about the truth of one doesn’t
tell you anything about the truth of the other: p(A|B) = p(A), in which case
p(A & B) = p(A) * p(B).
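The sketch below (hypothetical numbers, not from the notes) shows what it means for a set of personal probabilities to be coherent under these rules.

```python
# Minimal sketch with hypothetical personal probabilities.
p_A = 0.30           # p(A)
p_not_A = 0.70       # p(not A)
p_B = 0.50           # p(B)
p_A_given_B = 0.40   # p(A|B)
p_A_and_B = 0.20     # p(A & B)

# Additivity: p(A) + p(not A) must equal 1.
print(abs(p_A + p_not_A - 1.0) < 1e-9)            # True -> coherent

# Multiplication: p(A & B) must equal p(A|B) * p(B).
print(abs(p_A_and_B - p_A_given_B * p_B) < 1e-9)  # True -> coherent

# Independence would require p(A|B) = p(A); here 0.40 != 0.30,
# so knowing B does tell us something about A.
print(abs(p_A_given_B - p_A) < 1e-9)              # False -> not independent
```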
Bayes’ theorem, the additivity and multiplication rules together imply:
p(A|B) = p(B|A) * p(A) / p(B)
The theorem provides a normative model for how we should update our beliefs
(personal probabilities) in the presence of new information. It tells us the
probability of A given B when we know p(B|A), p(A) and p(B).
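As a worked illustration (hypothetical numbers, not from the notes), the sketch below updates the belief that a project will succeed after a positive market test.

```python
# Minimal sketch of Bayesian updating with made-up numbers.
p_A = 0.20              # prior belief: p(project succeeds)
p_B_given_A = 0.90      # p(positive market test | success)
p_B_given_not_A = 0.30  # p(positive market test | failure)

# p(B) follows from the additivity and multiplication rules.
p_B = p_B_given_A * p_A + p_B_given_not_A * (1 - p_A)

# Bayes' theorem: p(A|B) = p(B|A) * p(A) / p(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 3))  # 0.429 -> the positive test roughly doubles the prior of 0.20
```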
To conclude, normative decision making: list all possible courses of action and choose the one with
the highest expected utility. If new information becomes available, update your beliefs using Bayes’
theorem and repeat.
Week 2
A naïve utility function (when people are indifferent): [figure omitted]
However, people are not always naïve. People attach value to things in 4 ways:
1. Risk aversion and ambiguity aversion.
Risk aversion, valuing a sure thing higher than a risky thing, even though the expected
monetary value of the two is the same. A common assumption in classic economic models.
The level of risk aversion varies between individuals. Firm owners are generally assumed to
be risk neutral, as they can diversify their portfolios.
The Expected Utility Theory (EUT) utility function: utility increases with wealth, but at a decreasing
rate (concave) → implies risk aversion. Utility increasing with wealth at an increasing rate (convex) →
implies risk loving.
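A small sketch can make the link between curvature and risk attitude concrete. The utility function below, u(w) = sqrt(w), is a hypothetical example, not one from the course material.

```python
# Minimal sketch: a concave utility function such as u(w) = sqrt(w) values a sure
# amount above a gamble with the same expected monetary value -> risk aversion.
from math import sqrt

def u(wealth):
    return sqrt(wealth)  # concave: utility rises with wealth at a decreasing rate

# A 50/50 gamble between 0 and 100 vs. a sure 50 (same expected monetary value).
eu_gamble = 0.5 * u(0) + 0.5 * u(100)  # 5.0
u_sure = u(50)                         # ~7.07

print(eu_gamble < u_sure)  # True: the sure thing is preferred -> risk aversion
# A convex function (e.g. u(w) = w**2) reverses the inequality -> risk loving.
```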