Ken Larsen's web site - Biases

 

What percentage of humans are unbiased? 

  • 25%?

  • 20%? 

  • 15%? 

  • 10%? 

  • 5%?

  • None

From what I've learned, the correct answer has to be None.

Here is a list of psychology terms which describe 191 different biases that humans are guilty of:  https://en.wikipedia.org/wiki/List_of_cognitive_biases

The probability that any of us has none of the 191 biases has got to be vanishingly small. 
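To get a rough sense of why, here is a back-of-the-envelope sketch.  Suppose, purely for illustration, that the biases were independent and that each of us had some fixed chance of being free of any single one of them.  The probabilities below (0.99, 0.95, 0.90) are assumptions made up for the sketch, not measurements:

```python
# Illustrative sketch only: assumes the 191 biases are independent and that a
# person escapes each individual bias with the same (made-up) probability.
NUM_BIASES = 191  # count from the Wikipedia list linked above

def chance_of_no_biases(p_free_of_one: float) -> float:
    """Probability of escaping all 191 biases under the independence assumption."""
    return p_free_of_one ** NUM_BIASES

for p in (0.99, 0.95, 0.90):
    print(f"escape one bias with probability {p:.2f} -> "
          f"escape all {NUM_BIASES}: {chance_of_no_biases(p):.2e}")

# Approximate output:
#   escape one bias with probability 0.99 -> escape all 191: 1.47e-01
#   escape one bias with probability 0.95 -> escape all 191: 5.58e-05
#   escape one bias with probability 0.90 -> escape all 191: 1.82e-09
```

Even under the most generous of these assumptions, only about one person in seven would be bias-free; under anything less generous, the figure collapses toward zero.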

The conclusion I draw from this list is that future generations might have to entrust governance of our world to computers with artificial intelligence.  The 1963 Twilight Zone episode "The Old Man in the Cave" addressed this issue.  It was prescient.

To explain my point further, below is a subset of the full list: the ones I've observed at work in my home town of Chapel Hill in my role as a political activist.

1

Law of the instrument

An over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches.  “If all you have is a hammer, everything looks like a nail.”

2

Reactive devaluation

Devaluing proposals only because they purportedly originated with an adversary.

3

System justification

The tendency to defend and bolster the status quo.  Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest.  (See also status quo bias.)

4

Not invented here

Aversion to contact with, or use of, products, research, standards, or knowledge developed outside a group.  Related to the IKEA effect.

5

Groupthink

 The psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome.  Group members try to minimize conflict and reach a consensus decision without critical evaluation of alternative viewpoints by actively suppressing dissenting viewpoints, and by isolating themselves from outside influences.

6

Confirmation bias

 The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.

7

Courtesy bias

 The tendency to give an opinion that is more socially correct than one’s true opinion, so as to avoid offending anyone.

8

Bias blind spot

 The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself.

9

Neglect of probability

The tendency to completely disregard probability when making a decision under uncertainty.

10

Backfire effect

The reaction to disconfirming evidence by strengthening one’s previous beliefs.  Cf. continued influence effect.

11

Anchoring or focalism

 The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).

12

Conservatism (belief revision)

The tendency to revise one’s belief insufficiently when presented with new evidence.

13

Naïve realism

The belief that we see reality as it really is – objectively and without bias; that the facts are plain for all to see; that rational people will agree with us; and that those who don’t are either uninformed, lazy, irrational, or biased.

14

Stereotyping

Expecting a member of a group to have certain characteristics without having actual information about that individual.

15

Bandwagon effect

 The tendency to do (or believe) things because many other people do (or believe) the same.  Related to groupthink and herd behavior.

16

Ambiguity effect

 The tendency to avoid options for which the probability of a favorable outcome is unknown.

17

Mere exposure effect

The tendency to express undue liking for things merely because of familiarity with them.

18

Illusory superiority

Overestimating one’s desirable qualities, and underestimating undesirable qualities, relative to other people.  (Also known as “Lake Wobegon effect”, “better-than-average effect”, or “superiority bias”.)

19

Curse of knowledge

 When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.

20

Belief bias

 An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.

21

Choice-supportive bias

 The tendency to remember one’s choices as better than they actually were.

22

Clustering illusion

The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).

23

Congruence bias

 The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.

24

Hindsight bias

Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as having been predictable at the time those events happened.

25

Irrational escalation or Escalation of commitment

The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong.  Also known as the sunk cost fallacy.

26

Moral luck

The tendency for people to ascribe greater or lesser moral standing based on the outcome of an event.

27

Identifiable victim effect

 The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk.

28

Illusory correlation

Inaccurately perceiving a relationship between two unrelated events.

29

IKEA effect

 The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end product.

30

Ostrich effect

Ignoring an obvious (negative) situation.

31

Illusory truth effect

A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity.  These are specific cases of truthiness.

32

Availability cascade

A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).

33

Insensitivity to sample size

The tendency to under-expect variation in small samples.

34

Normalcy bias

The refusal to plan for, or react to, a disaster which has never happened before.

35

Continued influence effect

The tendency to believe previously learned misinformation even after it has been corrected.  Misinformation can still influence inferences one generates after a correction has occurred.  Cf. Backfire effect

36

Gambler's fallacy

The tendency to think that future probabilities are altered by past events, when in reality they are unchanged.  The fallacy arises from an erroneous conceptualization of the law of large numbers.  For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”

37

Declinism

The predisposition to view the past favorably (rosy retrospection) and the future negatively.

38

Default effect

When given a choice between several options, the tendency to favor the default one.

39

Hot-hand fallacy

 The “hot-hand fallacy” (also known as the “hot hand phenomenon” or “hot hand”) is the belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts.

40

Hyperbolic discounting

 Discounting is the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs.  Hyperbolic discounting leads to choices that are inconsistent over time – people make choices today that their future selves would prefer not to have made, despite using the same reasoning.  Also known as current moment bias, present-bias, and related to Dynamic inconsistency.  A good example of this:  a study showed that when making food choices for the coming week, 74% of participants chose fruit, whereas when the food choice was for the current day, 70% chose chocolate.

41

Forer effect or Barnum effect

 The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people.  This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests.

42

Experimenter’s or expectation bias

 The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.

43

Framing effect

 Drawing different conclusions from the same information, depending on how that information is presented.

44

Negativity bias or Negativity effect

Psychological phenomenon by which humans have a greater recall of unpleasant memories compared with positive memories.  (see also actor-observer bias, group attribution error, positivity effect, and negativity effect)

45

Planning fallacy

The tendency to underestimate task-completion times.

46

Compassion fade

The predisposition to behave more compassionately towards a small number of identifiable victims than to a large number of anonymous ones.

 


Ken Larsen's home page