Ken Larsen's web site - Biases


What percentage of humans are unbiased? 

  • 25%?

  • 20%? 

  • 15%? 

  • 10%? 

  • 5%?

  • None

From what I've learned, the correct answer has to be None.

Here is a list of psychology terms which describe 191 different biases that humans are guilty of:

The probability that any of us has none of the 191 biases has got to be vanishingly small. 

The conclusion that I draw from this list is that humanity may someday have to entrust computers with artificial intelligence to govern our world.  The 1963 Twilight Zone episode "The Old Man in the Cave" addressed this issue.  It was prescient.

To explain my point further, below is a subset of the full list ... the ones I've observed happen in my home of Chapel Hill in my role as a political activist.


Law of the instrument

An over-reliance on a familiar tool or method, ignoring or undervaluing alternative approaches.  “If all you have is a hammer, everything looks like a nail.”


Reactive devaluation

Devaluing proposals only because they purportedly originated with an adversary.


System justification

The tendency to defend and bolster the status quo.  Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest.  (See also status quo bias.)


Not invented here

Aversion to using or buying products, research, standards, or knowledge developed outside a group.  Related to the IKEA effect.



Groupthink

 The psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome.  Group members try to minimize conflict and reach a consensus decision without critical evaluation of alternative viewpoints, actively suppressing dissenting viewpoints and isolating themselves from outside influences.


Confirmation bias

 The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.


Courtesy bias

 The tendency to give an opinion that is more socially correct than one’s true opinion, so as to avoid offending anyone.


Bias blind spot

 The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself.


Neglect of probability

The tendency to completely disregard probability when making a decision under uncertainty.


Backfire effect

The reaction to disconfirming evidence by strengthening one’s previous beliefs.  Cf. continued influence effect.


Anchoring or focalism

 The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).


Conservatism (belief revision)

The tendency to revise one’s belief insufficiently when presented with new evidence.


Naïve realism

The belief that we see reality as it really is – objectively and without bias; that the facts are plain for all to see; that rational people will agree with us; and that those who don’t are either uninformed, lazy, irrational, or biased.



Stereotyping

Expecting a member of a group to have certain characteristics without having actual information about that individual.


Bandwagon effect

 The tendency to do (or believe) things because many other people do (or believe) the same.  Related to groupthink and herd behavior.


Ambiguity effect

 The tendency to avoid options for which the probability of a favorable outcome is unknown.


Mere exposure effect

The tendency to express undue liking for things merely because of familiarity with them.


Illusory superiority

Overestimating one’s desirable qualities, and underestimating undesirable qualities, relative to other people.  (Also known as “Lake Wobegon effect”, “better-than-average effect”, or “superiority bias”.)


Curse of knowledge

 When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.


Belief bias

 An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.


Choice-supportive bias

 The tendency to remember one’s choices as better than they actually were.


Clustering illusion

 The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, see phantom patterns).
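To see how routine those phantom patterns are, here is a hypothetical simulation (not part of the original list): even perfectly random coin flips contain streaks long enough to look meaningful.

```python
import random

random.seed(1)

def longest_run(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = cur = 1
    for a, b in zip(flips, flips[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

# Simulate many sessions of 100 fair coin flips and average the
# longest streak seen in each session.
runs = []
for _ in range(10_000):
    flips = [random.random() < 0.5 for _ in range(100)]
    runs.append(longest_run(flips))

avg = sum(runs) / len(runs)
print(f"average longest streak in 100 flips: {avg:.1f}")
```

The average longest streak comes out near seven, which most observers would swear could not be chance.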


Congruence bias

 The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.


Hindsight bias

 Sometimes called the “I knew it all along” effect, the tendency to see past events as being predictable at the time those events happened.


Irrational escalation or Escalation of commitment

The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong.  Also known as the sunk cost fallacy.


Moral luck

The tendency for people to ascribe greater or lesser moral standing based on the outcome of an event.


Identifiable victim effect

 The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk.


Illusory correlation

Inaccurately perceiving a relationship between two unrelated events.


IKEA effect

 The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end product.


Ostrich effect

Ignoring an obvious (negative) situation.


Illusory truth effect

 A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity.  These are specific cases of truthiness.


Availability cascade

 A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).


Insensitivity to sample size

The tendency to under-expect variation in small samples.
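A quick sketch of why this matters, loosely modeled on Kahneman and Tversky's classic hospital example (the numbers and 60% threshold here are illustrative assumptions): extreme proportions show up far more often in small samples than in large ones.

```python
import random

random.seed(0)

def extreme_day_fraction(n_births, days=10_000, threshold=0.60):
    """Fraction of simulated days on which more than `threshold` of
    `n_births` babies are boys, with each birth a fair 50/50 draw."""
    extreme = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(n_births))
        if boys / n_births > threshold:
            extreme += 1
    return extreme / days

small = extreme_day_fraction(15)   # small hospital: 15 births/day
large = extreme_day_fraction(45)   # large hospital: 45 births/day
print(f"small hospital: {small:.1%}, large hospital: {large:.1%}")
```

The small hospital records a lopsided day roughly twice as often as the large one, yet people asked this question tend to expect the two to be about equal.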


Normalcy bias

The refusal to plan for, or react to, a disaster which has never happened before.


Continued influence effect

The tendency to believe previously learned misinformation even after it has been corrected.  Misinformation can still influence inferences one generates after a correction has occurred.  Cf. Backfire effect


Gambler's fallacy

 The tendency to think that future probabilities are altered by past events, when in reality they are unchanged.  The fallacy arises from an erroneous conceptualization of the law of large numbers.  For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming up on the sixth flip is much greater than heads.”
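The coin example can be checked directly.  Here is a hypothetical simulation (an illustration, not part of the original list) that records what actually happens on the flip after five straight heads:

```python
import random

random.seed(42)

# Simulate a long sequence of fair coin flips.  Whenever the previous
# five flips were all heads, record the outcome of the next flip.
next_flips = []
streak = 0
for _ in range(1_000_000):
    heads = random.random() < 0.5
    if streak >= 5:
        next_flips.append(heads)
    streak = streak + 1 if heads else 0

p_heads = sum(next_flips) / len(next_flips)
print(f"P(heads after 5 straight heads) = {p_heads:.3f}")
```

The estimate lands near 0.5: the coin has no memory, and the streak changes nothing.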



Declinism

The predisposition to view the past favorably (rosy retrospection) and the future negatively.


Default effect

When given a choice between several options, the tendency to favor the default one.



Hot-hand fallacy

 The “hot-hand fallacy” (also known as the “hot hand phenomenon” or “hot hand”) is the belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts.


Hyperbolic discounting

 Discounting is the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs.  Hyperbolic discounting leads to choices that are inconsistent over time – people make choices today that their future selves would prefer not to have made, despite using the same reasoning.  Also known as current moment bias, present-bias, and related to Dynamic inconsistency.  A good example of this:  a study showed that when making food choices for the coming week, 74% of participants chose fruit, whereas when the food choice was for the current day, 70% chose chocolate.
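The reversal can be made concrete with a common one-parameter hyperbolic form, value = amount / (1 + k × delay); the dollar amounts and the rate k below are illustrative assumptions, not measured values.

```python
# Hyperbolic discounting: value = amount / (1 + k * delay).
# k = 0.1 per day is an illustrative discount rate.
def hyperbolic_value(amount, delay_days, k=0.1):
    return amount / (1 + k * delay_days)

# Choice today: $50 now vs. $100 in 30 days -- the immediate $50 wins.
now_small = hyperbolic_value(50, 0)      # 50.0
now_large = hyperbolic_value(100, 30)    # 25.0

# Same pair pushed a year out: $50 in 365 days vs. $100 in 395 days.
# The gap between the options is still 30 days, but now the larger,
# later reward wins -- a preference reversal.
later_small = hyperbolic_value(50, 365)
later_large = hyperbolic_value(100, 395)

print(now_small > now_large, later_small < later_large)  # True True
```

The same 30-day wait that feels intolerable today looks trivial from a year away, which is exactly the inconsistency the fruit-versus-chocolate study captures.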


Forer effect or Barnum effect

 The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people.  This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests.


Experimenter’s or expectation bias

 The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.


Framing effect

 Drawing different conclusions from the same information, depending on how that information is presented.



Negativity bias or Negativity effect

The psychological phenomenon by which humans have a greater recall of unpleasant memories than of pleasant ones.  (See also actor-observer bias, group attribution error, positivity effect, and negativity effect.)


Planning fallacy

The tendency to underestimate task-completion times.


Compassion fade

 The predisposition to behave more compassionately toward a small number of identifiable victims than toward a large number of anonymous ones.


Ken Larsen's home page