By Count Friedrich von Olsen
I have a friend who is fond of saying, “I may not be right all the time, but I’m right more than I’m wrong.” Now, there’s an interesting notion. I will not pass judgment on my friend’s track record for accuracy. But I am now given to wondering exactly what my own batting average is in my decision making. No doubt, I’ve gotten some things right over the years. A few times, I think, I might have hit the nail right on the head. But I also know this: I’ve made some calls that could not have been further off the mark if I had deliberately tried to get them wrong. In fact, in some situations, if I had tried to get it wrong, I would have been more right than I was in trying to get it right…
So, what percentage of the time am I right? What percentage of the time am I mostly right? What percentage of the time am I more right than wrong? According to some research my butler, Hudson, did, I am not a good candidate to answer that question. Using some internet search device, he came up with hundreds of ways in which we humans delude ourselves into thinking we have done a straightforward analysis when, in fact, our examination is full of bias. There were so many that I cannot include them all in this column. But below are some of the more interesting ones, along with the ones I recognized myself as guilty of. These are behavioral biases, social biases, and what are called memory error biases. I pass them along, unvarnished, in the form Hudson provided them to me…
Anchoring or focalism: The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions…
Attentional bias: The tendency of our perception to be affected by our recurring thoughts…
Automation bias: The tendency to excessively depend on automated systems which can lead to erroneous automated information overriding correct decisions…
Availability cascade: A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse…
Backfire effect: When people react to disconfirming evidence by strengthening their beliefs…
Bandwagon effect: The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior…
Belief bias: An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion…
Bias blind spot: The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself…
Choice-supportive bias: The tendency to remember one’s choices as better than they actually were and, in a self-justifying manner, retroactively ascribing one’s choices to be more informed than they were when they were made…
Confirmation bias: The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.
Congruence bias: The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses…
Curse of knowledge: When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.
Denomination effect: The tendency to spend more money when it is denominated in small amounts (e.g. coins) rather than large amounts (e.g. bills)…
Disposition effect: The tendency to sell an asset that has accumulated in value and resist selling an asset that has declined in value…
Dunning–Kruger effect: The tendency for unskilled individuals to overestimate their ability and the tendency for experts to underestimate their ability…
Egocentric bias: Occurs when people claim more responsibility for the results of a joint action than an outside observer would credit them with…
Endowment effect: The tendency for people to demand much more to give up an object than they would be willing to pay to acquire it.
Experimenter’s or expectation bias: The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations…
Focusing effect: The tendency to place too much importance on one aspect of an event.
False consensus effect: The tendency for people to overestimate the degree to which others agree with them.
Forer effect or Barnum effect: The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests…
Framing effect: Drawing different conclusions from the same information, depending on how that information is presented…
Functional fixedness: The tendency to use an object only in the way it is traditionally used…
Hindsight bias: Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable at the time those events happened…
Hot-hand fallacy: The fallacious belief that a person who has experienced success has a greater chance of further success in additional attempts…
Identifiable victim effect: The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk…
IKEA effect: The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result…
Information bias: The tendency to seek information even when it cannot affect action…
Illusion of asymmetric insight: People perceive their knowledge of their peers to surpass their peers’ knowledge of them…
Ingroup bias: The tendency for people to give preferential treatment to others they perceive to be members of their own groups…
Insensitivity to sample size: The tendency to underestimate variation in small samples…
Irrational escalation: The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong…
Just-world hypothesis: The tendency for people to want to believe that the world is fundamentally just, causing them to rationalize an otherwise inexplicable injustice as deserved by the victim(s)…
Loss aversion: “The disutility of giving up an object is greater than the utility associated with acquiring it…”
Mere exposure effect: The tendency to express undue liking for things merely because of familiarity with them…
Money illusion: The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power…
Moral luck: The tendency for people to ascribe greater or lesser moral standing based on the outcome of an event…
Naïve realism: The belief that we see reality as it really is – objectively and without bias; that the facts are plain for all to see; that rational people will agree with us; and that those who don’t are either uninformed, lazy, irrational, or biased…
Negativity effect: The tendency of people, when evaluating the causes of the behaviors of a person they dislike, to attribute their positive behaviors to the environment and their negative behaviors to the person’s inherent nature…
Negativity bias: Psychological phenomenon by which humans have a greater recall of unpleasant memories compared with pleasant ones…
Normalcy bias: The refusal to plan for, or react to, a disaster which has never happened before…
Not invented here: Aversion to contact with or use of products, research, standards, or knowledge developed outside a group…
Observer-expectancy effect: When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it…
Omission bias: The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions or inactions…
Pareidolia: A vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing non-existent hidden messages on records played in reverse.
Planning fallacy: The tendency to underestimate task-completion times…
Post-purchase rationalization: The tendency to persuade oneself through rational argument that a purchase was a good value…
Pro-innovation bias: The tendency to have an excessive optimism towards an invention or innovation’s usefulness throughout society, while often failing to identify its limitations and weaknesses…
Reactance: The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
Reactive devaluation: Devaluing proposals only because they purportedly originated with an adversary.
Self-serving bias: The tendency to claim more responsibility for successes than failures…
Semmelweis reflex: The tendency to reject new evidence that contradicts a paradigm…
Social desirability bias: The tendency to over-report socially desirable characteristics or behaviors in oneself and under-report socially undesirable characteristics or behaviors…
Shared information bias: The tendency for group members to spend more time and energy discussing information that all members are already familiar with, and less time and energy discussing information that only some members are aware of…
Subjective validation: The perception that something is true if a subject’s belief demands it to be true; it also leads people to perceive connections between coincidental events…
System justification: The tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)
Weber–Fechner law: Difficulty in comparing small differences in large quantities…
Well travelled road effect: Underestimation of the duration taken to traverse oft-traveled routes and overestimation of the duration taken to traverse less familiar routes.
Zero-sum heuristic: Intuitively judging a situation to be zero-sum, i.e., that gains and losses are correlated.