When we think of bias, we normally associate it with topics like racial prejudice or media organisations slanting their coverage to favour a point of view. Yet we all have it.

Cognitive bias is a collection of faulty ways of thinking that is apparently hardwired into the human brain.

As our unconscious biases are ever-present, the question is: can we overcome them?

Wikipedia lists 100+ different types; some of these include

- confirmation bias (the tendency to focus on evidence that confirms what we already think or suspect, ie to embrace facts and ideas that support our position and to discount or ignore any evidence that seems to support an alternative, eg party politics, where each side is unable to allow that the other is right in anything). Some examples

i) a 2005 report to President Bush on the intelligence leading into the Iraq war

"...When confronted with evidence that indicated Iraq did not have weapons of mass destruction, analyst tended to discount such information. Rather than weighing the evidence independently, analysts accepted information that fit the prevailing theory and rejected information that contradicted it..."

The 2005 report to the President as quoted by Ben Yagoda, 2018

Similar was UK Prime Minister Tony Blair's government's position on the same Iraq war.
"...they highlighted information and placed the threat posed by Saddam in the worst possible light. They kept quiet about the relevant facts that damaged their position. They misinterpreted the actions and motives of those opposed to the war. And there was never any attempt to correct misleading statements after they had been utter..."
Tom Switzer, 2016

Even the most intelligent people can suffer confirmation bias. For example, Albert Einstein claimed that there were no black holes and that the universe was not expanding. Both of these claims were later found to be wrong. His cognitive bias, based on his earlier work, had blinded him to the possibility of black holes and of the universe expanding.

- illusory superiority

This notion suggests that people with low ability at a task usually overestimate their ability to handle that task; they are unable to recognise their lack of ability because they cannot objectively evaluate their own competence. This stems from ignorance of what the task requires. However, training can give people a more accurate perception of how good they are at specific tasks.

"...people with substantial, measurable deficits in their knowledge or expertise lack the ability to recognise these deficits......despite potentially making error after error, they tend to think they are performing competently when they are not..."
David Dunning and Justin Kruger as quoted in Wikipedia, 2021b

Similarly, highly competent people tend to underestimate their own ability, as they incorrectly assume that tasks that are easy for them to perform are also easy for other people.

- faulty heuristic bias (the shortcuts and rules of thumb by which we make judgements and predictions)

- availability heuristic bias (relying on the immediate examples that come to mind when evaluating a specific topic, concept, method or decision. In this bias, we attribute value and importance to the most recent information available, irrespective of its worth; it operates on the notion that if something can be recalled, it must be important. People tend to weight their judgements toward more recent information, ie forming opinions based on the latest news. Examples include people being surprised that suicides outnumber homicides and that drownings outnumber deaths by fire, and people assuming that crime is increasing, even when it's not)

- representativeness heuristic bias (our strong desire to apply stereotypes)

- attribution error bias (when assessing someone's behaviour, we put too much focus on his or her personal attributes and too little on external factors, many of which can be measured by statistics)

- endowment effect bias (which leads us to place an irrationally high value on our own possessions. This contradicts classic economic theory, which states that at any given time among a certain population, an item has a market value that doesn't depend on whether one owns it)

- sunk-cost bias (we stick with a bad decision, like an investment, owing to the money, time, resources, etc already invested, ie we want to recover what we have put into it). Some examples

i) continuing unwinnable wars, like the USA in Iraq and Afghanistan, because of the lives, resources, etc already invested. US President Donald Trump stated

"...our nation must seek an honourable and enduring outcome worthy of the tremendous sacrifices that have been made, especially the sacrifices of lives..."

Donald Trump as quoted by Ben Yagoda, 2018

ii) finishing an unappetising restaurant meal because you have paid for it

- hyperbolic discounting bias (when considering a trade-off between two future moments, a greater weight is given to the one closer to the present)
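The weighting described above is often modelled with a hyperbolic discount function, V = A / (1 + kD). The sketch below is illustrative only: the impatience parameter k and the delays are assumptions of mine, not from the source.

```python
# Hyperbolic discounting: the perceived value of a reward A after delay D
# is often modelled as V = A / (1 + k * D). The "impatience" parameter k
# below is purely illustrative.

def hyperbolic_value(amount, delay_days, k=0.05):
    """Perceived present value of `amount` received after `delay_days` days."""
    return amount / (1 + k * delay_days)

# Note how steeply value drops near the present and how the curve flattens:
for delay in (0, 7, 30, 365):
    print(delay, round(hyperbolic_value(100, delay), 2))
```

Because the curve falls off fastest near the present, a trade-off between two future moments always favours the one closer to now.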

- present bias (the tendency to overweight immediate payoffs relative to future ones). Some examples

i) people under-save for retirement

NB  "...saving is like a choice between spending money today or giving it to a stranger years later..."

Hal Hershfield as quoted by Ben Yagoda, 2018

ii) people prefer less money now rather than more at a later date, eg taking $150 now rather than $180 a month later, even though the latter is a 20% return in a single month
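The arithmetic behind the $150-now versus $180-in-a-month example can be checked directly (my calculation, not from the source):

```python
# Implied return from waiting one month for the larger amount.
now, later = 150.0, 180.0

monthly_return = (later - now) / now         # 20% in a single month
annualized = (1 + monthly_return) ** 12 - 1  # if compounded for a year

print(f"{monthly_return:.0%} per month, {annualized:.0%} if compounded annually")
```

Compounded for a year, that 20% monthly return would be well over 700%, which makes preferring the $150 even harder to justify rationally.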

- actor-observer bias (the tendency for explanations of other individuals' behaviour to overemphasise the influence of their personality and underemphasise the influence of the situation, while for explanations of our own behaviour the opposite is the case)

- Zeigarnik bias (uncompleted or interrupted tasks are remembered better than completed ones)

- IKEA bias (the tendency of people to place a disproportionately high value on objects that they have partially assembled themselves)

- gamblers' fallacy (the mistaken belief that if a tossed coin lands heads several times in a row, it is more likely to land tails on the next toss; in fact the odds are still 50-50 every time it is tossed)
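A quick simulation (illustrative, not from the source) shows that a fair coin has no memory: even immediately after three heads in a row, heads still comes up about half the time.

```python
import random

random.seed(42)  # reproducible illustration
tosses = [random.random() < 0.5 for _ in range(200_000)]  # True = heads

# Collect the toss immediately following every run of three heads.
after_streak = [tosses[i] for i in range(3, len(tosses))
                if tosses[i - 3] and tosses[i - 2] and tosses[i - 1]]

rate = sum(after_streak) / len(after_streak)
print(round(rate, 3))  # close to 0.5 - tails is never "due"
```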

- optimism bias (consistently underestimating the costs and duration of projects we undertake)

- availability bias (as images of plane crashes are more vivid and dramatic in our memory and imagination, travelling by plane is regarded as more dangerous than travelling by car)

- anchoring bias (the tendency to rely too much on the first piece of information offered, especially if it is presented as a numeric estimate or prediction. This is the reason negotiators start with a number that is extremely low or extremely high: they know the number will anchor the subsequent dealings)

- base-rate neglect bias (people's disinclination to believe statistics and other general evidence, basing their judgement instead on individual examples and vivid anecdotes)

- blindspot bias (the feeling that one is less biased than the average person)

- projection bias (the assumption that everyone else's thinking is the same as yours)

- conjunction fallacy bias (judging multiple specific conditions to be more probable than a single general one)

- ambiguity bias (a tendency to avoid options for which missing information makes probability seem unknown)

- normalcy bias (the refusal to plan for, or react to, a disaster which has never happened before). Another way of looking at this bias is that people tend to normalise unusual situations and continue their daily routines irrespective of the perceived dangers, ie
"...How people respond to an imminent crisis is how little they change their behaviours..."
Nick Petrie, 2020a

An example of this is people not wearing face masks and/or practising social/physical distancing during the Covid-19 pandemic, despite the known benefits of these activities in reducing the chance of catching a virus that has the potential to kill.

- algorithm bias (the people who develop these mathematical formulae have their own personal biases that can be reflected, consciously or unconsciously, in the algorithms)

- status quo bias (this is linked with self-preservation and complacency, ie being happy with the way things are despite the need to change)

- nepotism (favouritism)

Nepotism is another form of cognitive bias.
"...every human being is hardwired by evolution to privilege his or her children. Everyone does it. We are not really designed by nature for meritocracy, we are designed by evolution to be nepotists. True meritocrats......won't lift up the phone to try and rig the system for their kids and they are letting their kids down. Everybody else is picking up the phone..."
Niall Ferguson as quoted by Kevin Chinnery, 2016

- algorithmic bias. Algorithms are being used beyond conventional machine learning and statistical techniques (ordinary least squares, logistic regression, decision trees, etc) in real-world applications including medical diagnosis, judicial sentencing, professional recruiting, resource allocation, etc. There is some concern with this trend, as
"...algorithms are often opaque, biased and unaccountable tools being wielded in the interests of institutional power..."
Alex Miller, 2018

Research has shown that algorithms are not purely objective. However, we need to check how well algorithms compare with the status quo, ie humans. The evidence is
"...Algorithms are less biased and more accurate than humans they are replacing..."
Alex Miller, 2018

Some evidence

- in 2002, some economists studied the impact of automated underwriting algorithms in the mortgage lending industry. Their main finding was that the algorithms predicted default more accurately than manual underwriters did.
"...rather than marginalising traditionally underserved homebuyers, the algorithmic system actually benefited this segment of consumers the most..."
Alex Miller, 2018

- in job screening, the algorithm favoured non-traditional candidates more than the human screeners did, ie
"...Compared with humans, the algorithm exhibited less bias against candidates that were underrepresented at the firm (such as those without personal referrals or degrees from prestigious universities)..."
Alex Miller, 2018

- in New York City pre-trial bail hearings, algorithms have the potential to achieve significantly more equitable decisions than the judges who currently make bail decisions

- algorithms can identify children that are in danger more accurately than humans, ie
"...rather than exacerbating the pernicious racial biases with some government services..."
Alex Miller, 2018

- when choosing the best board members for publicly-traded companies, algorithms chose directors who gave superior performance to those chosen by humans. The latter tended to choose male directors with large networks, previous board experience and a financial background.

The above examples highlight
"...Credit applications, job screenings, criminal justice, public resource allocations and corporate governance - algorithms can reduce bias......research into judgement and decision making has demonstrated time and time again that humans are remarkably poor judges of quality in a wide range of contexts..."
Alex Miller, 2018

Replacing humans with algorithms can both increase efficiency and reduce institutional biases. Furthermore,
"...Algorithms deliver more-efficient and more-equitable outcomes..."
Alex Miller, 2018

Human beings are inconsistent, biased and phenomenally poor decision-makers when compared with algorithms.

Using technology can help solve some of the social ills of institutional bias and prejudicial discrimination. At the same time, we need to continually evaluate the ethical and social challenges of machine learning so that algorithms will have a positive impact on society.

- negativity bias

"...the human brain is designed to favour negative information and memories..."
Mary Widdicks 2019

The evolutionary basis for this is to make us more cautious in the future. It is called negativity bias.

"...research has found that human......recall negative emotions with a magnitude about five times that of positive emotions..."

Mary Widdicks 2019

This means that to balance every negative experience we need five positive ones!

There is some evidence to suggest that negative and positive emotions are managed by different parts of the brain. This may encourage people to over-analyse negative experiences.

Creating positivity means overcoming negativity to boost positive feelings and increase overall happiness.

The aim is to trick the brain into forming stronger associations with good feelings so that they are able to compete with negative experiences. It is about forming new habits.

"...any type of behavioural change takes time and effort because we are essentially creating new pathways in the brain. The older or set in negative ways we are, the more difficult and uncomfortable it can be to make positive changes. But the human brain is easy to fool. Even If you simply go through the motions of celebrating, faking a smile or forcing a laugh, your brain will respond by releasing the same chemicals and producing the same beneficial effects over time as if those feelings were genuine. In other words, fake it until you make it..."

Mary Widdicks 2019

Some ways to increase positivity

- celebrate everything (including small accomplishments)

- be a positive role model (people learn through observing others' behaviour)

- have a "success wall" to record good things

- keep a diary of good things

- allocate part of each day to remember the good things in our lives

Status quo bias

This is linked with self-preservation and complacency, ie being happy with the way things are despite the need to change.

"...because the biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn't dealt with the problematic thoughts, judgements, or predictions themselves. Instead, it has been devoted to changing behaviour, in the form of incentives or nudges. For example, while present bias has so far proved intractable, employers have been able to nudge employees into contributing to retirement plans by making savings the default option: you had to actively take steps in order to not participate..."

Ben Yagoda 2018

This means that laziness or inertia can be more powerful than biases. Thus procedures should be organised so that they dissuade or prevent people from acting on biases. We need to institute policies that include the monitoring of individual decisions and predictions, like checklists and pre-mortems (an attempt to counter optimism bias by requiring team members to imagine that a project has gone wrong and write a couple of sentences briefly describing what happened; conducting this exercise can help people think ahead, like scenario planning). Other possible ways include

- cue reasoning (make the brain follow rules). On the other hand, it is very hard to change intuition, and in the heat of the moment rules are not followed.

- using other people, who can see our blind spots more readily than we can

- training, eg around statistics, to improve reasoning and decision-making so that all understand the law of large numbers (which implies that outlier results are more frequent when the sample size is small), regression to the mean, etc

"...Teaching people how to reason in two or three domains is sufficient to improve people's reasoning for an indefinitely large number of events..."

Richard Nisbett as quoted by Ben Yagoda 2018

If you are improving with testing/training, there is a good chance that you will do better in the real world.
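The small-sample point above, that outlier results are more frequent when the sample size is small, can be demonstrated with a quick simulation (the group sizes and the "extreme" threshold below are illustrative choices of mine, not from the source):

```python
import random

random.seed(0)  # reproducible illustration

def extreme_rate(group_size, trials=20_000):
    """Fraction of groups of fair coin tosses with <=20% or >=80% heads."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(group_size))
        share = heads / group_size
        if share <= 0.2 or share >= 0.8:
            extreme += 1
    return extreme / trials

# Small groups produce "extreme" results far more often than large ones.
print(extreme_rate(5), extreme_rate(50))
```

This is why, for example, the smallest schools or hospitals often top (and bottom) performance league tables: small samples swing further from the average.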

Research into why some people are better at predicting the future has highlighted a concept called the outside view, ie

"...the inside view is a product of fundamental attribution error, base-rate neglect and other biases. They are constantly cajoling us into testing and judgements and predictions based on good and vivid stories instead of on data and statistics..."

Philip Tetlock as quoted by Ben Yagoda 2018

For example, when at a wedding you notice the mutual devotion shown by the bride and groom, you would most likely predict a happy marriage (the inside view). Yet around 40% of marriages end in divorce (the outside view)!

Believing we know other people better than they know us

"...The conviction that we know others better than they know us - and that we may have insights about they may lack (but not vice versa) - leads us to talk when we would do better to listen and to be less patient then we ought to be when others express the conviction that they are the ones who are being misunderstood or judged unfairly. The same convictions can make us reluctant to take advice from others who cannot know our private thoughts, feelings, interpretations of events, or motives, but all too willing to give advice to others based on our views of their past behaviour, without adequate attention to their thoughts, feelings, interpretations and motives. Indeed, the biases documented......may create a barrier to the type of exchanges of information, and especially to the type of careful and respectful listening, that can go a long way to accentuating the feeling of frustration and resentment..."
Emily Pronin as quoted by Malcolm Gladwell 2019

"...the first set of mistakes we make with strangers - default to the truth and the illusion of transparency - has to do with our inability to make sense of a stranger as an individual......we add another......we do not understand the importance of context in which the stranger is operating..."
Malcolm Gladwell 2019

People are using video games, like Sirius, to help identify bias in thinking. Sirius was developed for the US government agency IARPA (Intelligence Advanced Research Projects Activity) after the debacle of the Iraq war, ie the catastrophic intelligence blunder around false claims that Iraq had weapons of mass destruction, which led to the invasion of Iraq. It focuses on 6 biases (confirmation, fundamental attribution, blind spot, anchoring, representativeness heuristic and projection). It aims to reduce bias in situations such as:

- looking at 2 candidates for a position. One candidate has good references and experiences but is very nervous in the interview, while the other candidate is a better communicator, ie talks about topics that interest you but his performance record is mediocre. Will you overcome the fundamental attribution error and hire the first candidate?

- you are looking at employing someone you despise for reasons of temperament, behaviour and ideology. On the other hand, his performance is excellent. Will you dislodge your powerful confirmation bias and employ that person?

Cognitive bias can also be linked with the use of artificial intelligence (AI). For example, Google is perceived as a neutral search engine, ie the company marshals computers and maths to objectively sift truth from trash. On the other hand, Google's activities are determined by humans who have opinions and blind spots, and who have to work within an organisational structure with its corporate goals (financial, political, etc). As people use artificial intelligence tools that learn from real-world data, there is the potential to amplify the many biases found in society, even unbeknown to their creators. With Google there is concern

"...about one monopolistic platform controlling the information landscape..."

Safiya Noble as quoted by Farhad Manjoo, 2018

"...in the United States, 8 of 10 web searches are conducted through Google. Google also owns other major communication platforms, among them YouTube and Gmail, and it makes the Android operating system and its app store. It is the world's dominant Internet advertising company, and through their business, it also shapes the market for digital news......the important question is how it manages that power, and what checks we have on it......Google's influence on public discourse happens through priming algorithms, chief among them the search term and which results you see in its search engine. These algorithms are secret......because explaining the precise way the algorithms work would leave them open to being manipulated......this initial secrecy creates a troubling opacity. Because search engines take into account the time, place and some personalised factors when you search, the results you get today will not necessarily match the results I get tomorrow. This makes it difficult for outsiders to investigate bias in Google's results..."

Farhad Manjoo, 2018

More on cognitive bias:

. Confirmation bias, ie seeking confirming evidence, is linked with associative memory. For example, people prefer to seek data, information, etc that are compatible with their beliefs rather than those that would challenge them.

. Halo bias, ie the tendency to like, or dislike, everything about a person or thing (including what has not necessarily been observed). This bias may result from a reaction to a person's voice or appearance. It relies heavily on first impressions, which can be determined by chance. The sequence matters, as the first impression dominates subsequent impressions. The gaps in your knowledge about the person or situation are filled in by guesses that fit your emotional response to it. At the same time, as you become more familiar with the person or situation, evidence accumulates and attaches to your first impression. So sometimes a halo effect suppresses ambiguity and exacerbates the initial mis-impression.


. WYSIATI (What You See Is All There Is) - how the mind uses only the information that is currently available. An essential design feature of the associative machine is that it represents only activated ideas; information that is not retrieved, even unconsciously, from memory might as well not exist. The brain's routine focus excels at constructing the best possible story that incorporates only the ideas currently activated, irrespective of their accuracy or completeness. The brain's routine focus

"...is radically insensitive to both the quality and quantity of the information that gives rise to impressions..."

Daniel Kahneman 2012

. Research has shown (Daniel Kahneman 2012) that when people are given only one side of the evidence, they are more confident in their judgements than those who hear both sides; they are convinced by the coherence of the story that they have managed to construct from the available information, ie

"...It is the consistency of the information that matters for a good story, not its completeness...... You will often find that knowing little makes it easier to fit everything you know into a coherent pattern..."

Daniel Kahneman 2012

"...WYSIATI facilitates the achievement of coherence and of cognitive ease that causes you to accept a statement as true. It explains why we can think fast and how we are able to make sense of partial information in a complex world. Much of the time, the coherent story put together is close enough to reality to support reason or help explain a long and diverse list of biases of judgement and choice..."

Daniel Kahneman 2012

. Some of these WYSIATI habits include

- overconfidence (neither the quantity nor the quality of the evidence counts much in subjective confidence; the quality of the story is actually more important than the facts; consequently, critical evidence, doubts and ambiguity are often ignored)

- framing (different ways of presenting the same information often evoke different emotions, eg survival rate versus mortality rate from surgery. Survival rate puts a positive spin; while mortality puts a negative spin)

- base-rate neglect (stereotyping, such as making personality generalisations, eg all doctors have the same personality, etc)

A cognitive bias test, ie which line is shorter?

[figure: two lines, A and B, of equal length]

Most people will choose "A", but both lines are the same length!

There is a neurological basis for this illusion, ie we like to make quick decisions.

Yet most of us assume that people will change their minds when presented with dispassionate debate, reasoned and rational argument, research-based evidence, etc. It is assumed that this will balance out emotional attachments, beliefs, biases, etc.

"...everywhere we look we see the gospel that reasoned argument is the currency of persuasion. And that the right way to change our mind is by entering a sort of gladiatorial contest of ideas where we leave the personal behind..."

Eleanor Gordon-Smith as quoted by Elouise Fowler 2019

Yet this is not the case: most people are suspicious of so-called rational debate. It comes down to who you believe, not what you believe!!

An example of a strong cognitive bias that makes rational discussion on some topics very hard:

"...over 50% of Americans believe that God created the universe less than 10,000 years ago..."

Zia Christi as quoted by Tony Boyd 2018b

