Bias

When we think of bias, we normally associate it with topics like racial prejudice or media organisations slanting their coverage to favour a point of view. Yet we all have biases of our own.

Cognitive bias is a collection of faulty ways of thinking that is apparently hardwired into the human brain.


As our unconscious biases are all around us, the question is: can we overcome them?


Wikipedia lists 100+ different types; some of these include:

- confirmation bias (focusing on evidence that confirms what we already think or suspect: we seize on facts and ideas that support our position and discount or ignore any evidence that seems to support an alternative, eg party politics, where each side is unable to concede that the other is right about anything). Some examples:

i) the 2005 report to President Bush on the intelligence leading into the Iraq war

"...When confronted with evidence that indicated Iraq did not have weapons of mass destruction, analysts tended to discount such information. Rather than weighing the evidence independently, analysts accepted information that fit the prevailing theory and rejected information that contradicted it..."


The 2005 report to the President as quoted by Ben Yagoda, 2018


- faulty heuristic bias (the shortcuts and rules of thumb by which we make judgements and predictions)


- availability heuristic bias (relying on the immediate examples that come to mind when evaluating a specific topic, concept, method or decision. It operates on the notion that if something can be recalled, it must be important, so people weight their judgements toward more recent information, ie forming opinions based on the latest news. Some examples: people are surprised that suicides outnumber homicides and that drownings outnumber deaths by fire; people think crime is increasing, even when it is not)


- representativeness heuristic bias (our strong desire to apply stereotypes)

- attribution error bias (when assessing someone's behaviour, we put too much focus on his or her personal attributes and too little on external factors, many of which can be measured by statistics)


- endowment effect bias (which leads us to place an irrationally high value on our own possessions. This contradicts classic economic theory, which states that, at any given time among a given population, an item has a market value that does not depend on whether one owns it)


- sunk-cost bias (we stick with a bad decision, like an investment, owing to the money, time, resources, etc already invested, ie we want to recover what we have put into it). Some examples:

i) continuing unwinnable wars, like the USA in Iraq and Afghanistan, because of the investment of lives, resources, etc. US President Donald Trump stated

"...our nation must seek an honourable and enduring outcome worthy of the tremendous sacrifices that have been made, especially the sacrifices of lives..."


Donald Trump as quoted by Ben Yagoda, 2018

ii) finishing an unappetising restaurant meal because you have paid for it


- hyperbolic discounting bias (when considering a trade-off between two future moments, a greater weight is given to the one closer to the present)

- present bias (favouring a payoff now over a larger one later). Some examples:

i) people under-save for retirement

NB  "...saving is like a choice between spending money today or giving it to a stranger years later..."

Hal Hershfield as quoted by Ben Yagoda, 2018


ii) people prefer less money now rather than more at a later date, eg taking $150 now rather than $180 a month later, even though the latter is a 20% return in a single month

- actor-observer bias (the tendency, when explaining other people's behaviour, to overemphasise the influence of their personality and underemphasise the influence of the situation, while for explanations of our own behaviour the opposite is the case)

- Zeigarnik bias (uncompleted or interrupted tasks are remembered better than completed ones)

- IKEA bias (the tendency of people to place a disproportionately high value on objects that they have partially assembled themselves)

- gamblers' fallacy (the belief that if a tossed coin lands heads several times in a row, it is more likely to land tails on the next toss; in fact the odds are 50-50 on every toss)

- optimism bias (consistently underestimating the costs and duration of projects we undertake)

- availability bias (as images of plane crashes are more vivid and dramatic in our memory and imagination, travelling by plane is regarded as more dangerous than travelling by car)

- anchoring bias (the tendency to rely too much on the first piece of information offered, especially if it is presented as a numeric estimate or prediction. This is why negotiators start with a number that is extremely low or extremely high: they know the number will anchor the subsequent dealings)

- base-rate neglect bias (people's disinclination to believe statistics and other general evidence, basing their judgement instead on individual examples and vivid anecdotes)

- blindspot bias (the feeling that one is less biased than the average person)

- projection bias (the assumption that everyone else's thinking is the same as yours)

- conjunction fallacy bias (the assumption that multiple specific conditions are more probable than a single general one)

- ambiguity bias (a tendency to avoid options for which missing information makes probability seem unknown)

- normalcy bias (the refusal to plan for, or react to, a disaster which has never happened before)

- algorithm bias (the people who develop these mathematical formulae have their own personal biases that can be reflected, consciously or unconsciously, in the algorithms)
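Two of the numeric claims in this list can be checked directly: the present-bias trade-off ($150 now versus $180 a month later) and the gamblers' fallacy. A minimal Python sketch (the simulation size is illustrative):

```python
import random

# Present bias: taking $150 now forgoes a 20% one-month return on the $180.
now, later = 150, 180
one_month_return = (later - now) / now
print(f"Return for waiting a month: {one_month_return:.0%}")  # 20%

# Gamblers' fallacy: after three heads in a row, the next toss is still 50-50.
random.seed(1)
flips = [random.choice("HT") for _ in range(100_000)]
after_streak = [flips[i + 3] for i in range(len(flips) - 3)
                if flips[i:i + 3] == ["H", "H", "H"]]
p_heads = after_streak.count("H") / len(after_streak)
print(f"P(heads | three heads in a row) is about {p_heads:.2f}")
```

The simulated probability stays at roughly 0.5 no matter how long the preceding streak is, which is exactly what the fallacy denies.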

"...because the biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn't dealt with the problematic thoughts, judgements, or predictions themselves. Instead, it has been devoted to changing behaviour, in the form of incentives or nudges. For example, while present bias has so far proved intractable, employers have been able to nudge employees into contributing to retirement plans by making savings the default option: you had to actively take steps in order to not participate..."


Ben Yagoda, 2018

This means that laziness or inertia can be more powerful than biases. Thus procedures should be organised so that they dissuade or prevent people from acting on their biases. We need to institute policies that include the monitoring of individual decisions and predictions, like checklists and pre-mortems (an attempt to counter optimism bias by requiring team members to imagine that a project has gone wrong and to write a couple of sentences briefly describing what happened; conducting this exercise can help people think ahead, much like scenario planning). Other possible ways include:

- cue reasoning (making the brain follow rules). On the other hand, it is very hard to change intuition, and in the heat of the moment rules are not followed.

- using other people, who can see our blind spots more readily than we can

- training, eg in statistics, to improve reasoning and decision-making so that everyone understands the law of large numbers (which states that outlier results are more frequent when the sample size is small), regression to the mean, etc

"...Teaching people how to reason statistically......in two or three domains are sufficient to improve people's reasoning for an indefinitely large number of events..."


Richard Nisbett as quoted by Ben Yagoda, 2018

If you are improving with testing/training, there is a good chance that you will do better in the real world.
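The law of large numbers mentioned above, ie that outlier results are more frequent when the sample size is small, can be seen in a quick simulation (a sketch; the 70% threshold and sample sizes are arbitrary choices):

```python
import random

random.seed(42)

def extreme_fraction(sample_size, trials=2000):
    """Fraction of fair-coin samples in which at least 70% of flips land heads."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= 0.70:
            extreme += 1
    return extreme / trials

# Outlier results (>= 70% heads) are common in small samples...
print(extreme_fraction(10))    # roughly 0.17
# ...and all but vanish in large ones.
print(extreme_fraction(1000))  # roughly 0.0
```

A sample of 10 flips produces a "striking" result about one time in six; a sample of 1000 essentially never does, which is why small-sample outliers should not be treated as meaningful.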

Research into why some people are better at predicting the future has highlighted a concept called the outside view, ie

"...the inside view is a product of fundamental attribution error, base-rate neglect and other biases. They are constantly cajoling us into making judgements and predictions based on good and vivid stories instead of on data and statistics..."


Philip Tetlock as quoted by Ben Yagoda, 2018

For example, when at a wedding you notice the mutual devotion shown by the bride and groom, you would most likely predict a happy marriage (the inside view). Yet you would be shocked to learn that around 40% of marriages end in divorce (the outside view)!

People are using video games, like Sirius, to help identify bias in thinking. Sirius was developed for the US government agency IARPA (Intelligence Advanced Research Projects Activity) after the debacle of the Iraq war, ie the catastrophic intelligence blunder around false claims that Iraq had weapons of mass destruction, which led to the invasion of Iraq. It focuses on 6 biases (confirmation, fundamental attribution, blind spot, anchoring, representativeness heuristic and projection). It aims to reduce bias in situations such as:

- looking at 2 candidates for a position. One candidate has good references and experience but is very nervous in the interview, while the other candidate is a better communicator, ie talks about topics that interest you, but his performance record is mediocre. Will you overcome the fundamental attribution error and hire the first candidate?

- you are looking at employing someone you despise for reasons of temperament, behaviour and ideology. On the other hand, his performance is excellent. Will you dislodge your powerful confirmation bias and employ that person?

Cognitive bias can also be linked with the use of artificial intelligence (AI). For example, Google is perceived as a neutral search engine, ie the company marshals computers and maths to objectively sift truth from trash. On the other hand, Google's activities are determined by humans who have opinions and blind spots, and who have to work within an organisational structure with its corporate goals (financial, political, etc). As people use artificial intelligence tools that learn from real-world data, there is the potential to amplify the many biases found in society, even unbeknownst to their creators. With Google there is concern

"...about one monopolistic platform controlling the information landscape..."


Safiya Noble as quoted by Farhad Manjoo, 2018

"...in the United States, 8 of 10 web searches are conducted through Google. Google also owns other major communication platforms, among them YouTube and Gmail, and it makes the Android operating system and its app store. It is the world's dominant Internet advertising company, and through that business it also shapes the market for digital news...... the important question is how it manages that power, and what checks we have on it......Google's influence on public discourse happens primarily through algorithms, chief among them the search algorithm that determines which results you see in its search engine. These algorithms are secret......because explaining the precise way the algorithms work would leave them open to being manipulated......but this secrecy creates a troubling opacity. Because search engines take into account the time, place and some personalised factors when you search, the results you get today will not necessarily match the results I get tomorrow. This makes it difficult for outsiders to investigate bias in Google's results..."


Farhad Manjoo, 2018

More on cognitive bias:

. Confirmation bias, ie seeking confirming evidence, is linked with associative memory. For example, people prefer to seek data, information, etc that are compatible with their beliefs rather than data that would challenge them.

. Halo bias, ie the tendency to like, or dislike, everything about a person or thing (including what has not necessarily been observed). This bias may result from a reaction to a person's voice or appearance. It relies heavily on first impressions, which can be determined by chance. The sequence matters, as the first impression dominates subsequent ones. The gaps in your knowledge about a person or situation are filled in by guesses that fit your emotional response to them. At the same time, as you become more familiar with the person or situation, the evidence that accumulates attaches itself to your first impression. So a halo effect can suppress ambiguity and exacerbate the initial mis-impression.

WYSIATI

. WYSIATI (What You See Is All There Is) describes how the mind uses only the information that is currently available. An essential design feature of the associative machine is that it represents only activated ideas; thus information that is not retrieved, even unconsciously, from memory might as well not exist. The brain's routine focus excels at constructing the best possible story from only the currently activated ideas, irrespective of their accuracy or completeness. The brain's routine focus

"...is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions..."

Daniel Kahneman 2012

. Research has shown (Daniel Kahneman, 2012) that when people are given only one side of the evidence, they are more confident in their judgements than those who hear both sides: they are convinced by the coherence of the story that they have managed to construct from the available information, ie

"...It is the consistency of the information that matters for a good story, not its completeness...... You will often find that knowing little makes it easier to fit everything you know into a coherent pattern..."

Daniel Kahneman 2012

"...WYSIATI facilitates the achievement of coherence and of cognitive ease that causes you to accept a statement as true. It explains why we can think fast and how we are able to make sense of partial information in a complex world. Much of the time, the coherent story put together is close enough to reality to support reason or action......to help explain a long and diverse list of biases of judgement and choice..."

Daniel Kahneman 2012

. Some of these WYSIATI habits include

- overconfidence (neither the quantity nor the quality of the evidence counts much in subjective confidence; the quality of the story is actually more important than the facts; consequently, critical evidence, doubts and ambiguity are often ignored)

- framing (different ways of presenting the same information often evoke different emotions, eg the survival rate versus the mortality rate from surgery: the survival rate puts a positive spin on the outcome, while the mortality rate puts a negative spin on it)

- base-rate neglect (stereotyping, such as making personality generalisations, eg all doctors have the same personality, etc)

A cognitive bias test, ie which line is shorter?

[image: two lines, A and B]

Most people will choose "A", but both lines are the same length!

There is a neurological basis for this illusion, ie we like to make quick decisions.
