Scenario Planning (Cont.): Control Of Our Fate

Five Main Reasons We Fail To Anticipate Events

According to Nassim Taleb (2007), there are 5 main reasons (errors of confirmation, narrative fallacy, human nature, distortion of solid or silent evidence and tunnel vision) we fail to see unexpected events:

i) errors of confirmation or problems of inductive knowledge, ie learning backward - we focus on experience and preselected observations, and then generalize from them to the unseen; we tend to look at what confirms that knowledge. This is knowledge gained from observation. We hope we can know the future with some certainty, given our understanding of the past. Yet what we learn from the past can turn out to be irrelevant or false for the future, so we need to be wary of our habits and conventional wisdom. Our tendency to generalize can also lead to dangerous stereotyping and discrimination.

"...making a naive observation of the past as something definitive or representative of the future is the one and only cause of our inability to understand..." the future

Nassim Taleb, 2007

"...we are all motivated to maintain a sense of psychological safety by nurturing a positive self-image, by looking at the world as a knowledgeable and predictable place, and by avoiding risk. This can lead to an overestimation of the self and a habit of attending only to information that bolsters our existing beliefs..."

Boris Groysberg et al, 2010

Remember the statement by Captain Smith of the Titanic about his exemplary safety record before that fateful voyage!

This is linked with domain specificity and naive empiricism:

- domain specificity - we react to a piece of information based on the framework that surrounds it and its social-emotional context, rather than on its logical merit

- naive empiricism ‐ we have a natural tendency to look for instances to confirm our perceptions, ie

"...take the past instances that corroborate your theories and treat them as evidence......it is misleading to build a general rule from observed facts......sometimes a lot of data can be meaningless; and at other times one single piece of information can be very meaningful......once your mind is inhabited with a certain view of the world, you will tend to only consider instances proving you to be right. Paradoxically, the more information you have, the more justified you will feel in your views..."

Nassim Taleb, 2007

This asymmetry of knowledge is important as it provides insight into the unpredictability of the world, ie all pieces of information are not of equal importance.

ii) narrative fallacy - we believe that stories will display distinct patterns; we fool ourselves with stories and anecdotes. It is

"...our predilection for complex stories over raw truths. It severely distorts our mental representation of the world; it is particularly acute when it comes to a rare event......addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship, upon them ... They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding..."

Nassim Taleb, 2007

Furthermore,

"...the more random information is, the greater the dimensionality, and thus the more difficult to summarize. The more you summarize, the more order you put in, the less randomness. Hence the same conditions that make us simplify push us to think that the world is less random than it actually is..." and the less able you are to predict the future and handle unexpected events

Nassim Taleb, 2007

Remember: facts do not change but people's perceptions and/or interpretation of the facts do, ie perception distortion. Furthermore, we are better at explaining than understanding.

iii) human nature (we are programmed to handle the expected rather than the unexpected; how our emotions get in the way of inference)

We have a tendency to reduce information into categories and store it in our brains, rather than looking outside our information set, judgments and explanations. This is partly explained biologically, as it is expensive (energy-wise) to put information into our brain, costly to store it and costly to manipulate and retrieve it. Furthermore, parts of the brain are important in distinguishing instantaneous, emotional reactions (limbic) from thinking responses (cortical).

Our working memory has limited holding capacity, eg we have difficulty remembering telephone numbers that exceed seven digits. Thus compression and patternising of information, ie dimension reduction, is vital to the performance of conscious work. We selectively remember facts about the past that suit our point of view and conveniently forget other facts that challenge our views.

Both causality and narrativity are symptoms of this dimension reduction.

- causality suggests a chronological dimension that leads to the perception of the flow of time in a single direction. Our emotional makeup is designed for linear causality, with relationships between variables that are clear, crisp and constant; yet the world is not like this - it is non-linear and asymmetrical in its relationships and consequences. Furthermore, the appearance of busyness reinforces the perception of causality, ie the link between results and one's role in them.

- narrativity allows us to see past events in a more predictable, more expected and less random way.

Thus memory is dynamic and not fixed, static or constant. This allows for perception and retrospective distortions, ie

"...memory is more of a self-serving dynamic revision machine: you remember the last time you remembered the event and, without realizing it, change the story at every subsequent remembrance. So we pull memories along causative lines, revising them involuntarily and unconsciously......a memory corresponds to the strengthening of connections from an increase of brain activity in a given sector of the brain ‐ the more active, the stronger the memory......because your memory is limited and filtered, you will be inclined to remember those data that subsequently match the facts..."

Nassim Taleb, 2007

Our happiness depends more on the number of instances of positive feelings than on their intensity. This is called the positive effect. For example, in business the accounting period is too short to reveal whether performance is good or otherwise; yet management is judged on short-term indicators.

"...But we do not live in an environment where results are delivered in a steady manner..."

Nassim Taleb, 2007

Furthermore,

"...humans will believe anything you say provided you do not exhibit the smallest shadow of diffidence......they can detect the smallest crack in your confidence..."

Nassim Taleb, 2007

iv) distortion of solid or silent evidence (we see what we want to see, ie these mis-perceptions become our reality, eg how we are selective in our understanding of history)

"...silent evidence is what events use to conceal their own randomness..."

Nassim Taleb, 2007

A subset of distortions is bias, ie

"...the difference between what you see and what is there..."

Nassim Taleb, 2007

We like to categorize things but don't consider the fuzziness of the boundaries between categories, ie

"...categorizing always produces reductions in true complexity......any reduction of the world around us can have explosive consequences since it rules out some sources of uncertainty......underestimate the impact of the highly improbable..."

Nassim Taleb, 2007

Fuzziness is the very essence of uncertainty.

Thus we assume that the world we live in is more understandable, more explicable, less irregular and more predictable than it actually is. In fact,

"...we are just a great machine for looking backward......humans are great at self delusion..."

Nassim Taleb, 2007

Yet history runs forwards, not backwards!

Silent evidence can cause a distortion at both ends of the spectrum, ie an overestimation or an underestimation.

Ludic fallacy refers to the fact that the elements of uncertainty we face in real life have little connection to the ones we encounter in the classroom. The computable risks calculated in the classroom are largely absent from real life.

v) tunnel vision (we focus on a narrow range of well-defined sources of uncertainty; the difference between what people actually know and how much they think they know; the neglect of outside/external sources of uncertainty)

"...we are too narrow-minded a species to consider the possibility of events straying from our mental projections......on matters internal to the project to take into account external uncertainties, the unknown unknown..."

Nassim Taleb, 2007

Many important breakthroughs, such as the computer, the Internet and the laser, were unplanned, unpredicted and initially not appreciated. Despite our improved ability to use predictive models, our success rate in forecasting the future is not very good.

Furthermore,

"...we are demonstrably arrogant about what we think we know ... we have a built-in tendency to think that we know a little more than we actually do..." This can get us into serious trouble.

Nassim Taleb, 2007

We suffer from epistemic arrogance, ie as our knowledge grows, our confidence increases significantly. This can result in over-confidence, which breeds confusion, ignorance and conceit. In fact

"...epistemic arrogance bears double effects: we overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states, ie by reducing the space of the unknown......the longer the odds, the larger the epistemic arrogance..."

Nassim Taleb, 2007

Furthermore, we tend to underestimate the impact of unexpected events; this tendency is more pronounced the further we are from the event.

There is little difference between guessing and predicting, ie

- guessing (what I don't know, but what somebody else may know)

- predicting (what has not taken place yet)

Giving people more information does not necessarily improve decision-making. People will select information that confirms their point of view (confirmation bias) and will suffer from belief perseverance (the tendency not to change opinions we already hold). Many experts are narrowly focussed people who suffer from a combination of confirmation bias and belief perseverance. Furthermore,

"...The problem with experts is that they do not know what they do not know..."

Nassim Taleb, 2007

In fact, many experts are worse predictors than amateurs!

"...experts were lopsided: on the occasions when they were right, they attributed it to their depth of understanding and expertise; when wrong, it was either the situation that was to blame, since it was unusual, or, worse, they did not recognize that they were wrong and spun stories around it. They found it difficult to accept that their grasp is a little short......humans are the victims of an asymmetry in the perception of random events. We attribute our successes to our skills, and our failures to external events outside our control, namely randomness..."

Nassim Taleb, 2007

Furthermore,

"...statistically sophisticated or complex methods do not necessarily provide more accurate forecasts than simpler ones......the problem is that we focus on the rare occasions when these methods work and almost never on their far more numerous failures..."

Nassim Taleb, 2007

Linked with tunnel vision or "tunnelling" is anchoring, ie

"...you lower your anxiety about uncertainty while producing a number, then you anchor onto it......use reference points in our heads.....start building beliefs around them because less mental effort is needed to compare an idea to a reference point than to evaluate it......we cannot work without a point of reference..."

Nassim Taleb, 2007

In summary, when looking at history we suffer from randomness (incomplete information) or "triplet of opacity", ie

"...a. an illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than he realizes;

b. the retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror (history seems clearer and more organized in history books than in empirical reality); and

c. the over valuation of factual information and the handicap of authoritative and learned people, particularly when they create categories..."

Nassim Taleb, 2007

Many organisations need to review their mindsets and practices to help them survive in uncertain times. With an increasingly unpredictable, complicated and volatile environment, many accepted practices and core businesses (products and/or services) need reviewing. Beware of making decisions which are based on old assumptions, eg high barriers to entry, high transaction costs, few capable competitors, growing and increasingly affluent markets, restricted information flows.

Generally human beings have a tendency to embrace information that reinforces their pre-existing views, while challenging or rejecting information that questions these views.

Many established management tools, such as net present value, are built on a foundation that assumes certainty, ie forecasting likely cash flows and discounting them. In a volatile business environment, this thinking is not advisable.
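The point can be shown with a short sketch; the cash flows and discount rates below are hypothetical, chosen only to illustrate how sensitive an NPV figure is to assumptions that are presented as certain:

```python
def npv(rate, cash_flows):
    """Net present value: discount each year's forecast cash flow back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical project: five forecast annual cash flows of 100
flows = [100] * 5

base = npv(0.08, flows)      # valued at an assumed 8% discount rate
stressed = npv(0.12, flows)  # the same "certain" forecast at 12%

print(round(base, 2))      # 399.27
print(round(stressed, 2))  # 360.48
```

A four-point change in one assumed parameter moves the valuation by roughly ten percent, before any error in the forecast cash flows themselves.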

Our traditional response to uncertainty and the resultant chaos is to introduce more rules and regulations! This does not work.

One way to handle uncertainty and unexpected events is to have a wide spread, or diversification, of your exposure to risk, ie a small percentage in risky and speculative ventures and the balance in less risky and more conservative activities.
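That allocation idea can be sketched numerically; the 90/10 split and the dollar figure are illustrative assumptions only, not a recommendation:

```python
def worst_case_loss(total, risky_fraction):
    """With most capital in conservative assets, the worst case is losing only the risky slice."""
    return total * risky_fraction

portfolio = 100_000
risky_fraction = 0.10  # illustrative: 10% speculative, 90% conservative

# Even if every speculative position goes to zero, the damage is capped
print(worst_case_loss(portfolio, risky_fraction))  # 10000.0
```

The small speculative slice keeps the exposure to large positive surprises, while the downside from unexpected events is bounded in advance.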

Furthermore, you need to develop ways to work around the inherent unpredictability and even exploit it, ie handle the unknown unknowns. Some recommendations include

- make a distinction between positive and negative contingencies - negative ones can hit hard and hurt severely, eg a big budget movie that is a box office failure. Positive ones can involve losing small to gain big, eg a new, cheap book that has the potential to be a bestseller. You need to know where your ignorance lies and have a precise understanding of the structure of uncertainty.

- don't look for the precise - remember that chance favours the prepared, so invest in preparedness, not prediction. Making predictions tends to narrow our focus and makes us more vulnerable to the events we do not predict.

- be very opportunistic - strenuously chase opportunities and maximize exposure to them. This stresses the importance of networking.

- avoid people who make predictions and be wary of planners - remember that planners, especially governments and their public servants, are not good at making accurate predictions.

NB

"...I will never get to know the unknown since it is unknown. However, I can always guess how it might affect me, and I should base my decisions around that......the probability of a very rare event is not computable; the effect of an event on us is considerably easier to ascertain......we can have a clear idea of the consequences of an event, even if we do not know how likely it is to occur......this idea that in order to make a decision you need to focus on the consequences (what you can know) rather than the probability (which you cannot know) is an essential idea of uncertainty..."

Nassim Taleb, 2007

People like Warren Buffett (Barrie Dunstan, 2009) have observed that the lesson learned from experience is that we learn nothing from experience! By the time the lessons are needed, a new generation has either forgotten them or never been taught them. This leads into the phenomenon of "creeping determinism", ie

"...the sense that grows on us, in retrospect, that what has happened was actually inevitable - and the chief effect of creeping determinism...... is that it turns unexpected events into expected events..."

Malcolm Gladwell, 2009

The GFC (global financial crisis) has discredited most mathematical models that endeavour to forecast the future, especially those that used past activity to predict the future and did not incorporate the psychological elements of human behaviour in decision-making. This is linked with the rational-irrational dichotomy and the optimistic-pessimistic distinction. We tend to swing from irrational pessimism, ie doom and gloom, to irrational optimism, ie exuberance that encourages uncontrollable speculation and risk taking. Furthermore, some of the assumptions are not valid, eg most macroeconomic frameworks have treated institutions, like Fannie Mae and Freddie Mac, as neutral. Based on what happened in the GFC, these institutional frameworks are far from neutral in their impact. Fannie Mae and Freddie Mac handle around 50 percent of all mortgages in the United States, and they got into financial strife during the GFC.

Some more thoughts on the deficiencies of conventional, traditional financial modelling, ie

i) markets are not efficient despite the use of frameworks built around the efficient market hypothesis (EMH), such as the capital asset pricing model, the Black-Scholes option pricing model, modern risk management techniques, mark-to-market accounting, market cap indexing and the concept of shareholder value. Even the US Federal Reserve Bank fell under the spell of "markets know best".

ii) evaluating relative performance, via the use of, for example, alpha and beta (active return and market return), results in managers tending to over-diversify for fear of underperforming against the benchmark. The aim should be to maximize total after-tax returns, ie to maximize rather than to benchmark. The only way to produce superior performance is to do something different.

iii) this time it is different - remember that no one has the ability to predict the future with accuracy. Furthermore, behavioural biases - such as believing we can influence the outcome of uncontrollable events, interpreting information in ways that support self-interest, and focusing on the short term - provide an illusion of control.

iv) valuation matters - indicators such as the price/earnings ratio are useful, eg buy stock when the ratio is low, and sell when it is high. Need to keep emotions and sentiment away from decision-making, as sentiment swings from greed to fear.
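As a sketch of such a mechanical rule (the thresholds below are hypothetical illustrations, not investment advice):

```python
def pe_ratio(price, earnings_per_share):
    """Price/earnings ratio: what the market pays per dollar of earnings."""
    return price / earnings_per_share

def signal(pe, buy_below=12.0, sell_above=25.0):
    """A fixed rule keeps greed and fear out of the decision."""
    if pe < buy_below:
        return "buy"
    if pe > sell_above:
        return "sell"
    return "hold"

print(signal(pe_ratio(50, 5)))    # P/E 10 -> "buy"
print(signal(pe_ratio(300, 10)))  # P/E 30 -> "sell"
```

The point of committing to thresholds in advance is that the decision no longer depends on the prevailing mood of the market.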

v) adopt a disciplinary approach - be patient and wait for the best time to buy rather than chasing every swing

vi) be careful of leverage - it can turn a good investment bad.
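The arithmetic behind this warning can be sketched as follows; the leverage ratio and borrowing cost are hypothetical:

```python
def levered_return(asset_return, leverage, borrow_cost=0.05):
    """Return on equity when (leverage - 1) of each dollar of assets is borrowed."""
    return leverage * asset_return - (leverage - 1) * borrow_cost

# A sound 8% investment, geared 3x, earns a healthy 14% on equity...
print(round(levered_return(0.08, 3), 2))   # 0.14
# ...but a modest 10% fall in the asset becomes a 40% loss of equity
print(round(levered_return(-0.10, 3), 2))  # -0.4
```

Leverage magnifies losses and gains symmetrically, while the interest on the borrowed money is owed in either case.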

vii) complex mathematical models can hide the real risks - this results in obsession with needless complexity, ie

"...mathematics is......considered as producing precise and dependable results: but in the stockmarket the more elaborate and abstruse the mathematics, the more uncertain and speculative are the conclusions..."

Benjamin Graham as quoted by Barrie Dunstan, 2010

viii) macro picture matters - need to understand macro and micro approaches

ix) cheap insurance - this is useful in a portfolio as it protects us from the known unknowns, such as inflation, bad monetary policy, poor government decisions, etc

x) most models are based on assumptions that are generally taken to be fixed and stable. If these assumptions are instead random and/or changeable, most conventional models are of limited use. For example, consider the notion of comparative advantage, ie that countries should focus on what they do best: if commodity prices fluctuate, this advantage might no longer be advantageous. Furthermore, comparative advantage is a basis for globalisation, ie efficiency, but in reality systemic imperfections can distort this.

Another example is the attitude to debt. A positive attitude towards debt implies confidence in the future and a high degree of reliance on forecast. Yet

"...forecasting is harmful since people (especially governments) borrow in response to a forecast (or use the forecast as a cognitive excuse to borrow)...... borrowing makes you more vulnerable to forecast error..."

Nassim Taleb, 2010

Need to be careful of the concept of socialisation of losses and privatisation of gains. Alternatively, if a bank needs to be bailed out it should be nationalised; otherwise banks should be free, small and risk-bearing.

Furthermore, according to Taleb (2010), we

- need to make sure that any incentive or bonus system includes a disincentive for poor performance. Currently incentive systems are asymmetrical, ie they reward positive performance but carry no disincentive for poor performance.

- the complexity from globalisation and highly networked economic life needs to be countered by simplicity in financial products. Most complex financial products (eg hedging products) are not fully understood, and as a result should be banned.

- governments should not need to restore confidence; the system should be robust enough to handle adverse rumours

- need to be careful of using leverage to handle our debt crisis as it is not a temporary problem; rather it is a structural one

- the market is not the final arbitrator

- need to look at converting debt into equity, marginalizing the economics and business school establishments, banning leverage buyouts, reducing the bonus system, reducing risk-taking amongst bankers, educating people to handle uncertainty and not allowing organisations to become too big to fail

Mother Nature is a complex system that has developed ways to handle the unknowns. It is built on

"...webs of interdependence, non-linearities and a robust ecology (otherwise it would have blown up a long time ago)..."

Nassim Taleb, 2010

- it has developed backups, eg in the human body we have 2 eyes, 2 lungs, 2 kidneys, etc. These backups are insurance, even though there are obvious inefficiencies in cost and energy usage in maintaining these spare parts.

- does not like over-specialization as it limits evolution and weakens the system

- works against largeness. For example, if one removes a large land animal such as an elephant, the whole ecosystem does not collapse. Yet the fear that one large bank failure (Lehman Brothers) could bring down the entire system was obvious in 2008.

- robustness is important, as we are unable to correct mistakes and eliminate randomness from social and economic life. The challenge is to confine them, as Nature does.

 
