ii) Unable To Handle The Unexpected/Uncertain/Highly Improbable/Unforeseen Consequences, ie Chaos Is Part Of Life

Introduction

"...human nature hasn't changed a bit. What has changed is the environment we live in..."

Seth Godin, 2007

Human beings are creatures of habit and prefer to stay in their zones of comfort. As a result, we are not good at handling the unexpected and the uncertainties that lie outside our 'tunnel of possibilities'. Usually the unexpected and unknown events are rare, have an extremely high impact and are low in predictability, although they appear predictable in retrospect. We need to adjust to handle them; usually the problem lies not in the nature of the events but in the way we perceive them.

"...no matter how much research and planning you do, you will never perfectly predict how the market......will unfold..."

Jayne Herdlicks as quoted by Joanna Gray, 2015/2016

Furthermore,

"...you must never forget that every change ushers in unforeseen consequences. This applies as much to welcome changes as unwelcome ones......obviously, you cannot plan for the unexpected. All you can really do is never let your guard down..."

Richard Branson, 2008

Linked with a preference to stay in your zone of comfort is a fixed mindset (Catherine Fox, 2009). A fixed mindset is a simple framework for gaining self-esteem and judging others. It encourages stereotyping by using preliminary information to decide on a fixed view. Fortunately, our brain continues to change throughout our lives. This gives us a chance to update information for better decision-making. We need to encourage a culture where we learn from our mistakes. Furthermore, when we are willing to learn, we are more receptive to feedback or criticism. Thus success depends upon effort, persistence and being prepared to move out of your comfort zone rather than being complacent about innate talent.

In spite of our exponentially increasing knowledge base, the future is becoming less predictable. Associated with this is a lack of understanding of randomness, particularly large deviations, and a preference for anecdotal over empirical evidence. In practice, randomness is fundamentally incomplete information. A truly random system has unpredictable properties while a chaotic system has entirely predictable properties.

There are 2 types of randomness, ie

i) generally if your sample is large enough, no single instance will significantly change the aggregate or the average, ie

"...endure the tyranny of the collective, the routine, the obvious, and the predicted..."

Nassim Taleb, 2007

ii) inequalities, where one single observation can have a disproportionate impact on the average or the total, ie

"...that tyranny of the singular, the accidental, the unseen, and the unpredicted..."

Nassim Taleb, 2007

(NB Strategies to handle randomness, as expressed by volatility, risk, uncertainty, etc, include spreading your exposure to reduce vulnerability to, and/or reliance on, one or a few activities, and encouraging diversity in thinking and experience.) Assign a price to risk: risk is related to uncertainty and/or volatility, and the riskier the activity, the greater the expected rate of return, ie the risk premium. Link risk and return via cost-benefit analysis of risk premiums.
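The difference between the two types of randomness is easy to see in simulation. Below is a minimal Python sketch; the distributions are illustrative choices of ours (Gaussian heights, Pareto wealth), not taken from Taleb:

```python
import random

random.seed(42)

def max_share(sample):
    """Fraction of the total contributed by the single largest observation."""
    return max(sample) / sum(sample)

n = 100_000

# Type (i): thin-tailed randomness (eg human heights) - no single
# observation significantly changes the aggregate or the average.
heights = [random.gauss(170, 10) for _ in range(n)]

# Type (ii): fat-tailed randomness (eg wealth or book sales), modelled
# here with a Pareto distribution - one observation can dominate.
wealth = [random.paretovariate(1.1) for _ in range(n)]

print(f"thin-tailed: largest item is {max_share(heights):.4%} of the total")
print(f"fat-tailed:  largest item is {max_share(wealth):.4%} of the total")
```

Re-run with different seeds: the thin-tailed figure barely moves, while the fat-tailed one swings wildly, which is exactly the 'tyranny of the singular' above.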

Sometimes changes in technology have unexpected impacts. For example, Viagra was initially tested as a hypertension drug. To have an appreciation of the impact of unexpected events, we need to understand the political, sociological, demographic, etc fads and trends.

"...Owing to the growth of scientific knowledge, we overestimate our ability to understand the subtle changes that constitute the world, and what weight needs to be imparted to reach such change..."

Nassim Taleb, 2007

Remember: people do not act in rational ways, so predicting behaviours and reactions is not easy. For example

i) people choosing not to save for old age by spending their superannuation quickly after retirement

ii) exposing themselves to addictive substances, ie getting a short-term "kick" but a long-term problem, eg smoking cigarettes (addictive nicotine plus other health issues like cancer, heart disease, diabetes, etc)

iii) people sunbaking to become tanned in the short term but exposing themselves to risk of skin cancer in the long term

Risk homeostasis describes how, under certain circumstances, changes that appear to make a system or an organisation safer in fact do not. The rationale is that we have a fundamental tendency to compensate for low risk in one area by taking greater risk in another. For example, the introduction of childproof lids on medicine bottles led to a substantial increase in fatal child poisonings, because adults became less careful about keeping medicine bottles out of the reach of children. But it can work in the opposite direction: in the late 1960s, Sweden changed from driving on the left-hand side of the road to driving on the right. This was expected to increase the accident rate; in fact, road fatalities initially dropped by 17 percent because people drove more carefully to compensate for their unfamiliarity with the new traffic patterns.

Unexpected events that have negative impacts happen more quickly than those that have positive impacts. Furthermore, unexpected events provide opportunities that entrepreneurs can exploit.

(As an aside, research has shown that it is the venture capitalists, not the entrepreneurs, who make all the money!)

As there are many subtleties, including chance, there is an asymmetry in using the past to determine the future. Owing to this introspective defect, we incorrectly think of tomorrow as a projection of another yesterday. Furthermore, a small input into a complex system can lead to large non-random consequences, depending upon the initial conditions, etc.

Control of our fate

Despite our belief that we can control our own fate (Cassius's idea), our supply of data/information and the use of computer modelling, etc, we are not very good at making accurate predictions. For example

- meteorologists have a poor reputation for the accuracy of their weather forecasts

- the terrorist attacks (9/11) were not predicted with any degree of accuracy.

- the attack on Pearl Harbor (1941) was not predicted with any degree of accuracy

- the global financial crisis (starting in 2007) was not predicted with any degree of accuracy

- just prior to the presidential election in 2000, computer modelling predicted that Al Gore would win by a landslide; but George W Bush won

- the presidential election (2012) between Barack Obama and Mitt Romney was predicted to be very close, but Obama won very convincingly

- starting in the 1960s, most experts did not predict developments, or misread events, involving

i) Deng Xiaoping (purged in 1966 during the Cultural Revolution and re-instated in 1973 to lead the re-emergence of China)

ii) Ayatollah Khomeini (living in exile in Iraq/France before leading the Islamic revolution in Iran)

iii) Margaret Thatcher (a junior education minister & later the first female PM of UK)

iv) Karol Jozef Wojtyla (Archbishop of Cracow (Poland) during Soviet domination, who later became the first non-Italian Pope (John Paul II) since Adrian VI (1522))

- research by Philip Tetlock found that despite

"...political scientists claiming that a political outcome has absolutely no chance of occurring, it nevertheless happens around 15% of the time..."

Nate Silver, 2012

- attempts to predict earthquakes using highly sophisticated, mathematical and data-driven techniques have proved inadequate, eg the Fukushima nuclear reactor was designed to handle a magnitude 8.6 earthquake based on seismologists' best predictions, ie anything larger than 8.6 was supposedly impossible. In March 2011, Japan was hit by an earthquake of magnitude 9.1. Furthermore, the sea walls built to protect coastal villages from earthquake tsunamis made the situation worse: the water flowed over the top of the walls, and the walls then prevented the water from flowing back to the sea. The walls had been built on historical evidence of past tsunamis, and the one in 2011 was considerably worse than any previously recorded.


(source: Henry Tricks, 2013)

The watch industry is a good example of unpredictability, ie who would have believed that
- men would ditch the pocket watch for the more feminine "wristlet" version
- in the 1970s, the Swiss watch industry was decimated by the quartz timepiece made famous by Japan's Seiko. This less expensive and more accurate watch marked the end of the traditional industry built on mechanical cogs-and-springs movements, ie it is estimated that three-quarters of the Swiss watchmaking workforce and their machinery disappeared. There was no certainty that the mechanical watch would survive. The Swiss watch industry's recovery was based on the watch becoming a desirable luxury item, like the Omega Speedmaster that went to the moon and was still a huge success in 2016. Nicolas Hayek Snr was the man responsible for introducing the face-saving Swatch and consolidating the Swiss industry
- the smart or connected watch, ie in addition to telling the time, being a luxury item, etc, the watch has become a mini computer (Bani McSpedden, 2016)

Another example of poor predicting is Chinese steel production. In 2007, two of the world's largest miners, BHP Billiton and Rio Tinto, with the help of consulting group McKinsey, predicted that China's annual steel production would reach 1 b. tonnes between 2025 and 2030 - a figure 60% higher than China's production in 2010. The prediction was used by private organisations and governments, eg in the Asian Century White Paper (2012) and by the Australian Treasury and Reserve Bank of Australia, in planning the outlook for federal budgets and in thinking on monetary policy, ie it encouraged expenditure, like infrastructure, and tax breaks that were unsustainable. This prediction resulted in unrealistic expectations. Recent events have seen weak steel demand in China, eg production peaked at 823 m. tonnes in 2014 and was expected to fall by 15% in 2015, with the price of iron ore hovering around a decade low of US$53 per tonne. It is now thought that the figure overstated annual demand for 2030 by 350 m. tonnes! This is a classic case of expecting what has happened in the past to be repeated in the future, ie

"...as a country raced to build enough apartments, railways, airports, cars and household appliances for the more than 450 m. people who flocked from villages to the cities, steel production soared. It went from an average annual growth of 7% during the 1980s to 10% during the 1990s and close to 20% in 2000s. At the start of the past decade, China made up 15% of the global steel production, now it accounts for around half..."

Angus Grigg et al, 2015

Chinese growth has stalled, especially given the weak property market; construction accounted for most of China's steel demand.
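The error is essentially naive extrapolation: projecting past growth forward unchanged. A hypothetical Python sketch (the growth rate is an illustrative assumption, and the 2010 figure is derived loosely from the 60% gap cited above; this is not the miners' actual model):

```python
# Illustrative only - not the actual BHP/Rio/McKinsey forecast model.
production_2010 = 625.0   # m. tonnes (roughly 1 b. / 1.6, per the 60% figure above)
past_growth = 0.10        # hypothetical annual growth rate carried forward

# Naive forecast: assume the past repeats
naive_2025 = production_2010 * (1 + past_growth) ** 15

# What happened instead: demand saturated as urbanisation slowed
actual_peak_2014 = 823.0  # m. tonnes, as cited above

print(f"naive extrapolation for 2025: {naive_2025:,.0f} m. tonnes")
print(f"actual peak (2014):           {actual_peak_2014:,.0f} m. tonnes")
```

Any model that carries a past growth rate forward will overshoot badly once the underlying driver (here, urbanisation) saturates.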

Once people are exposed to a prediction, they can change their behaviour in response, and this can affect the accuracy of the prediction, ie it becomes a self-fulfilling prophecy.

Key to good decision-making is prioritising


(NB Do not mistake statistical correlation for causation, eg the positive correlation between electricity poles & heart disease)

Change is like weather & economic forecasting, ie very hard to predict accurately (see "Some Impacts of Chaos & Complexity" below). Furthermore,

"...the amount of knowledge in the world is increasing, the gap between what we know and what we think we know may be widening..."

Nate Silver, 2012

Over a 15-year period (starting in the 1980s), work by Philip Tetlock (as quoted by Nate Silver, 2012) found that expert opinions, regardless of the field of expertise, were not much better than random chance in the accuracy of their predictions.

Our reliance on computer modelling has demonstrated the fragility of the conclusions which stem from initial choices of assumptions, variables, etc. Models are only as good as the assumptions that went into them, ie if the assumptions are wrong, so are the predictions. Situations can change that make assumptions no longer applicable. No model is perfect, ie

i) economists forecasting recessions - in the 1990s, of the 60 recessions around the world, only 2 had been predicted 12 months in advance

ii) if the US Federal Reserve's forecasts had been realised over the last 4 years, then by 2014 the US economy would have been US$1 t. larger

iii) most projects take longer to complete than planned and are over-budget

Furthermore,

"...models that work today can break tomorrow, with no warning and no explanation..."

James Weatherall, 2013

Need to distinguish between risk and uncertainty. Risk is a "known unknown" while uncertainty is an "unknown unknown". This concept refers to

- "known known" refers to when a question has an exact answer

- "known unknown" refers to a question with an imprecise answer

- "unknown unknown" refers to when we don't know what question to ask, ie it is not being considered; it can not be imagined; it is as though it does not exist

As binary categories of either "predictable" or "unpredictable" are very rare, we need to handle uncertainty by using probability, ie based on past experience, there is a certain probability of the event happening again

We need to be careful not to mistake "unfamiliar" for the "improbable" or "unlikely" or "unimaginable", eg

- Pearl Harbor attack by Japan in WW2

- Cuban missile crisis

- 9/11 terrorist attack on World Trade Centre

Some financial "unknown unknowns" examples and their impacts

- Switzerland (January 2015) - Swiss National Bank abandons its 3-year currency cap of 1.20 franc to the euro, sending the currency skyrocketing by more than 30% against the euro in minutes
- Russia (December 2014) - the Russian Central Bank increased interest rates from 10.5 to 17 percent (the largest one-day increase since the 1998 Russian financial crisis). It was designed to shore up the collapsing Russian ruble and stave off skyrocketing inflation, but instead triggered further panic in the Russian markets and the ruble plummeted, hitting a low of 80 against the US dollar
- USA (1994) - the US Federal Reserve increased interest rates by 200 basis points in a series of unexpected decisions over a two-month period. This impacted bond portfolios and hedge funds; banks plunged into the red
- Britain (1992) - when George Soros placed a US$10 b. speculative bet against the UK pound, the Bank of England countered by hiking interest rates by nearly 5 percentage points to 15%; it failed. Sterling was pulled out of the European monetary system and the pound collapsed. Soros's profit was estimated at over £1 b.; the UK Treasury lost £3.4 b.
- Australia (1973) - Australian import tariffs were cut by 25% by the new Labor government led by Gough Whitlam; hundreds of Australian factories shut down, manufacturing output plunged by 10% in a year, and over the next 5 years almost 200,000 manufacturing jobs were lost
- USA (1971) - US President Richard Nixon uncoupled the US dollar from the gold price, ending the Bretton Woods system of fixed exchange rates established at the end of World War II. Other currencies could no longer peg via the gold standard, giving rise to globally floating exchange rates for major currencies.

Another example of unexpected events impacting one business is James Packer's casino business. In January 2015, the unexpected change of president and government in Sri Lanka crushed Packer's proposed hotel and casino in the capital, Colombo, as Packer was seen as too close to the previous regime. Then the unexpected Queensland election result in early 2015, which brought a change of government, delayed the tender for the new casino and entertainment complex in Brisbane. This was followed by delays in building Crown Sydney, a $2 b. hotel and casino planned for the Barangaroo waterfront development area. None of these events was easily foreseeable!

"Mind-blindness" is the belief that we can control our own fate, ie that we can control nature rather than live in harmony with it.

Chaos Theory

It is about the tipping point at the edge. It assumes that the system is

- dynamic (the system's behaviour at one point in time influences its future behaviour, ie everything affects everything else & systems are in perpetual motion)

- nonlinear (follows an exponential rather than additive relationship)

- cumulative (a major event can be the accumulation of many small, inter-connected events, ie a chain reaction)

- sensitive to initial conditions (there is inevitable divergence of all but identical initial states as they evolve over time, and small differences in initial conditions can produce very great differences in the final phenomena, eg Edward Lorenz found that rounding an input to his computer weather model from 6 digits (0.452386) to 3 (0.452) produced greatly different weather results)
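Lorenz's rounding effect is easy to reproduce with any chaotic system. A minimal Python sketch using the logistic map (a standard textbook chaotic recurrence, not Lorenz's actual weather equations), reusing the rounding from the example above:

```python
def logistic(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), which is chaotic at r=4."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# The same starting value, once with 6 digits and once rounded to 3,
# as in the Lorenz anecdote above.
full, rounded = 0.452386, 0.452
print(logistic(full), logistic(rounded))  # the trajectories bear no resemblance
```

After a few dozen iterations the two trajectories are effectively unrelated, even though the inputs differ by less than 0.1%.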

Complexity theory

Complexity theory is a way of making sense of advanced technologies, globalisation, intricate markets, cultural change, etc

Organisations have gone from complicated to complex

There is a duality, ie a tension and oscillation between order & disorder (chaos), that can result in events seeming both very predictable & very unpredictable at the same time

There are many diverse, interdependent parts interacting and they are in constant flux so that the final outcome is unknown, ie very simple things can behave in strange & mysterious ways when they interact with one another

The same starting conditions may yield different results

Seemingly simple actions may produce unexpected and/or unintended consequences, ie long periods of apparent stasis punctuated by sudden changes that are very hard to predict

Rare events are becoming more significant than average ones

This concept applies to a system which has the following characteristics

"... - a lot of interacting elements - the interactions are nonlinear, and minor changes can produce disproportionately major consequences

- the system is dynamic, the whole is greater than the sum of its parts, and solutions cannot be imposed; rather, they arise from the circumstances. This is frequently referred to as emergence

- the system has a history, and the past is integrated with the present; the elements evolve with one another and with the environment; and evolution is irreversible

- a complex system may, in retrospect, appear to be ordered and predictable; hindsight does not lead to foresight because the external conditions and systems constantly change

- unlike in ordered systems (where the system constrains the agents), or chaotic systems (where there are no constraints), in a complex system the agents and the system constrain one another, especially over time. This means that we cannot forecast or predict what will happen......More recently, some thinkers and practitioners have started to argue that human complex systems are very different to those in nature and cannot be modeled in the same way because of human unpredictability and intellect. Consider the following ways in which humans are distinct from other animals:

- they have multiple identities and can fluidly switch between them without conscious thought. (For example, a person can be a respected member of the community as well as a terrorist)

- they can, in certain circumstances, purposefully change the systems in which they operate to equilibrium states (think of a Six Sigma project) in order to create predictable outcomes..."

David J Snowden et al, 2007

Need to be careful of assuming that

- statistical correlation means causation

- misleading noise, ie random patterns can be mistaken for signals, and data & modelling can hide the true signal/phenomenon/trend, etc. For example, the complexity (including variability, risk, uncertainty, etc) of economic data creates much noise through daily, cyclic and seasonal fluctuations that can hide trends &/or generate conflicting meanings

- usually all the signals are present but we cannot read them correctly. The problem is not a lack of information but a failure of accurate prediction

- underestimating the importance of luck
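The first two cautions are easy to demonstrate: pure noise routinely produces patterns that look like signals. A minimal Python sketch, with everything computed from first principles for illustration:

```python
import random

random.seed(0)

def corr(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Compare many pairs of completely unrelated random series; with short
# series, some pairs will look "strongly correlated" purely by chance.
best = max(abs(corr([random.random() for _ in range(12)],
                    [random.random() for _ in range(12)]))
           for _ in range(1000))
print(f"strongest correlation between pure-noise series: {best:.2f}")
```

A correlation of 0.8+ between two meaningless series is typical here, which is why correlation alone (electricity poles & heart disease) proves nothing about causation.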

Some Impacts of Chaos & Complexity

Use simplification, generalisation &/or approximations to understand complex events, eg making assumptions about key factors, rounding off figures, etc. This can be very misleading, and it makes accurate predictions about weather, economic/business cycles, recessions, political outcomes, etc very difficult to achieve, eg

- weather forecasting

i) It is a dynamic system ie everything affects everything else & systems are in perpetual motion

ii) Uncertain initial conditions

iii) Poor data

- economic forecasting

i) Hard to determine cause & effect from economic statistics alone

ii) Economy is not static, so some past explanations may not hold for future situations

iii) Impact of political decisions

iv) Lack of accurate data despite the huge amounts produced, eg the US Govt produces around 45,000 economic indicators annually, while private providers supply 4+ million more

On the other hand, simplification, etc is powerful if it gives a better initial understanding of the situation. Then we need to explore the impact of changing the assumptions (especially if they fail) used in the simplification.

Change is like weather & economic forecasting, ie very hard to make accurate predictions, ie

"...there are too many factors to lay down fixed rules..."

David Hains as quoted by Andrew Cornell, 2009a

Cynefin framework (see earlier section "peak-performance etc organisation" for more details)

Cynefin framework is based on complexity theory and aims to help understand the more unpredictable and complex world. It revolves around 5 contexts:

i) simple context (known knowns) is characterised by obvious, stable cause-and-effect relationships; the right answer is self-evident. This context requires assessing the facts of the situation, then categorising and responding

ii) complicated context (known unknowns) contains many right answers. Even though there is a clear relationship between cause and effect, it is not obvious. This situation requires sensing (the facts), analysing and responding

iii) complex context (unknown unknowns) involves right answers that are not obvious; distinctive patterns emerge that require experimentation; most businesses operating in this context need to probe first, then sense, and then respond

iv) chaotic context, as its name implies, means that searching for the right answer is pointless; relationships between cause and effect are impossible to determine as they shift constantly and no manageable pattern exists, as the 9/11 events illustrate. This situation requires initial action to restore order, sensing where stability is present, and then considering how to transform the situation from chaos to complexity

v) disorder context applies when it is unclear which of the other 4 contexts is predominant. There are multiple perspectives jostling for dominance; factional leaders are in dispute; discord reigns. This requires breaking the situation into its constituent parts and assigning each to one of the other 4 contexts; then decisions can allow an intervention in contextually appropriate ways
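The five contexts and their response sequences can be condensed into a small lookup table. A minimal Python encoding of the descriptions above (our summary of the framework, not Snowden's own material):

```python
# Each Cynefin context mapped to (type of unknowns, recommended response order).
CYNEFIN = {
    "simple":      ("known knowns",     ["sense", "categorise", "respond"]),
    "complicated": ("known unknowns",   ["sense", "analyse", "respond"]),
    "complex":     ("unknown unknowns", ["probe", "sense", "respond"]),
    "chaotic":     ("unknowable",       ["act", "sense", "respond"]),
    "disorder":    ("context unclear",  ["decompose into the other four contexts"]),
}

for context, (unknowns, sequence) in CYNEFIN.items():
    print(f"{context:11} ({unknowns}): {' -> '.join(sequence)}")
```

The practical point is that the response order differs by context; applying "sense-analyse-respond" in a chaotic context, for example, delays the immediate action the situation demands.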

Five main reasons we fail to anticipate events

According to Nassim Taleb (2007), there are 5 main reasons (errors of confirmation, narrative fallacy, human nature, distortion of solid or silent evidence and tunnel vision) we fail to see unexpected events:

i) errors of confirmation, or problems of inductive knowledge or learning backward - we focus on experience and preselected observations, and then generalise from them to the unseen; we tend to look at what confirms that knowledge. This is knowledge gained from observation. We hope we can know the future with some certainty, given our understanding of the past. Yet what we learn from the past can turn out to be irrelevant or false for the future, so we need to be careful of our habits and conventional wisdom. Our tendency to generalise can lead to dangerous stereotyping and discrimination.

"...making a naive observation of the past as something definitive or representative of the future is the one and only cause of our inability to understand..." the future

Nassim Taleb, 2007

"...we are all motivated to maintain a sense of psychological safety by nurturing a positive self-image, by looking at the world as a knowledgeable and predictable place, and by avoiding risk. This can lead to an overestimation of the self and a habit of attending only to information that bolsters our existing beliefs..."

Boris Groysberg et al, 2010

Remember the statement of Captain Smith of the Titanic about his exemplary safety record before the fateful journey!

This is linked with domain specificity and naive empiricism

- domain specificity means that we react to a piece of information based on the framework that surrounds it and its social-emotional situation, rather than on its logical merit

- naive empiricism - we have a natural tendency to look for instances to confirm our perceptions, ie

"...take the past instances that corroborate your theories and treat them as evidence......it is misleading to build a general rule from observed facts......sometimes a lot of data can be meaningless; and at other times one single piece of information can be very meaningful......once your mind is inhabited with a certain view of the world, you will tend to only consider instances proving you to be right. Paradoxically, the more information you have, the more justified you will feel in your views..."

Nassim Taleb, 2007

This asymmetry of knowledge is important as it provides insight into the unpredictability of the world, ie all pieces of information are not of equal importance.

ii) narrative fallacy - we believe that stories will display distinct patterns; we fool ourselves with stories and anecdotes. It is

"...our predilection for complex stories over raw truths. It severely distorts our mental representation on the world; it is particularly acute when it comes to a rare event......addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship, upon them"They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding..."

Nassim Taleb, 2007

Furthermore,

"...the more random information is, the greater the dimensionality, and thus the more difficult to summarize. The more you summarize, the more order you put in, the less randomness. Hence the same conditions that make us simplify pushes us to think that the world is less random than it actually is..." and the less you are able topredict the future and handle unexpected events

Nassim Taleb, 2007

Remember: facts do not change but people's perceptions and/or interpretation of the facts do, ie perception distortion. Furthermore, we are better at explaining than understanding.

iii) human nature (we are programmed to handle the expected rather than the unexpected; our emotions get in the way of inference)

We have a tendency to reduce information into categories and store it in our brains, rather than looking outside our information set, judgments and explanations. This is partly explained biologically, as it is expensive (energy wise) to put information into our brain, costly to store it and costly to manipulate and retrieve it. Furthermore, parts of the brain are important in distinguishing instantaneous, emotional reactions (limbic) from thinking responses (cortical).

Also, fragrances and scents go straight to the limbic system and have an immediate effect on mood. Like music & taste, they remind us of something, somewhere, etc and make us feel better or worse

Our working memory has limited holding capacity, eg we have difficulty remembering telephone numbers that exceed seven digits. Thus compression and patternising of information, ie dimension reduction, is vital to the performance of conscious work. We selectively remember facts about the past that suit our point of view and conveniently forget other facts that challenge our views.

Both causality and narrativity are symptoms of this dimension reduction.

- causality suggests a chronological dimension that leads to the perception of the flow of time in a single direction. Our emotional makeup is designed for linear causality, where relationships between variables are clear, crisp and constant; yet the world is not - it is more non-linear and asymmetrical in its relationships and consequences. Furthermore, the appearance of busyness reinforces the perception of causality - the link between results and one's role in them.

- narrativity allows us to see past events in a more predictable, more expected and less random way.

Thus memory is dynamic and not fixed, static or constant. This allows for perception and retrospective distortions, ie

"...memory is more of a self-serving dynamic revision machine: you remember the last time you remembered the event and, without realizing it, change the story at every subsequent remembrance. So we pull memories along causative lines, revising them involuntarily and unconsciously......a memory corresponds to the strengthening of connections from an increase of brain activity in a given sector of the brain - the more active, the stronger than memory......because your memory is limited and filtered, you will be inclined to remember those data that subsequently match the facts..."

Nassim Taleb, 2007

Our happiness depends more on the number of instances of positive feelings than on their intensity. This is called positive affect. For example, in business the accounting period is too short to reveal whether or not performance is good; yet management is judged on these short-term indicators.

"...But we do not live in an environment where results are delivered in a steady manner..."

Nassim Taleb, 2007

Furthermore,

"...humans will believe anything you say provided you do not exhibit the smallest shadow of diffidence......they can detect the smallest crack in your confidence..."

Nassim Taleb, 2007

iv) distortion of solid or silent evidence (we see what we want to see, ie these mis-perceptions become our reality, eg how we are selective in our understanding of history)

"...silent evidence is what events use to conceal their own randomness..."

Nassim Taleb, 2007

A subset of distortions is bias, ie

"...the difference between what you see and what is there..."

Nassim Taleb, 2007

We like to categorize things but don't consider the fuzziness of the boundaries between categories, ie

"...categorizing always produces reductions in true complexity......any reduction of the world around us can have explosive consequences since it rules out some sources of uncertainty......underestimate the impact of the highly improbable..."

Nassim Taleb, 2007

Fuzziness is the very essence of uncertainty.

Thus we assume that the world we live in is more understandable, more explicable, less irregular and more predictable than it actually is. In fact,

"...we are just a great machine for looking backward......humans are great at self delusion..."

Nassim Taleb, 2007

Yet history runs forwards, not backwards!!!

Silent evidence can cause a distortion at both ends of the spectrum, ie an overestimation or an underestimation.

The ludic fallacy refers to the fact that the elements of uncertainty we face in real life have little connection to the ones we encounter in the classroom; the computable risks calculated in the classroom are largely absent from real life.

v) tunnel vision (we focus on a narrow range of well-defined sources of uncertainty; the difference between what people actually know and how much they think they know; the neglect of outside/external sources of uncertainty)

"...we are too narrow minded a species to consider the possibility of events straying from our mental projections"on matters internal to the project to take into account external uncertainties, the unknown unknown..."

Nassim Taleb, 2007

Many important breakthroughs, such as computers, Internet, laser were unplanned, unpredicted and initially not appreciated. Despite our improved ability to use predictive models, our success rate in forecasting the future is not very good.

Furthermore,

"...we are demonstrably arrogant about what we think we know"we have a built-in tendency to think that we know a little more than we actually do" This can get us into serious trouble..."

Nassim Taleb, 2007

We suffer from epistemic arrogance, ie as our knowledge grows, so our confidence increases significantly. This can result in over-confidence which causes an increase in confusion, ignorance and conceit. In fact

"...epistemic arrogance bears double effects: we overestimate what we know, and underestimate uncertainty, by compressing the range of possibile uncertain states, ie by reducing displays of the unknown......longer the odds, the larger the epistemic arrogance..."

Nassim Taleb, 2007

Furthermore, we have a tendency to underestimate the impact of unexpected events, and this is more pronounced the further we are from the event.

There is little difference between guessing and predicting, ie

- guessing (what I don't know, but what somebody else may know)

- predicting (what has not taken place yet)

Giving people more information does not necessarily improve the decision-making. People will select information that confirms their point of view (confirmation bias) and will suffer from belief perseverance (the tendency not to change opinions we already have). Many experts are narrowly-focussed people who suffer from a combination of confirmation bias and belief perseverance. Furthermore,

"...The problem with experts is that they do not know what they do not know..."

Nassim Taleb, 2007

In fact, many experts are worse predictors than amateurs!

"...experts were lopsided: on the occasions when they are right, they attributed it to their depth of understanding and expertise; when wrong, it was either the situation that was to blame, since it was unusual, or, worse, they did not recognize that they were wrong and spun stories around it. They found it difficult to accept that their grasp is a little short......humans are the victims of an asymmetry in the perception of random events. We attribute our successes to our skills, and our failures to external events outside our control, namely their randomness..."

Nassim Taleb, 2007

Furthermore,

"...statistically sophisticated or complex methods do not necessarily provide more accurate forecasts than simpler ones......the problem is that we focus on the rare occasions when these methods work and almost never on their far more numerous failures..."

Nassim Taleb, 2007

Linked with tunnel vision or 'tunnelling' is anchoring, ie

"...you lower your anxiety about uncertainty while producing a number, then you anchor onto it......use reference points in our heads.....start building beliefs around them because less mental effort is needed to compare an idea to a reference point than to evaluate it......we cannot work without a point of reference..."

Nassim Taleb, 2007

In summary, when looking at history we suffer from randomness (incomplete information), or the "triplet of opacity", ie

"...a. an illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than he realizes;

b. the retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror (history seems clearer and more organized in history books than in empirical reality); and

c. the overvaluation of factual information and the handicap of authoritative and learned people, particularly when they create categories..."

Nassim Taleb, 2007

- Many organisations need to review their mindsets and practices to help them survive in uncertain times. With an increasingly unpredictable, complicated and volatile environment, many accepted practices and core businesses (products and/or services) need reviewing. Beware of making decisions based on old assumptions, eg high barriers to entry, high transaction costs, few capable competitors, growing and increasingly affluent markets, restricted information flows.
- Generally human beings have a tendency to embrace information that reinforces their pre-existing views, while challenging or rejecting information that questions those views.

Many established management tools, such as net present value (NPV), are built on a foundation that assumes certainty, ie forecasting likely cash flows and discounting them. In a volatile business environment, this thinking is not advisable.
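To see why the certainty assumption matters, compare a textbook NPV with a Monte Carlo variant in which the "certain" cash flows are allowed to vary. A minimal Python sketch; all figures are hypothetical:

```python
import random

random.seed(1)

def npv(cash_flows, rate):
    """Net present value of year-end cash flows at a fixed discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

forecast = [100, 100, 100, 100, 100]  # the "certain" forecast
print(f"textbook NPV at 8%: {npv(forecast, 0.08):.1f}")

# Monte Carlo: let each year's cash flow vary widely and look at the
# spread of outcomes rather than a single point estimate.
outcomes = sorted(
    npv([cf * random.gauss(1.0, 0.4) for cf in forecast], 0.08)
    for _ in range(10_000)
)
print(f"5th-95th percentile NPV: {outcomes[500]:.1f} to {outcomes[9500]:.1f}")
```

The point estimate hides the spread: a project that looks comfortably positive on the single forecast can carry a substantial probability of a negative outcome, and the Gaussian noise used here still understates fat-tailed surprises.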

Our traditional approach to handling uncertainty and the resultant chaos is to introduce more rules and regulations! This does not work.

One way to handle uncertainty and unexpected events is to have a wide spread, or diversification, of your exposure to risk, ie a small percentage in risky and speculative ventures and the balance in less risky and more conservative activities.
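This split is sometimes called a "barbell" allocation (Taleb's term): keep most of your exposure very safe, with a small fraction in speculative bets whose downside is capped at what you put in. A minimal Python sketch with hypothetical return figures of our choosing:

```python
import random

random.seed(7)

def barbell(safe_fraction, years=20):
    """Grow $1 with safe_fraction in a conservative asset (~3% pa) and the
    rest in a speculative one that usually loses half but occasionally
    pays off 10x. All return figures are hypothetical."""
    wealth = 1.0
    for _ in range(years):
        safe = wealth * safe_fraction * 1.03
        risky = wealth * (1 - safe_fraction)
        risky *= 10.0 if random.random() < 0.05 else 0.5  # rare big win
        wealth = safe + risky
    return wealth

runs = sorted(barbell(0.90) for _ in range(10_000))
print(f"median: {runs[5000]:.2f}  worst 1%: {runs[100]:.2f}  best 1%: {runs[9900]:.2f}")
```

Because the speculative slice is small, the worst case each year is a survivable loss, while the rare payoff preserves exposure to positive unexpected events.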

Furthermore, you need to develop ways to work around the inherent unpredictability and even exploit it, ie handle the unknown unknowns. Some recommendations include

- make a distinction between positive and negative contingencies - negative ones can hit hard and hurt severely, eg a big budget movie that is a box office failure. Positive ones can involve losing small to gain big, eg a new, cheap book that has the potential to be a bestseller. You need to know where your ignorance lies and have a precise understanding of the structure of uncertainty.

- don't look for the precise - remember that chance favours the prepared, and invest in preparedness, not the prediction. Making predictions tends to narrow our focus and makes us more vulnerable to the events that we do not predict.

- be very opportunistic - strenuously chase opportunities and maximize exposure to them. This stresses the importance of networking.

- avoid people who make predictions and be wary of planners - remember that planners, especially governments and their public servants, are not good at making accurate predictions

NB

"...I will never get to know the unknown since it is unknown. However, I can always guess how it might affect me, and I should base my decisions around that......the probability of a very rare event is not computable; the effect of an events on us is considerably easier to ascertain......we can have a clear idea of the consequences of an event, even if we do not know how likely it is to occur......this idea that in order to make a decision you need to focus on the consequences (what you can know) rather than the probability (which you cannot know) is an essential idea of uncertainty..."

Nassim Taleb, 2007
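Restated as a decision rule: since the probability of a rare event is not computable but its consequence often is, filter options by worst-case consequence first. A hypothetical Python sketch (the options and figures are invented for illustration):

```python
# Decide on consequences (knowable) rather than probabilities (not knowable):
# reject any option whose worst case is ruin, however "unlikely" ruin is said to be.
RUIN = -1_000_000  # a loss we could not recover from

options = {
    "sell naked options":    {"typical": +5_000, "worst_case": -2_000_000},
    "buy cheap insurance":   {"typical": -1_000, "worst_case": -10_000},
    "diversified portfolio": {"typical": +3_000, "worst_case": -50_000},
}

survivable = {name: o for name, o in options.items() if o["worst_case"] > RUIN}
print("acceptable options:", sorted(survivable))
```

No probability estimate appears anywhere in the rule; only the size of the consequence matters.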

People like Warren Buffett (Barrie Dunstan, 2009) have observed that the lesson of experience is that we learn nothing from experience! By the time the lessons are needed, a new generation has either forgotten them or never been taught them. This leads into the phenomenon of "creeping determinism", ie

"...the sense that grows on us, in retrospect, that what has happened was actually inevitable - and the chief effect of creeping determinism...... is that it turns unexpected events into expected events..."

Malcolm Gladwell, 2009

The GFC discredited most mathematical models that endeavour to forecast the future, especially those that used the activities of the past to predict the future and did not incorporate the psychological elements of human behaviour in decision-making. This is linked with the rational-irrational dichotomy and the optimistic-pessimistic distinction: we tend to swing from irrational pessimism, ie doom and gloom, to irrational optimism, ie exuberance that encourages uncontrollable speculation and risk-taking. Furthermore, some of the models' assumptions are not valid, eg most macroeconomic frameworks have treated institutions, like Fannie Mae and Freddie Mac, as neutral. Based on what happened in the GFC, these institutional frameworks are far from neutral in their impact: Fannie Mae and Freddie Mac handle around 50 percent of all mortgages in the United States, and they got into financial strife during the GFC.

Some more thoughts on the deficiencies of conventional, traditional financial modeling, ie

i) markets are not efficient, despite the use of frameworks built around the efficient market hypothesis (EMH), such as the capital asset pricing model, the Black-Scholes option pricing model, modern risk management techniques, mark-to-market accounting, market-cap indexing and the concept of shareholder value. Even the US Federal Reserve fell under the spell of "markets know best".

ii) evaluating relative performance via, for example, alpha and beta (active return and market return) results in managers tending to over-diversify because of their fear of underperforming against the benchmark. The aim should be to maximise total after-tax returns rather than to track a benchmark; the only way to produce a superior performance is to do something different

iii) this time it is different - remember that no one can predict the future with accuracy. Furthermore, the behavioural bias that we can influence the outcome of uncontrollable events, combined with interpreting information in ways that support self-interest and a common focus on the short term, provides an illusion of control

iv) valuation matters - indicators such as the price/earnings ratio are useful, eg buy stock when the ratio is low and sell when it is high. Keep emotions and sentiment away from decision-making, as sentiment swings from greed to fear

v) adopt a disciplined approach - be patient and wait for the best time to buy rather than chasing every swing

vi) be careful of leverage - it can turn a good investment bad.

vii) complex mathematical models can hide the real risks - this results in an obsession with needless complexity, ie

"... mathematics is......considered as producing precise and dependable results: but in the stockmarket the more elaborate and abstruse the mathematics, the more uncertain and speculation are the conclusions..."

Benjamin Graham as quoted by Barrie Dunstan, 2010

viii) macro picture matters - need to understand macro and micro approaches

ix) cheap insurance - this is useful in a portfolio as it protects us from the known unknowns, such as inflation, bad monetary policy, poor government decisions, etc

x) most models are based on assumptions that are generally taken to be fixed and stable. If these assumptions are treated as random and/or changeable, most conventional models are of limited use. For example, take the notion of comparative advantage, ie that countries should focus on what they do best. If commodity prices fluctuate, this comparative advantage may no longer be advantageous. Furthermore, the notion of comparative advantage is a basis for globalisation, ie efficiency, but in reality systemic imperfections can distort this.

Another example is the attitude to debt. A positive attitude towards debt implies confidence in the future and a high degree of reliance on forecasts. Yet

"...forecasting is harmful since people (especially governments) borrow in response to a forecast (or use the forecast as a cognitive excuse to borrow)...... borrowing makes you more vulnerable to forecast error..."

Nassim Taleb, 2010

Need to be careful of the concept of socialisation of losses and privatisation of gains. Alternatively, if a bank needs to be bailed out it should be nationalised; otherwise banks should be free, small and risk-bearing.

Furthermore, according to Taleb (2010), we

- need to make sure that any incentive or bonus system includes a disincentive for poor performance. Currently incentive systems are asymmetrical, ie they reward positive performance but carry no penalty for poor performance.

- the complexity from globalisation and highly networked economic life needs to be countered by simplicity in financial products. Most complex financial products (eg hedging products) are not fully understood, and as a result should be banned

- governments should not need to restore confidence; the system should be robust enough to handle adverse rumours

- need to be careful of using leverage to handle our debt crisis as it is not a temporary problem; rather it is a structural one

- the market is not the final arbitrator

- need to look at converting debt into equity, marginalising the economics and business school establishments, banning leveraged buyouts, reducing the bonus system, reducing risk-taking amongst bankers, educating people to handle uncertainty, and not allowing organisations to become too big to fail

The GFC highlighted the concept of some financial institutions being 'too big to fail' (The Economist, 2013a), ie

- the GFC started in the USA (2008) with Lehman Bros going bankrupt & Barclays buying its US operations; Merrill Lynch was absorbed by Bank of America; AIG & Citigroup were bailed out, etc

- it spread to European economies, ie Greece, Spain, Ireland, Iceland, Italy, etc

- Citigroup accepted US$143 b loan losses; Deutsche Bank raised US$3.8 b

- bank revenue fell by 1/3 (about $100 b.); staff pay fell; employment plunged; regulation became more complicated, eg limits on bonus payments & requirements to hold more capital

- by 2013, European banks (eg UBS, Credit Suisse) were suffering, with the US's JPMorgan Chase, Goldman Sachs & Citigroup dominating in Europe

Mother Nature is a complex system that has developed ways to handle the unknowns. It has

"...webs of interdependence, non-linearities and a robust ecology (otherwise it would have blown up a long time ago)..."

Nassim Taleb, 2010

- it has developed backups, eg in the human body we have 2 eyes, 2 lungs, 2 kidneys, etc. These backups are insurance, even though there are obvious inefficiencies in the costs and energy usage of maintaining these spare parts

- it does not like over-specialisation, as this limits evolution and weakens the system

- it works against largeness. For example, if one removes a large land animal like an elephant, the whole eco-system does not collapse; yet the fear that one large bank failure (Lehman Brothers) could bring down the entire system was obvious in 2008

- robustness is important, as we are unable to correct mistakes and eliminate randomness from social and economic life; the challenge is to confine the damage, as Nature does.

 
