RISK AND MANAGEMENT
Shocks and Paradigm Busters (Why Do We Get Surprised?)

Gill Ringland, Martin Edwards, Les Hammond, Barbara Heinzen, Anthony Rendell, Oliver Sparrow and Elizabeth White
The authors use a range of sources of data to review anticipations of the future made at various times in the past century. They conclude that projections of the future suffer from four basic assumptions that are not generally valid. These relate to the roles of government, the individual and technology, and the concept of progress. They conclude that forecasts today will probably suffer at least to some extent from these assumptions, and ask whether there are new and different paradigm shifts that we are failing to anticipate. The article concludes by suggesting some ways in which organizations can improve their ability to anticipate the future.

Introduction

Change, like death and taxes, is always with us. However, in most industries the pace of change is accelerating, to the extent that some observers have begun to question whether it is possible to take a proactive stance in such an environment. In short, they argue that it is only possible to respond to immediate circumstances, and that it is not useful to seek to plan a long-term strategy. They would claim that the previously accepted wisdom that 'strategic planning is essential for success' is no longer valid in this context. Despite this view, many organizations are beginning to recognize that in order to establish competitive advantage it is necessary to have command of the skills and competencies that they foresee will shape the market conditions in which they operate. The time taken to acquire and develop these skills is increasing. This means that it is not only essential to begin developing them at the appropriate time, but also that recognizing which skills to develop is crucial. Superior strategy therefore relies on an organization's ability to develop a clearer view of the future than its competitors. Strategic planning is thus even more important to achieving competitive advantage, through choosing the skills to invest in and the competencies to develop. However, in a world of growing complexity this is no longer an easy task. The range of options and alternative futures facing organizations today is much greater than that facing firms in the period stretching from perhaps the 1950s to the 1970s. These result from technologically-driven changes in industry structures, regulatory influences which
have blurred boundaries between hitherto clearly distinct markets, and the changing nature of work and leisure (to name but three forces of change). In this environment, making correct assumptions about future trends is difficult if not impossible.

One approach that organizations have adopted to assist them in these areas is to work with scenarios. Scenarios are diverging, but plausible, views of the future: to quote Michael Porter, "an internally consistent view of what the future might turn out to be - not a forecast, but one possible future outcome".1 They differ from forecasts in that they explore possible futures rather than predict a single point future. Figure 1 compares forecasts and scenarios, and points out the limitation of single point forecasts in times of uncertainty.
FIGURE 1. Forecasts are points; scenarios explore ranges. (Figure annotations: forecasts are over-precise; scenarios explore the range of trends and uncertainties running forward from today.)
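To make the contrast in Figure 1 concrete, here is a minimal numerical sketch of our own devising (it is not part of the original study, and every number in it is an invented illustration): a single-point forecast extrapolates one expected growth rate, while a scenario-style exploration brackets the spread of outcomes that uncertainty in that rate produces.

    import random

    random.seed(42)

    def simulate_path(years=10, start=100.0):
        """One possible future: the growth rate drifts randomly each year."""
        level = start
        for _ in range(years):
            level *= 1.0 + random.gauss(0.03, 0.04)  # uncertain annual growth
        return level

    # A point forecast simply extrapolates the expected growth rate.
    point_forecast = 100.0 * (1.03 ** 10)

    # A scenario-style exploration looks at the spread of plausible outcomes.
    outcomes = sorted(simulate_path() for _ in range(10_000))
    low = outcomes[int(0.05 * len(outcomes))]
    high = outcomes[int(0.95 * len(outcomes))]

    print(f"Single-point forecast: {point_forecast:.1f}")
    print(f"90% scenario range:    {low:.1f} to {high:.1f}")

Run as written, the point forecast sits near the middle of a range whose two ends differ by roughly 50 per cent, which is exactly the information a single number conceals.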
The use of scenarios is becoming more widespread, with organizations seeking to improve their 'visioning' of the future (for instance, the UK Government through its Foresight initiative2). The Royal Institute of International Affairs has been working since 1995 on developing a picture of how life might be in 2015. This has been done via the Chatham House Forum, a research programme sponsored by a number of private sector and public sector organizations. The organizations provide not only financial sponsorship but also the participation of senior planners or strategists, who take forward appropriate projects together with the Royal Institute staff and their large database of intelligence.

This article is based on one such project undertaken by the authors as part of the Chatham House Forum. We had taken part in the development of the Chatham House Scenarios3 and been conscious that we had certainly brought our own frame of reference to these. So we looked for ways of exposing, systematically, what seems to go unanticipated by many futurists. We did this by looking at various forecasts made in the past and comparing them with the outcome seen today. We looked for systematically under- or over-estimated trends, no matter how wide or narrow the group, searching for cultural factors. This differs from the Japanese work4 which, based on 25 years of their Foresight activities, found that groups of experts in a technical subject made poorer predictions than a wider-based group.

We found a helpful framework for our thinking in the surprising arena of military history. The sharper win/lose environment of warfare allowed Cohen and Gooch5 to propose that organizations may 'lose', literally in the case of war, through failing to learn, to anticipate, or to adapt. Analysis of over 20 scenario projects reinforced this view: we saw 'not looking at the future at all' (failure to learn) and 'looking in the wrong place' (failure to anticipate) as major causes of scenarios which proved inadequate or unhelpful to the commissioning organization.
These failures were those of being wrong in detail or of failing to see major new trends emerging. They were distinct from the very large proportion of scenario exercises where a failure to adapt, that is to act appropriately when a scenario became imminent, was observed in spite of the existence of the scenarios inside the corporation or organization. However, we decided not to study at this stage the crucially important topic of how to make organizations more able to adapt, and instead to focus on the ways in which organizations 'looked in the wrong place'. There is a large body of work emerging on new organizational forms which support adaptation. For instance, Arie De Geus6 discusses the evolution of Shell through several decades and draws conclusions about the 'learning organization' of the future. Similarly, Peter Drucker's classic work7 discusses how managing today needs different methods from those in which most of today's managers were trained.

The question we focused on was: what are the systematics of visions of the future, and how do they connect with actual developments? We looked back at examples of several types of forecasting to see what happened after the forecasts were made. We looked at history, at science fiction, at some major forecasts in the public domain, and at examples relating to the take-up of technology. One clear point to emerge was that some assumptions were common to many forecasts that turned out to be wrong. We believe this list should be useful to planners in general. After all, most organizations run an annual budget cycle, which is necessarily based on assumptions, whether explicit or implicit. Shocks and paradigm changes can invalidate these assumptions overnight and so knock plans sideways in the short or medium term, or, if the shock is sudden enough, make the budget for the current year difficult to achieve, if not meaningless and irrelevant. Examples like the fall of the Berlin Wall or a typhoon hitting a factory may be wholly unpredicted, or expected only actuarially, but in either case the event is likely to affect the current year's budget, as suggested in Figure 2. The purpose of this article is therefore to share what we learnt from this analysis and to highlight how organizations might improve their ability to anticipate the future.
FIGURE 2. Shocks and paradigm busters. (Figure annotations: shocks arrive at varying distances from the organisation, on timescales running from budgets out to scenario planning.)
Lessons from the Military

Even competent organizations fail to forecast correctly. Eliot Cohen and John Gooch, in their book Military Misfortunes,5 analyse a number of occasions on which competent military organizations have failed. They identify, and clearly reject, two syndromes: 'The Man in the Dock', where the person at the top is fired for being at the top at the time of disaster, and 'The Man on the Couch', where the blame is placed on the psychology of the people drawn to and promoted into top positions. Cohen and Gooch suggest instead three kinds of organizational failure:

- failure to learn;
- failure to anticipate; and
- failure to adapt.

They use a number of examples to show that one failure is damaging, two together create a very serious situation, and all three together almost always produce catastrophe. As an example of failure to learn, they cite US anti-submarine warfare in 1942. Even before entering the war, the US military studied the anti-submarine warfare experience of the British. However, their assumption was that the problem was a technological one. Only later did they realize that they needed to learn from the British experience of organization and operations as well.

Israel showed a failure to anticipate in the period before the Yom Kippur War of 1973. Israel relied on the assumption that Egypt would not attempt another war of conquest, and failed to see that President Sadat might attempt a quite different kind of war, one designed to inflict just enough damage to restore Egyptian self-confidence.

In 1915 at Suvla Bay on the Gallipoli Peninsula, the British Army demonstrated a sad failure to adapt. Successful with a surprise landing, the British applied their recent experience of defensive operations and did not move rapidly inland, failing to see and take the opportunity of a great victory.

In 1940, the French Army and Air Force experienced all three failures, leading to catastrophe.
Although the French had good information about the German blitzkrieg in Poland, they did not learn its lessons. They assumed they would have to fight a World War One style war of position, failing to anticipate the new German war of movement. When the crisis broke, they failed to adapt. Although some leaders began to learn and adapt in a hurry, the Army could not change its ideas and methods quickly enough, and ran out of time. Cohen and Gooch's analysis helped us to frame our questions about scenarios and forecasting, and we decided to concentrate on failures to learn and to anticipate.
Failures to Learn and Anticipate

Based on nearly fifteen years' experience with scenario planning, one of the authors (Barbara Heinzen) has developed a list of characteristic reasons why organizations fail to learn about or to anticipate the future. Dr Heinzen has called these failures of 'forecasts', using the word in its general sense of foreseeing the future.
Believing What We Want to Believe, and Not Paying Attention
The historical data on oil drilling in the US (see Figure 3) showed that oil drilling activity had been growing for a number of years and, naturally, the oil drilling business had expanded in that time. When forecasts of the future were made, including 'high', 'medium' and 'low' activity, they simply reflected the belief that growth would continue. The forecasters believed what they wanted to believe. In fact, oil drilling collapsed soon after this forecast, because a change in the US tax laws effectively reduced the financial incentives to drill. Could the forecasters have foreseen this change in the law?
FIGURE 3. Oil drilling activity in the USA: foreseen versus actual activity, 1965-1992 (drawn from memory). (Figure annotations: the forecasts illustrate believing what we want to believe; what really happened was a failure of attention.)
Given the number of lobbyists inhabiting Washington DC, early signs of a change in the law were probably available, but there was a clear failure of attention. The graph in Figure 3 therefore illustrates two failures:

- believing what we want to believe; and
- failing to pay attention.
The Tyranny of the Present
Another reason we have trouble foreseeing what will happen in the future is that our views of the future are always coloured by our most recent experiences. This is well illustrated in the graph shown in Figure 4, but was also apparent during consultancy work carried out in Asia in 1996, when Dr Heinzen repeatedly asked workshop participants: 'What could go wrong in Asia? What will cause growth to slow? Could we have a financial crisis in Asia?' After 20 years of growth and stability, the universal answer in 1996 was 'that is not possible'. This is another example of the tyranny of the present.
Asking the Right Questions
One of the ways to get around the tyranny of the present is to ask a contrary question, something that forces us to think differently about the present. The right questions, though, are often hard to find, since they tend to appear only when we look away from what everyone is saying about a subject and find some empty space we cannot explain. These empty spaces are fruitful sources of the right questions which need to be asked if we want a good view of the future.
Overestimating Our Ability to Control the Future: 'We Can Handle It'
Another reason we get our forecasts wrong is that we assume our organizations are strong enough to cope with change. In two assignments with very different but confident companies, the working groups wrote descriptions of the world their companies would be facing. They were complacent views, worlds in which the companies were bound to succeed and overcome any obstacles. When asked what name they would give to this future, each group responded, "We can handle it." This belief that 'we can handle it' essentially makes forecasts unnecessary, so only a very quick and superficial look at the future is likely to be undertaken, and it is also likely to be wrong.
The Need to Present a Point of View: the Example of HIV/AIDS
When the AIDS epidemic was first discovered, doctors and epidemiologists realized that they were facing an incurable disease that could spread for many years, infecting many people long before illness was visible. They also knew that there were many uncertainties around the spread of the disease that would make it genuinely difficult to know how far the virus would spread through any particular population. Equally, the experts quickly learned that the spread of HIV could be slowed down by using condoms, practising safe sex, sterilizing needles and blood supplies, and so on. However, this meant persuading many people to alter their behaviour. To help convince people to change their behaviour, the uncertainties around projections of the spread of the disease were underplayed and risks were highlighted. Later, when the disease did not spread as anticipated in several cultures (as in the UK, for example), the risks remained, but the overdramatization of the spread of the disease made many people question whether the epidemiologists were right about the need to change their behaviour. In this case, the very real public health need to present a point of view forced the forecasters to develop a highly dramatic case that minimized the equally relevant uncertainties in their forecasts.
FIGURE 4. Estimates of oil demand: foreseen versus actual activity, 1965-1992 (drawn from memory). (Figure annotation: using the present to guide the future.)

The Unreliability of Experts, or the Value of Innocent Eyes
In the mid- to late-1980s Dr Heinzen was working with a company on the future of Japan. At the time, she was struck by the mindless repetition from business people that 'Japan is different', a cultural explanation for the difficulties her Western business clients were discovering. The most frequently cited aspects of difference were 'consensus is important in Japan' and 'the Japanese take the long-term point of view'. As the project proceeded, she began to wonder what the basis of consensus was and why it
was upheld. She also wondered how the Japanese paid for the long-term view, and discovered a number of very interesting rules in the financial system that made it possible to support long-term business strategies. So she organized two meetings of experts. In one meeting she asked people to discuss the question 'When will the consensus in Japan break down?', and in the other meeting she asked 'What will happen as the Japanese financial system opens to the outside world?' She was given two clear answers: 'The consensus won't break down; Japan is different,' and 'There will be a smooth convergence of the Japanese and international financial systems.' Neither conclusion has been supported by events since the late 1980s.

So why were the experts wrong? Dr Heinzen's own view is that experts become captured by their subject and, particularly when they are expert in another culture, can take on the myths and beliefs of that culture, making it more difficult for them to see the weaknesses that are there. As a non-expert, she did not question their conclusions, but it has since turned out that her 'ignorant' questions anticipated the future more closely than the expert views did. While expert views are extremely valuable, this kind of experience has led several of us to reassess 'the value of innocent eyes' in forcing us to question the automatic conclusions experts often offer.
Time to Do a Good Analysis
One of the reasons we rely on experts is that very few organizations take the time or devote the resources needed to collect and understand the relevant facts. Deadlines are tight, staff are already overworked and there is no budget for commissioning outside research. Even when the budget is there, research that helps us to understand the future tends to ask different questions from those asked either by operational people or by academics. Ordinary research skills are therefore often not enough, while the skills involved in researching what will happen in the future are scarce and undervalued. As a result, good analysis is just not done. The strongest example of this came during 1996, when there was a clear need to gain a better understanding of Asian financial systems, but none of Dr Heinzen's clients that year had the budget, time or skills to undertake such work. They simply lacked the time to do a good analysis.
Assumptions and the Illusion of Certainty
Assumptions about the future are intrinsically necessary. We must be able to assume that the ground will be under our feet before we take a single step. And yet our assumptions can cause us trouble, because they lie deeply hidden in our beliefs and behaviours. Where assumptions are smuggled into our views of the future, they can distort the forecasts we make. They can also conceal ignorance. In one training exercise on the future of rural Scotland, the group assumed for two days that people in rural Scotland worked as fishermen, farmers, foresters and in other rural activities. However, when we looked at employment and government expenditure, we discovered that in fact over 50% of the population was directly dependent on government money, whether as unemployment benefit, pensions or salaries. Examples like this show how our assumptions, and the illusion of certainty they carry, can lead our forecasts astray.
A Question of Timing
Another frequent source of error is making an accurate prediction about what will happen in the future, but getting the timing wrong. For many people the Club of Rome's Limits to Growth8 was completely wrong because its predictions were not fulfilled when the authors said they would be. But does that make those predictions wrong? We doubt it. Instead, it is a question of timing.
'It Takes 30 Years to Get a Good Idea Accepted'
Even when we get everything right, the right questions, good analysis and the right timing, our forecasts may still not be good, because they cannot gain acceptance. They simply do not penetrate. One of the interesting things about working in this field for the past 15 years is seeing how long it takes for ideas to take hold among groups. There seems to be an instinctive rejection of a novel view of the future. This came home to Dr Heinzen when Tony Allan was giving a talk about importing wheat into the Middle East, which he described as importing 'virtual water', since the water needed to grow the wheat is embodied in the wheat imports. Thirty years ago, the very idea of importing wheat into the Middle East was rejected because countries believed they needed to be self-sufficient in food production, even though they did not have enough water to meet this objective. Now importing wheat is accepted practice; hence Tony Allan's conclusion that 'it takes 30 years for a good idea to be accepted.'
Obedience Versus Curiosity
Another reason our forecasts go wrong is that we want to write a good diagnostic view of how the world works and where it might be going, but worry that such an analysis will not be accepted by our bosses. We therefore water down and temper our conclusions. Dr Heinzen was particularly aware of this tension while working in Asia, where there is a strong culture of obedience and conformity. In this culture, curiosity can be confused with disloyalty. But this is not a problem limited to Asian cultures; it exists in many Western corporations where people are promoted on their ability to echo, rather than question, the views at the top. Hence, many of our forecasts are wrong because of this tension between obedience and curiosity.
What We Don't Know, We Don't Know
Finally, there is a very good reason why we get our forecasts wrong: our knowledge is very limited. The chart in Figure 5 is from Don Michael, the author of Planning to Learn and Learning to Plan.9 It is a metaphor for understanding the importance of this circle of knowledge and ignorance. The fact is that we are constitutionally unable to know all that it would be useful to know. Even if we include our knowledge of what we don't know, most of what we need to know is outside our apprehension; we don't even know it is there. That means that we are always making forecasts in a state of ignorance and uncertainty. Inevitably, some of our forecasts will be wrong, even if we have done everything else on this list in the right way.

FIGURE 5. What we don't know.9 (The figure shows three nested circles: WWK, what we know; WWKWDK, what we know we don't know; and WWDKWDK, what we don't know we don't know.)
Why Organizations Fail to Anticipate the Future

The Chatham House Forum adopted the following summary as its signposts:

- Not paying attention.
- Losing the important messages in the 'noise', unless there are 'filters' to keep out the unimportant data.
- The tyranny of the present: a mental model so firmly held that the mind cannot believe that anything different is possible.
- Organizations believing what they want to believe: 'the credibility of large investment', where it is easy to believe that something developed as a result of a lot of effort must be a firm foundation for the future.
- The experts' model being impervious to new thoughts, and well able to reject any contrary view.
- Not understanding the organization's own assumptions, which it can easily believe are facts rather than just assumptions.
- Putting the wrong weights on the known issues.
- A culture of obedience and orthodoxy that prevents firms from asking the right questions.
- Overestimating the ability of the organization to deal with whatever events might arise.
- Failures in organizational learning: gaining knowledge per se is not sufficient, since knowledge is not 'learnt' until it has been absorbed by the organization.

Not only large organizations but also departments, teams and individuals are vulnerable to these factors. We tested these common sources of error against some selected 'shocks' such as the Barings collapse, the Asian financial crisis, and the Maxwell fraud. This showed that in all cases the signs were visible, and even widely commented on in the media, for a significant period before the 'unexpected' or even 'unthinkable' did in fact happen. People in the organization assumed things were going along satisfactorily, perhaps preferring not to know that they were not. It became clear that failures like 'not paying attention' and 'the culture of obedience' are important, and these should be easy for management to overcome. However, even if the organization does have a function specifically tasked with looking at the competitive environment, and takes steps to dispel the tendency towards a culture of obedience, it is easy to miss paradigm changes and shocks. The ability to put a lot of effort into planning and still 'not ask the right questions' is a key to many failures. And the same reasons might prevent organizations from foreseeing huge successes. For instance, Virgin moved ahead in a way the financial services and airline worlds were unable to forecast, because of assumptions about a pop music label boss who wore a jumper to work. Surely nobody would buy financial services from him? Or trust him to run an airline?

Scenario planning can help these thought processes by emphasizing that a range of possible futures should be considered. In this non-judgemental environment, even unlikely but possible paradigm changes which affect the business, such as the oil price shock of the 1970s, can be considered analytically. In most scenario planning methodologies, wild cards can be used: elements which are independent of particular scenarios, and could occur under any scenario10 (a simple sketch of this cross-checking idea appears at the end of this section). In trying to anticipate shocks and paradigm changes, individuals try to do two things: improve their confidence in the current 'known' domain, and extend the range of events which they can include and prompt themselves to think about. This is a
search for the 'truth', but we can never forecast perfectly, as the future is forever unknowable. Scenarios, however, help us to explore the ways it might turn out. If we can improve our ability to anticipate shocks and paradigm changes, we can extend the area where our forecasts are likely to be more reliable. Our annual budgets will be better founded and more stable, too.
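As a purely mechanical illustration of the wild-card idea mentioned above (this sketch is ours, not from the article, and the scenario and wild-card names are invented), the cross-checking amounts to pairing every scenario with every wild card and asking of each combination whether current plans would survive:

    from itertools import product

    # Minimal sketch, not from the article: wild cards are shocks independent
    # of any one scenario, so each scenario/wild-card pairing is a separate
    # stress test. All names below are invented for illustration.
    scenarios = ["steady growth", "regional recession", "rapid deregulation"]
    wild_cards = ["sudden tax-law change", "major fraud at a counterparty",
                  "currency crisis"]

    for scenario, shock in product(scenarios, wild_cards):
        # In a real exercise each combination would be assessed qualitatively;
        # here we simply enumerate the questions a planning team would ask.
        print(f"Under '{scenario}', if '{shock}' occurs: "
              f"does this year's budget survive?")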
Learning From Previous Forecasts

Science Fiction
Long before scenario planning was heard of, H. G. Wells was visualizing different futures based on scientific progress. As Martin Edwards points out,11 the predictions Wells made in the 1890s can tell us a great deal about predictions of the future and the environment in which they are made. Wells was a commercially successful author who tailored his publications to the preoccupations of his readers in late Victorian England. His predictions were consciously based on the 'rule of three' technique: gathering the trends and inventions of contemporary society, as he saw them, and extending their development into the future. It is hardly surprising, therefore, that some of his predictions carry an uncanny accuracy in retrospect; Wells lived in a period of rapid innovation, and the technology in his novels (telephones, cars, aeroplanes) had either already been invented or was being discussed. Other predictions, however, reveal just how contemporary Wells's vision was. His concerns with class, with the conflict between capital and labour, and with the merits or dangers of government by an enlightened, rational elite place him firmly in the mindset of late Victorian social commentators and reformers. Wells's predictions are fundamentally a way of accessing, understanding and interpreting his own peculiar present. The genius, and the limitation, of Wells was to grasp the innovations of his time and to realize that they would occupy part of the centre-stage of the future. But while his time machine remained only a device of fiction, he was unable to see the whole.

Many other science fiction writers have explored how new scientific and technological knowledge might affect our lives. In general, it seems that technology forecasting has a better record than forecasting human behaviour, which remains various and often quite unpredictable. There has also been a tendency to overestimate the capacity of governments to do things, or even to see the need to do certain things. What has often been underestimated is the capacity of people, acting as individuals or in small loose groups, to do things for themselves, relying on their own common sense.
Forecasts Aimed at Public Attitudes
Similar assumptions were visible in Herman Kahn's The Year 2000: a framework for speculation on the next thirty-three years.12 Among the 100 things he expected to see by the year 2000 were:

- underwater cities;
- the use of the moon to replace street lights;
- the possibility of personal pagers; and
- computers in business.
While he overestimated the potential of governments to implement big projects, he underestimated the paradigm change arising from semiconductor technology. He was also part of the movement trying to get the public to think about the unthinkable. While much of his focus was on the effects of nuclear warfare, he was also part of the Club of Rome group. In 1972 the Club of Rome published its book The Limits to Growth.8 It forecast growth of population and industrialization, along with pollution and the depletion of resources, leading to collapse and catastrophe within a century. Exponential growth was the main focus of attention, and probably the biggest error in the analysis (the short numerical sketch below shows how purely exponential extrapolation goes astray). At that time, ideas of an ever-increasing population and the exhaustion of resources were popular to the point of being unchallengeable. The forecast did not foresee the exploration of deep sea beds finding more oil, or that efficiency in fuel use would reduce the rate at which fuels were consumed, or that the use of steel would decrease, reducing pollution. Nor did it foresee smaller families, with the increased education of women leading to zero population growth in many countries. The authors proposed an international grouping to deal with the problems as they saw them. It is instructive to see how deeply embedded in the period of government big science the forecasters were, and how difficult it is for forecasters to reject orthodoxy.

At any time there is an orthodoxy, a dominant logic, which controls our perceptions of reality. It is promoted by different forecasters who have studied the same body of information with much the same set of themes and values in mind. The orthodoxy seems obvious, and becomes self-reinforcing. Research grants are available for carrying the orthodoxy further, but not usually for overturning it.
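To see why exponential extrapolation was such a dangerous foundation, consider this illustrative sketch (our own, with invented parameters; it is not a reconstruction of the Club of Rome's model). A process that actually saturates, modelled here as a logistic curve, looks convincingly exponential in its early decades, so an exponential fitted to that early data later overshoots the saturating reality by orders of magnitude.

    import math

    def logistic(t, capacity=1000.0, rate=0.1, midpoint=50.0):
        """A process that saturates at a fixed capacity (illustrative numbers)."""
        return capacity / (1.0 + math.exp(-rate * (t - midpoint)))

    def exponential_extrapolation(t, t0=0.0, t1=20.0):
        """Fit a constant growth rate to two early observations and project it."""
        y0, y1 = logistic(t0), logistic(t1)
        growth = math.log(y1 / y0) / (t1 - t0)  # implied constant growth rate
        return y0 * math.exp(growth * (t - t0))

    for t in (20, 50, 80, 110):
        print(f"t={t:3d}  saturating process={logistic(t):7.1f}  "
              f"exponential projection={exponential_extrapolation(t):11.1f}")

At t = 20 the two curves agree almost exactly; by t = 110 the exponential projection is several hundred times the level the saturating process actually reaches.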
Forecasts of Technological Change
Steven Schnaars has studied an orthodoxy that he calls 'the myth of rapid technological change'. He notes13 that in the 1960s, for example, tremendous change was forecast for the way transportation would develop, including commercial passenger rockets, VTOL and supersonic planes, automatic vehicle separation on new 'smart highways', and the use of nuclear power in all forms of transport. The people who made these forecasts now seem to have been enthusiasts enamoured of technological wonder. They went wrong because they fell in love with exotic technologies simply because they were exotic. It was easy for them to believe what they wanted to believe. They also failed to pay attention to the less romantic matters of commercial fundamentals: many of the ideas were simply too expensive to be practical. They made some incorrect assumptions about human behaviour as well. Consumers might have agreed that they wanted better mass transit systems, but few were happy at the idea of sitting behind a nuclear engine in a computer-controlled bus. On the whole, consumers did not respond enthusiastically to CB radio, or to quadraphonic sound systems. Not all technology is wanted merely because it exists.

Some ideas have come to fruition, though not very quickly. The fax machine is an example of timing and complexity weakening an essentially correct projection. Quick uptake was predicted, but initially the machines were too expensive and took too long to transmit a document. Eventually, 20 years behind plan, the fax achieved a mass market through improvements in price and performance. It is easy to see now that the microwave oven was always a good idea, but it achieved success 25 years later than expected. Only with changes in lifestyle (women working) and the improvement of ready and frozen meals to gourmet status did the ovens prove to fulfil a useful role.

The technologists worked within the strongly held assumptions of their times. For example, in the 1960s, when the theme of space travel was popular, many different forecasters predicted manned bases on the moon. That theme died away; in the 1970s the energy crisis became the dominant theme, and one assumption was that nuclear energy must certainly be the solution. Today's orthodoxy is that nuclear power is dangerous and unmanageable.
Mechanisms for Creating Orthodoxy
At any time, the current orthodoxy has two potential sources. One is the belief that the way we do things is obvious and right, and that extrapolation of our present behaviour and knowledge will give us a valid picture of what is going to happen. The second is a kind of collective wishful thinking, which accumulates in layers. For instance, we want to believe in an attractive idea, so a momentum develops: we find reasons for believing, and do not look for reasons to doubt, especially if the end is a noble one, like saving whales and rainforests, or powering cities with wind energy.

The media reinforce current orthodoxies. Having discovered authorities in any field, the media return repeatedly to the same people, reinforcing their standing, and hence their opinions, by their frequent appearances. The media also write and speak in a set of clichés and stereotypes, which contribute to a climate of thinking in terms of right/wrong and good/bad, without discriminating between circumstances. The media reinforce the current orthodoxy until boredom, and the possibility of a dramatic overturning of the orthodoxy in favour of a new one, starts a new cycle. Even the most solid organizations, which carefully question their own policies, are not immune to this outside influence. We need to remind ourselves to be active in challenging the received wisdom that comes through the media. We tend to let information about the world come to us, when we should go out actively to see whether there are other facts and opinions; we need to develop our own diverse sources. When did the orthodoxy on the Internet change from 'a hacker's tool' to 'an essential aid to commerce', and did we forecast its development?
Four Systematics

We suggest that four sources of error in forecasting emerge from these analyses, and that checking for sensitivity to these four may help organizations to improve the quality of the assumptions they make about the future.
The Individual is Unboxed
The first is that planners' assumptions about the behaviour of people, which may have been accurate in previous decades, are certainly not right in the current world. The basic framework of a hierarchy of needs, starting with our basic needs for food, clothing and shelter and moving on to needs for self-expression and self-actualization, should warn us that people widen the range of choices they make once food and shelter needs are met. Since today most people are not prompted by memories of hunger or cold, people's behaviour becomes increasingly difficult to forecast. The common reason for the failure of a number of forecasts, particularly the technology-driven ones, is that people were more sensible and capable of adapting than the forecasters or planners expected. This can cause paradigm shifts and shocks to occur overnight, not just in changes of attitude to the wearing of baseball caps, but in very major ways such as the fall of the Berlin Wall.
Government Cannot Do It
The second source is the major political and military paradigm shift caused by the comparative retreat of governments. Many Western governments are trying to withdraw from the approach they took in the post-war period. This is partly because their ability to control their environment is decreasing, as finance moves around the globe more easily, large movements of guest workers and immigrants continue, and technology makes the international transfer of ideas faster and more copious. At the same time, the public's demand for government services constantly increases rather than diminishes. While privatization satisfies some expectations by replacing the government in supplying services, demographic and employment pressures reduce governments' ability to fulfil their post-war role.

In the bipolar world of the Cold War, the effort by the United States to stay ahead in technology meant that government funding of development was large and assured. This resulted in a stream of spin-offs for civilian and commercial exploitation. Now that the Soviet threat has disappeared, funds for research and development have been reduced, and the main drivers of technological change must come from private enterprise. Will the sources and types of technological advance therefore be harder to forecast? The effect of this paradigm shift is very deep-seated; many forecasts still assume that the role of government will be significant.
Technology Will Be Used If It is Useful
The third source of common error is in the timescales of the adoption of technological innovation. Often the nature of a development is forecast correctly, but the timing is over-optimistic. A good idea attracts enthusiasts, who assume that consumers will be equally keen. Forecasting the timing of crucial developments requires an understanding of the other components needed to form a total system: for example, computer hardware needed a popular standard operating system before mass PC use could take off. An important lesson is that a forecast which does not materialize in the expected timescale might not be wrong in its essentials, only in its timescale, so it should not be discarded too quickly. The other components may come from totally different fields, as in the case of the microwave oven discussed earlier. The question we kept asking ourselves was: who would want one of these, and what would they use it for? It provides a useful counterpoint at a time of hype (the diffusion sketch below illustrates how sensitive adoption timing is to such factors).
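The sensitivity of adoption timing can be illustrated with the Bass diffusion model, a standard way of describing technology take-up (the sketch is ours, not from the article, and the coefficient values are invented). Adoption is driven partly by a small 'innovation' coefficient p and mostly by a larger 'imitation' coefficient q; quite modest changes in these assumptions shift the time to reach half the market by years, which is one reason forecasts like those for the fax machine and the microwave oven were right in essence but badly out in timing.

    def years_to_half_market(p, q, dt=0.01):
        """Integrate the Bass diffusion model (market normalized to 1.0)
        until half the market has adopted; returns the elapsed years."""
        adopted, t = 0.0, 0.0
        while adopted < 0.5:
            # Bass model: adoption rate = (p + q * fraction adopted) * remainder
            rate = (p + q * adopted) * (1.0 - adopted)
            adopted += rate * dt
            t += dt
        return t

    for p, q in [(0.03, 0.40), (0.01, 0.40), (0.01, 0.20)]:
        print(f"p={p:.2f}, q={q:.2f}: half the market adopts "
              f"after {years_to_half_market(p, q):5.1f} years")

With the values above, the half-market point moves from about six years out to about fifteen: the same forecast, an entirely different decade.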
Progress
A fourth source of paradigm shift is change in public attitudes. For centuries, up to the turn of this century, Western intellectual thought embraced the idea of continual progress towards greater scientific certainty and a more perfect state of being. Ultimately, everything would be explained, and all problems would have solutions. The experience of the 20th century has disillusioned many, and preoccupations with issues such as pollution, the nuclear threat and ethnic conflict have challenged our assumptions about the nature of progress. Now we do not think that things will necessarily get better; we think we might do well if we can merely sustain things. This loss of optimism is more marked, perhaps, in Europe than in the United States.
Imperfect Forecasts

Organizations also need to make better use of the forecasts they make already. A forecast should be seen as an experiment. It is too often seen as a discrete unit with an end point, when in fact it is only a step along the road, leading to the next forecast. It is a snapshot of the future as taken now, and there might well be more in a picture taken tomorrow. Organizations can learn more from a forecast, as from any experiment, if they keep reviewing it. Each time, we are trying something out to see how it looks. Even 'failures' can be useful if we use them for learning.

For instance, Figure 6 represents a forecast, made in 1980, of the use of IT in the home. It is 90% accurate: most of the functions are those found in today's multimedia PCs. But it raises a laugh because it looks so old-fashioned, like an aircraft console; who would want it in their living space? In 1980 the effects of the semiconductor revolution were beginning to be visible, in terms of the reduced size, power consumption and price of computing. A review of this forecast even two years later would have enabled the useful 90% to be extracted from the erroneous assumptions about the hardware used as the delivery vehicle.
How Can Organizations Improve Their Forecasting?

In the sections above we have summarized the main sources of error we have found in forecasts. These are important to individuals both at work and at home. In a changing world, individuals find that they are responsible for planning their own futures, because organizations will not, and governments say they cannot. If individuals can increase their forecasting skills for personal reasons, they can contribute more at work to corporate forecasting. This should be welcomed. We started with the assumption that organizations need to be able to take a view of the future in order to gain competitive advantage.
FIGURE 6. Computing in 2000? 'The consumersole, an information console that could be in use in the home by the end of the century.' Source: Institute for Scientific Information, 1980.14 (Labelled features include a videophone plasma screen, TV camera, writing tablet digitizer, keyboard, microphone/speaker, voice and data input controls, and paper in/out.)

But although most organizations like to claim that they are open to new thoughts, in practice they behave as though they value orthodoxy among staff. Some people develop unorthodox views anyway, and they should be made to feel that these views are valued. But we cannot rely on people being brave; we need actively to encourage people to think outside the box. Figure 7 shows the trend in organizations towards asking for the involvement of staff members.
FIGURE 7. Who should question assumptions? (The figure plots input to organisation strategy against personal identity in a corporate role, with a growing trend from 'yesterday', low on both axes, towards 'tomorrow', high on both.)
But in many organizations most people are busy achieving their tasks, and only a few people at the centre have the authority and time to question assumptions. And although a greater variety of views can help, greater quantity can slow things down, and too many inputs at corporate level can be destabilizing. We need to manage a change that encourages more people to participate in a helpful way.

What we want, of course, in both the personal and corporate worlds, is intelligent and correct forecasting. Getting it right means realizing that in both worlds we carry our sets of assumptions around with us, and we have seen that clinging to current orthodox assumptions is a major source of forecasting error. However, recognizing the impact of current orthodoxy is easier to recommend than to do, and we need more tools to help us do it. Scenario planning is well known to be valuable in encouraging people to think outside orthodoxies, and so are various forms of modelling and simulation. But perhaps the toolbox needs restocking, and new methodologies should be developed to lever the benefits from what is essentially a less well defined, but equally relevant, branch of knowledge management. And finally, organizations need to make better use of the forecasts they already have.
Conclusion

Organizations are always operating on the basis of forecasts, whether the forecasts are carefully prepared and assessed or whether they consist of a set of common and unquestioned assumptions. Most forecasts get things wrong, and all assumptions steadily go out of date. This review has highlighted the prevalence of four major sources of error:

- the increasing difficulty of making assumptions about individual behaviour;
- the changing role of government (no longer necessarily the major driver of change);
- the problem of timescales, which are often over-optimistic; and
- the loss of faith in our ability to produce continual progress.

This puts a high value on our ability to question and to adapt. In human terms, who is empowered and responsible for asking the questions about our assumptions? How should they do it? And how can we ensure that the right people will take notice? In trying to find answers to these questions, we must remember that, of all the elements we might consider in making a forecast, making correct predictions about future human behaviour is the most difficult.

References

1. M. Porter, Competitive Advantage, Free Press, New York (1985).
2. DTI/Office of Science and Technology, Taking Foresight to the Millennium, URN96/1123 (1996).
3. Royal Institute of International Affairs, Open Horizons, 10 St James's Square, London SW1Y 4LE.
4. G. Ringland, Applying scenarios to defining an R and D portfolio, in IIR Strategic Planning Conference Proceedings, October (1998).
5. E. A. Cohen and J. Gooch, Military Misfortunes: the Anatomy of Failure in War, Free Press/Macmillan, New York (1990).
6. A. De Geus, The Living Company, Harvard Business School Press (1997).
7. P. Drucker, Managing in Turbulent Times, Harper & Row, New York (1980).
8. D. H. Meadows and D. Meadows, The Limits to Growth, The Club of Rome, Signet (1972).
9. D. Michael, Planning to Learn and Learning to Plan, Jossey-Bass, San Francisco (1973).
10. G. Ringland, Scenario Planning, Wiley, New York (1997).
11. M. Edwards, The Last Millennium Bug: H. G. Wells and Forecasting the Future in the 1890s, in London in 2020, Gresham College, London (1999).
12. H. Kahn and A. J. Wiener, The Year 2000: a framework for speculation on the next thirty-three years, Macmillan, New York (1967).
13. S. P. Schnaars, Megamistakes: Forecasting and the Myth of Rapid Technological Change, Macmillan, New York (1989).
14. A. E. Cawkell, Computing in 2000, in T. Forester (ed.), The Microelectronics Revolution, Blackwell, Oxford (1980).

About the Authors

Gill Ringland is a Group Executive with ICL and Chairman of the Futures Council of the Conference Board. ICL, Observatory House, Windsor Road, Slough SL1 2EY. e-mail: [email protected]

Martin Edwards is a Commercial Manager at ICL. 40 The Strand, London WC2 N5H. e-mail: [email protected]

Les Hammond is Head of Strategic Planning and Development at Halifax plc. Trinity Road, Halifax HX1 2RG. e-mail: [email protected]

Barbara Heinzen is an independent consultant and a GBN Network member. 13 Gray's Inn Road, Gray's Inn, London WC1 5JP. e-mail: [email protected]

Anthony Rendell was formerly Controller, Strategy and Corporate Affairs, at the BBC World Service. 9 Bark Place, Bayswater, London W2 4AR.

Dr Oliver Sparrow is Director of the Chatham House Forum. Chatham House, 10 St James's Square, London SW1Y 4LE. e-mail: [email protected]

Liz White is Strategic Planning Manager at Halifax plc. Trinity Road, Halifax HX1 2RG. e-mail: [email protected]