Long Range Planning, Vol. 15, No. 4, pp. 12 to 21, 1982

Corporate Planning for an Uncertain Future

P. W. Beck, Planning Director, Shell U.K. Limited

The increasing complexity of socio-economic systems has increased the need for planning while at the same time making planning more difficult. Planning in a changing environment for an uncertain future presents the planner with a dual challenge. He must not only identify the forces behind changing circumstances but must also help wean decision-makers from their dependence on single-line forecasts.

The author is Planning Director, Shell U.K. Ltd., Shell-Mex House, Strand, London WC2R 0DX.

The Planning Dilemma

'Surely planning can't be worthwhile in these uncertain times' is the usual reaction to hearing that one is a corporate planner. People talk about increasing uncertainty, but on what basis? Life is no more or less uncertain than it has ever been. The future was, is and always will be uncertain. What has changed is the general consciousness of uncertainty brought about by feelings of insecurity during a period of major social transition and disruption. There have been many such periods in the past. Perhaps a new problem is that there are now very large systems to contend with. There has been enormous growth, particularly over the last 20-30 years, in the size and complexity of the systems we are dealing with. Two brief examples:

The first is the birth of the Model T Ford. Henry Ford decided in January to introduce a new model of car. He proceeded to design and build it, and the car was launched in November or December of the same year. Compare this with the lead times of the new Ford Escort, which took about 6 years of planning. Henry Ford, of course, had the whole system under his personal control. It was a small, closed system which it was possible to control in this way. The management and control systems in the motor industry today are rather different.
When Bismarck was Chancellor of Germany, it was said that he personally knew most of those employed in the German civil service. This made it possible for him to develop a deep understanding of the system, the capabilities involved, and the problems the system was dealing with. He was therefore able to control the system, get the information he wanted from it, and could take decisions based on integrated knowledge. What percentage of the civil service does Chancellor Schmidt know personally - or, for that matter, those in the same position in other developed countries? To what extent do they understand the internal mechanisms of the systems they are dealing with? And to what extent do they consequently understand the basis of the information they extract from the systems?

If we go further back in history, information and control systems were even simpler than they were at the beginning of this century. When Nelson sailed from England there was no means whereby the Admiralty could get messages to him before his return, short of sending other ships after him. So Nelson was not subject to weekly, daily, or indeed hourly messages from the Admiralty on what he should do.

In the past, with lead times for the majority of industrial activities being relatively short (a year or less), people could generally see the effects of their actions and could learn from their mistakes more rapidly than, for example, the designer of a motor car today. Also, with simpler systems and less specialization, it was generally easier for the individual to see the fruits of his personal efforts in the end product, and to be held accountable for them. In these earlier systems, the lines of communication were considerably shorter than those of today, and carried much less information.

One of the major problems of today's systems is that the available information can be so prolific that it simply becomes 'noise'. At this point it ceases to be informative and results in confusion, which in turn results in indecision. So it is complexity which has grown, resulting in confusion and indecision, rather than the uncertainty of the future (though this complexity, and the longer lead times it has given rise to, have inescapably increased uncertainty about the outcome of decisions. Since one has to plan further ahead, there is a greater chance that by the time a particular project reaches fruition, the environment may be different from that in which the decision was made to proceed with this project.)

The 1950s and 1960s were marked by a growing belief that the future could be made more certain through increasingly sophisticated systems of analysis, data gathering and processing, leading to greater control. The events of the 1970s have undermined this belief, and thus the future appears to be comparatively more uncertain against this former certainty that the future could be predicted and controlled. But this is simply a case of belief versus reality, the reality being the fact that the future remains as uncertain as it has always been. Meanwhile, the increasing complexity of our socio-economic systems has increased the need for effective planning. But the increasing complexity of these systems has made planning more difficult. That is the dilemma facing modern planners.

The Development of Modern Planning Systems
In the 1950s business planning was relatively simple. There were obvious needs and shortages requiring obvious remedies. Goods and services were in such demand that the planning of facilities to meet the situation virtually arranged itself in a natural order of priorities. All that was required for profitable operation was a source of supply, technological expertise and the ability to market. By the 1960s this had changed; the pressure of competition brought with it the need for more considered choice of facilities and strategies so as to survive in such an environment. This was also the time when the 'information explosion' was beginning and when the computer was having an increasing impact.
People began to see the possibility of using the computer to distil the mounting flow of information into those bits that really mattered, thus
eliminating 'noise' and reducing confusion. This led to the objective of producing forecasts on a range of factors - population growth, economic growth, exchange rates etc. - and coordinating them to produce a coherent view of the future, thus making decisions easier and more effective.

What happened, in fact, was that the increasing dependence on the computer had a number of unfortunate consequences in relation to the role and effectiveness of the planning functions. These included:

+ More and more of a planner's time became taken up with the task of feeding the computer with information. In effect, the emphasis in planning shifted from conceptual thinking to data generation, with the planner's role increasingly confined to the administrative aspects of collecting and disseminating information.

+ The heavy emphasis on sophisticated mathematical methods in planning tended to concentrate attention on factors that could be readily quantified, at the expense of factors that were not so easy to quantify. Yet it was exactly those less quantifiable factors - relating to broader socio-political developments - that were becoming increasingly important.

+ As the system demanded increasingly large amounts of data, the heart of the decision-making process inadvertently began to reside at the more junior, less experienced levels within the individual organization, since these were the levels at which data were acquired and collated.

+ In conjunction with this development was an increasing tendency to overlook the fact that the results produced by the computer are only as good as the information that is fed into it. Provision of information about the future will always be more of an art than a science, and some artists will always be better than others. The quality of the information in the computer can thus vary considerably, but the people providing the information are rendered invisible by the computer, with the result that the element of human fallibility behind its answers is equally obscured. The computer thus produces a false impression of objectivity and/or legitimacy in the answers it produces. The computer world calls this the 'GIGO' effect - 'Garbage In, Gospel Out'.

These developments were starkly exposed when the 1970s arrived with various economic crises that exposed the nonsense of many previous forecasts and demanded a fundamental reappraisal of existing planning systems. In the course of this reappraisal, the whole basis of modern planning - with its use and production of forecasts - became increasingly suspect.

It became clear that the weakness in the forecasting approach was not simply the result of a collapse of growth rates and other trend lines brought about by the onset of turbulence in the 1970s, though this certainly magnified the weakness and increased the margins of error. Even when the direction of all the important trend lines had remained relatively consistent during the boom period of the 1960s, forecasts had still been unreliable, and planning based on forecasts had led to some disastrous miscalculations - such as that which resulted in the chronic overcapacity in the chemicals industry in the 1960s. In general, however, the reality of the situation tended to be camouflaged by the fact that the rising tide of economic growth and consumer demand helped to justify most expansion plans.
A Look at Past Forecasts
When looking at forecasts made in the late 1960s and early 1970s one can find many failures, but few successes. Indeed, one may be shocked at the extent to which the most important forecasts and their surrounding assumptions had turned out to be wrong. To mention a few:

+ Oil prices: Prior to 1973, there had been great argument about whether the price would ever go above $2 a barrel.

+ Inflation: There were many learned articles which concluded that the world could not live with an inflation rate of 5 per cent for more than a few years; therefore inflation was going to disappear.

+ Exchange rates: Looking at past fluctuations and forecasts, one could conclude that the best way to forecast the exchange rate is to assume that it remains at its current level - thus producing a median line which perhaps stands a better chance than many forecasts of being somewhere between the peaks and troughs of future fluctuations.

+ Nuclear power: In the 1950s and 1960s, people thought nuclear power would really take off in a big way. Here we are in the 1980s with its contribution standing at a mere 3 per cent of the total energy picture in the world outside communist areas.

From this historical perspective, one begins to wonder if any forecasts are ever right. And this leads to a fundamental question: if such important forecasts are so unreliable, what is the purpose of using them for decisions? How can the planning process produce a right answer if all the forecasts fed into it are wrong?

The response by Shell planners to these questions - spurred on in the early 1970s by increasing impatience among top management with the wrong answers the planners were providing - was first to look at the nature of the forecasts they had been dealing with.

Among other findings, this analysis revealed a remarkable degree of convergence among different groups of forecasters who were assumed to be working independently. It is common practice in both corporate and government planning to combine and coordinate a number of 'uncorrelated estimates' - forecasts produced by experts working independently - and then check them against each other and distil them down to their common denominators. In theory, this would provide more 'balanced' - and hopefully more reliable - forecasts. However, in our examination of the history of forecasts, it became evident that there were, in reality, relatively few genuine examples of 'uncorrelated estimates' - few purely independent forecasts - and the reason for this is that forecasters tend to borrow figures, assumptions and theories from other groups or other common sources. Thus if certain assumptions or component elements are at fault, combinations of these supposedly independent forecasts are simply compounding or reinforcing the error, rather than serving as a corrective counterbalance.

The often very high degree of convergence which results from this 'common pool' of data has tended (understandably) to be seen as an affirmation of the validity of these 'independent' forecasts, rather than as a cause for suspicion. The logic behind this is: if three different groups come up with a similar forecast, it must be right. There has also been a tendency to overlook the fact that if different theories are used to provide different forecasts - a common occurrence in the field of macro-economics - these forecasts cannot be correlated.

Many examples of artificial convergence can be found in past estimates of future product demand. For example, one finds that in the 1960s, one company would estimate world requirements for one particular plastic to reach about 1.95 million tonnes by 1980, while another company would put the estimate at, say, 2.4 million tonnes. Considering the vast areas of uncertainty surrounding the question of future demand for this particular plastic, it seems extraordinary that two 'independent' estimates could be so close. Then one remembers that every year there was an international meeting of forecasters at which somebody would give a lecture on that particular plastic - thus producing common assumptions around which the different forecasts were developed.
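The point about spurious convergence can be made concrete with a small simulation. The sketch below is illustrative only (the figures are invented, and this is not a method described in the article): it compares genuinely independent forecast errors with forecasts that share a borrowed common assumption, and shows that averaging the latter does little to cancel the shared error.

```python
import random

random.seed(42)
TRUE_DEMAND = 2.0          # hypothetical 'true' 1980 demand, million tonnes
N_FORECASTERS = 5
N_TRIALS = 10_000

def mean_abs_error(shared_weight):
    """Average |error| of the combined forecast when each forecaster's
    error mixes a shared component (borrowed assumptions) with an
    independent component."""
    total = 0.0
    for _ in range(N_TRIALS):
        shared = random.gauss(0, 0.4)          # error everyone inherits
        forecasts = [
            TRUE_DEMAND
            + shared_weight * shared
            + (1 - shared_weight) * random.gauss(0, 0.4)
            for _ in range(N_FORECASTERS)
        ]
        combined = sum(forecasts) / N_FORECASTERS
        total += abs(combined - TRUE_DEMAND)
    return total / N_TRIALS

# Independent errors partly cancel when averaged; a shared 'common pool'
# error is simply reproduced in every forecast and survives the average.
print("independent forecasters :", round(mean_abs_error(0.0), 3))
print("shared-assumption pool  :", round(mean_abs_error(0.9), 3))
```

The close agreement of the pooled forecasters is not evidence of accuracy; it is evidence that they are drawing on the same sources.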
Types of Forecast
The Shell analysis identified three different types of forecast which, used properly, serve as a vital strategic tool - but which, if just taken at face value,
can be dangerously misleading. These are the 'self-defeating', 'self-fulfilling' and 'predetermined' forecasts.

The 'self-defeating' forecast is related to the concept that if everybody believes something might happen, they tend to change their behaviour, and this often 'defeats' the forecast. For example, the forecast of a future energy shortage, if believed, leads to action on new energy development, energy savings, etc., which could well result in there being no energy shortage. On the other hand, if such a forecast is not believed, then it might become true, because people's actions to defeat it would not take place. It appears that many medium and long term projections of supply/demand and economic developments fall into this category.

'Self-fulfilling' forecasts are most readily illustrated in relation to short-term movements in share prices and exchange rates. For example, if an influential analyst forecasts the movement of a share price, people start acting on the belief in such a forecast and their action makes it come true. Similarly, if the Bank of England announces that the exchange rate of the pound is likely to drop, the forecast can become self-fulfilling through people acting on a belief in this announcement and trying to get rid of sterling. Momentum is often a significant factor in the self-fulfilling forecast. The movement of a share price or exchange rate, one way or the other, can gather momentum to an extent that it rises or falls to levels which cannot be rationally sustained. In this instance it becomes a spiral of self-justifying expectations - until it reaches 'catastrophe point', when the process suddenly reverses itself.

'Predetermined' or 'in-the-pipeline' forecasts are more straightforward and generally more useful as a basis for project planning - though not without some potential pitfalls. These are based on an analysis of plans and projects that are already off the drawing board. An example is nuclear capacity by the year 1985. Any developments which could add to nuclear capacity will already be 'in the pipeline', since the lead time between a decision to build a nuclear power plant and the date of its commissioning is in this example longer than the time frame under consideration. On the basis of developments already 'in the pipeline', one can therefore forecast the maximum nuclear capacity for the U.K. by 1985. The actual operational capacity may turn out to be considerably lower as a result of construction delays, unforeseen plant closures, maintenance problems etc. Therefore, even the 'predetermined' elements of the future need to be approached with some caution.
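A minimal sketch of the 'in-the-pipeline' logic follows. The plants, capacities and lead times are invented for illustration (they are not actual U.K. data): if the build lead time exceeds the planning horizon, the maximum capacity at the horizon is bounded by what is already operating or under construction.

```python
from dataclasses import dataclass

@dataclass
class Plant:
    name: str
    capacity_gw: float
    start_year: int        # year construction began (or plant came on stream)
    lead_time_years: int   # years from start of construction to commissioning

# Hypothetical pipeline - figures invented for illustration only.
pipeline = [
    Plant("Station A (operating)",           1.2, 1970, 0),
    Plant("Station B (under construction)",  1.3, 1977, 8),
    Plant("Station C (under construction)",  1.1, 1981, 8),
]

def max_capacity_by(year):
    """Upper bound on capacity at `year`: only plants whose construction
    has already started can possibly be commissioned in time."""
    return sum(p.capacity_gw for p in pipeline
               if p.start_year + p.lead_time_years <= year)

print("Maximum capacity by 1985:", max_capacity_by(1985), "GW")
# Actual capacity may be lower still: delays, closures and maintenance
# can only subtract from this 'predetermined' maximum.
```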
There is often a lack of appreciation of the nature and extent of the time lags imposed on particular developments. For example, if a revolutionary new form of personal transportation were designed today which rendered the existing motor car potentially obsolete, it would be at least 10 years before it even began to have a significant impact. Plants and facilities for its production would have to be designed and built; the new form would have to become politically and socially acceptable; and the existing car population has a life expectancy that also has to be taken into account. An awareness of the time lags involved in putting new energy schemes together is of vital importance in any approach to energy problems, and it is obvious to those in the oil industry that the general awareness of this factor is very low.
The Danger of Forecasting

Decisions have traditionally been based on 'visceral judgment' - or 'gut feel', to use a more colloquial expression - and whatever assistance there may be from increasingly sophisticated analytical and information-gathering systems and techniques, it remains the basis of entrepreneurial decision-making. The entrepreneur gets signals from various directions, develops a pattern and uses that for judgment. When the signals become too many and too confused, the pattern is lost and the decision-maker no longer has any solid ground on which to base his decisions. Looking for ways to reduce this confusion, he calls for forecasts from experts in the hope that this will simplify the problem so that he can again use his intuitive reasoning.

Unfortunately, single line forecasts provided by such experts - usually well away from the area in which the decision-maker is operating - tend to provide the decision-maker with the equivalent of a straight-line route through a minefield. Of course, the forecaster himself, having studied the subject, may decide that he cannot in all honesty provide a single line. But he may nevertheless be bludgeoned into doing so with the argument that the decision-maker requires a 'best estimate' to take his decision - and if someone has to provide a single line, then the forecaster, knowing the subject, is the best person to do so. If he continues to vacillate, he may be bludgeoned further with the accusation that he is 'chickening out'. So he puts some forecasts together and distils these into a single set of figures - but hedges these with various 'ifs' and
'buts'. Thus although he has in effect provided a single line, it is nevertheless highly dressed in qualifications which indicate that it is not an absolute prediction, and that other futures are possible.

But these qualifications may become dissipated - or even disappear altogether - as the set of figures is passed through an organization and perhaps beyond. The wider the circulation, the greater is the danger of the qualifications disappearing, with the result that the set of figures is ultimately accepted at face value, acquiring a false legitimacy. So the qualified, multi-faceted forecast is transformed into an absolute, single-line forecast.

These figures may then be used, for example, as a basis for determining market share or demand, and lead to the setting up of additional plant capacity which may not actually be needed. Hence, the acceptance of these figures at face value can lead directly to a decision which the forecaster who produced the figures may not himself have advocated in view of his acceptance of the uncertain nature of the data. But in producing the figures he has inadvertently produced the decision.

To give a more detailed example, let us say there is a project under consideration which is very dependent on exchange rates. The individual responsible for looking at exchange rate developments - who in many organizations may be a relatively junior member of the planning staff, for example an economics graduate 1 or 2 years out of university - does a lot of theoretical work and comes up with a figure of $1.94 to the £ by the year 1985. He may surround this with a lot of qualifications with regard to variable factors such as the oil price and the rate of world economic growth. But by the time this figure gets to the chairman of the organization - or, in the case of government organizations, to the responsible Minister or top Civil Servant - the qualifications may have got lost. This may be the result of 'streamlining' by intermediaries who are acutely aware that the man at the top is very busy and is therefore likely to be extremely impatient with anything approximating vacillation or long-windedness. The result is that the chief executive is presented with this single figure, which makes the project look a marvellous proposition. So the go-ahead is given. Unfortunately, while the project may be a great proposition at an exchange rate of $1.94, it may be a total disaster at $2.30.

In this instance, who has really made the decision? It is, of course, the young economist who produced the original figure - though he may be shocked to discover this fact. The likelihood of such an eventuality is heightened by the dissipation of accountability that is a by-product of the increase in the complexity of modern organizational systems and methodology. This can manifest itself in the young economist in our hypothetical example regarding himself as a very small cog in a very large gear train, with the expectation that those higher up in the chain of command are bound to check his work.

One can see from this example how the heart of the decision-making process can devolve to a point of invisibility and become increasingly mechanistic.
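The exchange-rate example lends itself to a simple sensitivity check. The sketch below uses invented revenue and cost figures (the article gives only the two exchange rates) to show how the same project can look attractive at $1.94 to the £ and disastrous at $2.30.

```python
def sterling_profit(revenue_usd_m, cost_gbp_m, usd_per_gbp):
    """Convert dollar revenue into sterling at the assumed rate and
    subtract sterling costs (all figures in millions, per year)."""
    return revenue_usd_m / usd_per_gbp - cost_gbp_m

# Invented project economics: dollar-denominated sales, sterling costs.
REVENUE_USD_M = 50.0
COST_GBP_M = 24.0

for rate in (1.94, 2.30):
    profit = sterling_profit(REVENUE_USD_M, COST_GBP_M, rate)
    print(f"$ per £ = {rate:.2f}: profit = £{profit:,.1f}m")
# At 1.94 the project clears roughly £1.8m a year; at 2.30 it loses
# about £2.3m - the 'single figure' has quietly made the decision.
```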
The dangers of such a development are compounded by a tendency in both government and business circles to adopt centrally set forecasts for the sake of consistency-despite an awareness that the forecasts may not be right-and often to use them as the basis for important decisions.
What Role is Left for the Planner?
If the forecasting method of planning is fraught with such dangers, and the forecasts themselves are basically unreliable, what can the planner do about it?

A feature of many planning units today is the generation of rigid and complex long term planning cycles which may be good for the morale of the planners - whose achievements may be measured in terms of the amount of papers they produce - but which do little to aid decision-making since the decision-makers are unable to digest it all. The result then is that major decisions are taken outside the planning framework. When this happens, we are back to 'gut feel' judgment which, without adequate understanding of all the issues, may be based on reaction to short-term impulses (for example, a report in the day's issue of an influential newspaper).

There may be a temptation - which would be a logical extension of former developments - to put more and more effort into the administration, collection, processing and coordination of data, using a greater number of increasingly sophisticated systems, in an effort to produce more reliable forecasts and more trustworthy plans. But such an approach would simply make the system even more difficult to manage, and lead to further 'paralysis through analysis' (perhaps today's greatest threat to decision-making).

And if one accepts that the future is always uncertain, it follows that however sophisticated the systems of analysis, and however much data are fed into the computer, no single line forecast can ever be valid. So the basic objective of such an approach is unrealistic. The problem may therefore appear to be insoluble - until one asks the question: does planning really need forecasts?
What, after all, is the purpose of planning? Basically, it is to provide a framework within which decisions throughout an organization can be taken. While this implies an understanding of the forces that will mould the future, it does not necessarily imply a need for single-line forecasts. Indeed, the logic of this definition points to the conclusion that it is not the job of the planning unit to filter all the various bits and pieces of information to produce a streamlined, single-line perspective of the future. Rather, it is to provide a framework within which all the various factors and information can be more effectively and easily judged by the decision-maker.

Thus the fundamental role of the planner is to promote conceptual understanding, rather than provide numerical quantification - though numerical quantification may have an important role to play in the process. Without a sound conceptual framework, the numerical quantifications are virtually meaningless, and it becomes increasingly difficult to sort out which bits of information are relevant to the decision in question. It is common to hear people worrying about things which may have little bearing on the decision that has to be taken. For example, somebody will say, 'Before we decide whether to build, and whether to do so in Manchester or Glasgow, what is the GNP going to be next year?' GNP may be totally irrelevant to this decision (especially next year's) but people feel that they ought to talk about GNP - so they do. It is patently obvious in such situations that the deficiency in the decision-making process is not related to numerical quantification, nor to the need for a forecast. It is the lack of a sound conceptual framework.

In such situations, the social and political factors may be of far more importance than economic yardsticks such as GNP. But these factors - though intuitively accepted as the most important in our private lives - tend to be pushed aside in our business lives. Many businessmen tend to shy away from these areas, and develop a glazed look when somebody mentions 'social factors'.

So the job of the planner is not only to provide a framework which can focus attention on these factors and promote an understanding of their significance to particular areas of decision-making; he must also make the framework as accessible as possible. In other words, it is not only the content that is important; it is the presentation as well. The framework must be tailored in such a way that it can overcome any resistance to a consideration of its contents, as well as make the content itself digestible.

Since the number of possible futures is clearly infinite, one of the primary tasks of the planner is to
choose a series of possible futures - or 'scenarios' - which the decision-maker can deal with, and which are wide enough to encapsulate those key issues which are important to the particular areas of decision-making. If a decision-maker is thinking of building a tin tack factory somewhere in the north-west of England, scenarios looking at global energy developments may simply serve to confuse him. The scenarios must have something to do with tin tacks and their use, and help to identify the forces which could make the project a success or failure.
The Shell Approach to Planning
The Shell approach to planning has swung increasingly away from a mechanistic methodology and centrally set forecasts, towards a more conceptual, or ‘qualitative’ analysis of the forces and pressures impinging on the industry as a whole and on particular areas of decision-making within particular business sectors. It starts with an acceptance of uncertainty. If one accepts that uncertainty is a fact, and takes into account the likelihood of the unexpected, then one is obviously faced with an infinite number of possible futures, and quantification becomes impossible. What Shell planners try to do is to identify the key elements pertaining to a particular area of decisionmaking-the different competitive, political, economic, social and technical forces that are likely to have the greatest influence on the overall situation-and translate these into a framework for individual judgment. In a multinational organization, the higher level of management is likely to be most interested in ‘global scenarios’-looking at world-wide developments-while the focus becomes narrower as one proceeds into the more specialized functions, divisions and business sectors of individual companies. Some aspects of the global scenarios may nevertheless be pertinent to some of the more localized areas of decision-making-for example, developments in the Middle East will inevitably have an influence on the local energy situation. For each area of decision-making, different combinations of the relevant key factors are considered-thus producing a range of essentially different futures-and these are generally distilled into two or three broad scenarios which encapsulate, in archetypal form, the various technical and economic factors as well as the less quantifiable social and political factors. Each scenario has to be self-contained in the sense that the logic of the particular combination of factors in that scenario describes a feasible future. When this logic is subsequently subjected to
numerical quantification and set out in graphical form-in terms of demand and supply, costs and so on-it is obviously very similar in appearance to the traditional type of forecast. This has sometimes resulted in confusion, semantic arguments, and the conclusion that scenarios are simply a set of singleline forecasts which can be added together and averaged out in the same way that different forecasts from different analysts are combined to produce a statistical mean.
The difference between scenarios and forecasts (in the way that 'forecasts' have conventionally been used) is more than semantic, however:

+ Whereas a forecast is essentially a statistical distillation of probabilities and 'expert opinion', a scenario is an archetypal description of a possible future based on a mutually consistent grouping of determinants.

+ A forecast is assumed to be the result of having taken all relevant factors into account and come up with the 'best' answer. It says 'this is what is most likely to happen'. It is thus a 'front line' judgment in itself, and tends to dictate the final decision. A scenario, by contrast, says 'here are some of the key factors you have to take into account, and this is the way these factors could affect your line of business'.

+ A forecast stands alone, to be considered, accepted or rejected on its own. A scenario is designed to be considered in conjunction with other scenarios; it is valueless on its own.

+ A forecast is intended to be regarded as an authoritative statement. A scenario is intended to be regarded as a tool to assist understanding - as a backdrop to the decision-making process, rather than as an integral part of the decision itself.

+ A forecast removes much of the responsibility for the final decision from the individual decision-maker. The multiple scenario approach does not.

While the forecast approach to planning is fundamentally quantitative, the multi-scenario approach is essentially qualitative. Perhaps the greatest difference is in the basic philosophies: forecasts are based on the belief that the future can be measured and controlled; scenarios are based on the belief that it can not.

How Many Scenarios?

How many scenarios are necessary for the individual decision-maker? From experience in Shell, the fewer the better. The smallest number of scenarios, of course, is two - since one would amount to a single-line forecast. Thus we try to distil the range of possible futures into two broad archetypes.

This objective may seem, on the face of it, to be unattainable. After all, five categories of factors (competitive, technical, social, political, economic) at, say, three different levels, theoretically gives us 243 different possible scenarios. How can one possibly reduce this to two, or even three?

There will, of course, be many combinations which are incompatible, so this eliminates many possibilities right at the start. One is nevertheless still left with a large number of possible futures, and the process of distilling these down can best be described as a kind of mental juggling act in which some of the pieces are allowed to fall to the ground. There is a certain degree of trade-off between the objectives of intellectual purity, clarity and usefulness. Then the remaining elements are sorted out into the combinations which appear best to meet the three main characteristics we look for in the scenario range. These are:

1. Each scenario has to be internally consistent.

2. The range should be near the extremes of possibility - but not too near so as to be beyond the realms of probability.

3. They must challenge the decision-maker.
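The combinatorial reduction described above can be sketched in a few lines of code. The factor levels and the single consistency rule below are invented for illustration (they are not Shell's actual screening criteria); the sketch simply enumerates the 3^5 = 243 combinations and discards the internally inconsistent ones.

```python
from itertools import product

# Hypothetical factor categories, each at three levels.
factors = ["competitive", "technical", "social", "political", "economic"]
levels = ["low", "medium", "high"]

def is_consistent(combo):
    """Arbitrary illustrative rule: treat a combination as internally
    inconsistent if economic and political pressures sit at opposite
    extremes. Real screening would use much richer judgment."""
    c = dict(zip(factors, combo))
    return {c["economic"], c["political"]} != {"low", "high"}

all_combos = list(product(levels, repeat=len(factors)))   # 3**5 = 243
feasible = [c for c in all_combos if is_consistent(c)]

print(len(all_combos), "raw combinations")
print(len(feasible), "survive the consistency screen")
# The remaining 'mental juggling' - distilling these down to two or
# three archetypal scenarios - is a judgment step, not an algorithm.
```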
It is in fact surprising how often, in practice, the objective of working with only two scenarios turns out to be perfectly viable (with a lot of effort). There are some areas of decision-making, however, where three or more scenarios are needed to encompass all the key elements. Unfortunately, three scenarios produce a tendency - resulting from the traditional use of, and dependence on, single-line forecasts - for the decision-maker to concentrate rather too much on the numerical quantifications accompanying each scenario and to opt for the line in the middle of the range. Four or more scenarios tend to create confusion. So, wherever possible, only two are used. Prospective plans are then examined against the different scenarios to see how they stand up under the starkly contrasting circumstances described by the scenarios. One of the obvious things one looks for is the possibility of disaster under these contrasting circumstances - any likelihood of the company going broke, projects running out of money, people freezing in their homes. Thus an understanding is built up of the different risks involved in any decision. With a better understanding of the key determinants, the decision-maker is in a better position to test the flexibility of his plan - the extent to which it can be adapted in the face of unexpected events. By
allowing plans to be tested against different eventualities, the scenario approach also helps in the development of plans with the greatest degree of resilience - a word borrowed from biological terminology, relating to the ability of some species to survive in a number of different habitats, as opposed to other species that thrive in one particular environment but (like some high yielding strains of crop) are vulnerable to small climatic changes. However, the ultimate decision on whether to go for the plan with the greatest apparent resilience, or one which combines a high degree of risk with a high reward potential, is not dictated by the multi-scenario analysis. While the analysis may assist the individual decision-maker to identify and weigh up the options open to him, it does not necessarily tip the balance towards one particular decision in the same way that the forecast method of planning does. The final responsibility is his.
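As a hedged sketch of how prospective plans might be screened against contrasting scenarios (the cash-flow figures and funding limit below are invented, not Shell's), the fragment checks each plan under each scenario and flags any outcome that would exhaust the available funds.

```python
# Hypothetical annual cash flows (£m) for two candidate plans under
# two contrasting scenarios; figures are invented for illustration.
plans = {
    "high-risk/high-reward": {"Revival": 40.0, "Rake's Progress": -35.0},
    "resilient":             {"Revival": 15.0, "Rake's Progress": 2.0},
}
FUNDING_LIMIT = -20.0   # worst annual cash drain the company could survive

for plan, outcomes in plans.items():
    for scenario, cash_flow in outcomes.items():
        flag = "DISASTER" if cash_flow < FUNDING_LIMIT else "survives"
        print(f"{plan:22s} | {scenario:16s} | {cash_flow:6.1f} | {flag}")
# The screen identifies risks; it does not choose between the resilient
# plan and the high-risk/high-reward one - that judgment stays with the
# decision-maker.
```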
Figure 1 shows four major categories of factors
impinging on the activities of Shell U.K. which could affect the performance of the company. One is concerned with political developments: government strategy, manifestations of extremism etc. Another is concerned with domestic economic circumstances: unemployment, inflation, balance of payments and so on. The third is concerned with systems and developments that are more directly related to the company: energy demand, technological developments in the pipeline, international socio-economic factors relating to energy, the world energy scene, changing supply conditions, prices etc. Lastly there are the 'societal' developments: expectations, life styles and so on. All these factors affect the company; they are interrelated and have to be studied and taken into account in developing corporate plans and strategies.

Figure 1. Environmental pressures

From a study of all these factors one can build up scenarios which describe different combinations of social, political and economic possibilities for the U.K. In the example described below (developed in Spring 1980) it was found that three scenarios were needed to encompass a realistic range of possible developments. Figure 2 shows their quantification in terms of GNP growth.
1. The 'Unresolved Conflicts' scenario provides the strongest prospect for GNP growth in the short term, but not in the long term. This is a 'muddling through' scenario in which policy is largely determined on the basis of expedient compromise in response to short term pressures, rather than tackling the country's underlying problems. It is in many ways the story of the 1960s and 1970s. Economic growth is well below the post-war average, and even this is only possible with the aid of revenues from U.K. oil and gas, which are assumed to go largely into consumption. Whether the U.K. can go on muddling through once oil revenues start declining is an open question, and we might end up, some time in the 1990s, with all our present problems exacerbated.
2. The 'Revival' scenario gives us the highest growth in the longer term. It implies a change of attitude and a restructuring of industry, away from the older declining fields and towards concentration on areas of growth. In essence it is the constructive use of the advantage of possessing indigenous oil and gas to tackle fundamental problems such as low productivity and resistance to change. This painful process is bound to be slow, with little or no net growth in the first 5 years. In concept, the 'Revival' scenario is possible under a government on either side of the political spectrum, but on the time scale set out it is only plausible if politics remain consistent - though not necessarily identical to present government policies. Should there be a major change in government direction, the possibility of achieving 'Revival' will still be there, but the timing of any turn-up is bound to be delayed.
3. The 'Rake's Progress' scenario is to some extent an extrapolation of the social, political and macro-economic trends for the U.K. in recent years. It could come about through a failure of the present government's strategy leading to a series of increasingly desperate policy changes, none of which are given enough time to work. This could result in falling living standards, difficult business conditions and an increasingly unstable social and political environment. Of course, pressures to find solutions and get back to growth would, under such a scenario, be immense - but because of the time scale required for the system to adjust, it is doubtful whether much growth could be achieved in the 1980s.
Looking at Demand
These general scenarios were in turn used to quantify the range of future energy demand. When comparing the Revival scenario with Rake’s Progress, it was found that there were two countervailing forces on energy demand. In the Revival scenario, one can expect increased energy efficiency and greater development of less energy intensive industries; under Rake’s Progress equipment becomes less energy efficient as it ages and the older, more energy intensive industries have a greater share of the dwindling national cake-thus keeping energy demand relatively high in that scenario. As a result, the difference between these two scenarios
in energy demand could be comparatively small even by the mid-1990s, although the scenarios are otherwise in stark contrast.

Figure 2. Structure of the U.K. economic scenarios (GDP output in absolute terms)
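The countervailing forces can be illustrated with a small worked example. The index figures below are invented (the article gives no numbers for this comparison): demand is treated simply as output multiplied by energy intensity, so a vigorous but efficient economy and a stagnant but energy-hungry one can arrive at much the same demand.

```python
# Invented illustrative figures: energy demand = output x energy intensity.
# Revival: higher output but markedly lower energy intensity;
# Rake's Progress: lower output but ageing, energy-hungry industry.
scenarios = {
    "Revival":         {"output_index": 125, "energy_per_output": 0.80},
    "Rake's Progress": {"output_index": 100, "energy_per_output": 0.98},
}

for name, s in scenarios.items():
    demand = s["output_index"] * s["energy_per_output"]
    print(f"{name:16s}: energy demand index = {demand:.0f}")
# Two very different economies end up with broadly similar energy demand,
# which is why the scenarios, not a single demand forecast, carry the
# strategic information.
```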
Here then one can see a clear example of the difference between the use of scenarios and forecasts. Although the total energy demand under the two scenarios might be similar, the outcome of a company's strategy or of a project would be vastly different under the two scenarios. In Revival a company would be operating in a self-confident, outward-looking national environment, while under Rake's Progress it would be in something akin to a siege economy, with all that implies in terms of taxes, import barriers and confrontation. By looking at such different scenarios, managements can accept more readily the possibility of difficult futures and plan accordingly, seeing such a future as one of several. By contrast, a very pessimistic single line forecast can either lead to fatalism or to disbelief and the search for other, rosier forecasts. In neither case will there be much attempt to consider policies in the event of the more difficult future materializing - and perhaps that makes such a future just a little more likely to come about.
Conclusion

The prime purpose of the planning process must be the provision of a framework of information and knowledge against which individual decisions can be taken. And it must be the aim of the
Planning Department to ensure that the right conceptual framework is available at the right time. The basic purpose of scenarios is to provide decision-makers with such a framework; it must not only be a practical tool for decision-making, but must also help to create understanding of those elements of the future which could impinge on the decision.

The introduction of the scenario approach has been found in practice to be a difficult process, since many people have become 'hooked' on single-line forecasts, and the 'withdrawal symptoms' can be traumatic. It is not easy to relinquish the pseudo-certainty that forecasts provide and to face up to the constantly shifting features of reality. Thus there is a dual challenge to the planner: he must not only help to identify the forces behind changing circumstances, but must help wean decision-makers from their dependence on single-line forecasts. It is inevitable that the size of this task is in proportion to the size of the organization concerned.

A more conceptual approach to decision-making is by definition a more challenging one - and thus offers greater potential for individual responsibility and fulfilment. While in theory this is what people want, it may not always be the case in practice.