Technological Forecasting & Social Change 69 (2002) 317–357

The nonlinear dynamics of technoeconomic systems: An informational interpretation

Tessaleno C. Devezas (Faculty of Engineering, University of Beira Interior, 6200-001 Covilhã, Portugal)
James T. Corredine (24 Bailey Court, Middle Island, NY 11953, USA)

Received 1 May 2001; received in revised form 13 June 2001; accepted 16 June 2001

Abstract

In this paper, a cybernetic framework is proposed that may help in understanding the specifics of the timely unfolding of recurrent social phenomena, as well as provide a basis for their application as useful forecasting tools for futures studies. The long-wave behavior in technoeconomic development was chosen to apply this theoretical framework. The time evolution of a technoeconomic system is described discretely as a logistically growing number of "interactors" adopting an emerging set of basic social and technological innovations. By using the logistic function as the probabilistic distribution of individuals exchanging and processing information in a finite niche of available information, it is possible to demonstrate that the rate of information entropy change exhibits a "wavy" aspect evidenced by a four-phased behavior denoting the unfolding of a complete long wave. The entire unfolding process, divided into two cycles, an innovation cycle and a consolidation cycle, is analyzed, and two very important threshold points are identified and discussed. The present theoretical analysis suggests that the technoeconomic system is not a purely chaotic process, but exhibits a limit-cycle behavior, whose basic mechanism is the periodical deployment and filling of information in a "leeway" field of active information. The pace of the process, and hence the duration of the long wave, is determined by two biological control parameters, one cognitive, driving the rate of exchanging and processing information at the microlevel, and the other generational, constraining the rate of transfer of knowledge (information integrated into a context) between successive generations at the macrolevel. Moreover, it is speculated that social systems mimic living systems as efficient negentropic machines, and, making use of Prigogine's entropy balance equation for open systems, it is suggested that their cyclical behavior is probably the best way to follow nature's efficiency strategy.

© 2002 Elsevier Science Inc. All rights reserved.

Keywords: Long waves; Kondratieff waves; Socioeconomic growth; Logistic growth; Chaos theory; Systems science; Information theory; Cybernetics

1. Historical background: physics and economics

Models of political and economic concepts have historically followed the lead of the natural sciences, emulating the intellectual and methodological paradigms of their corresponding periods. Plato and Aristotle were the first philosophers who tried to explain the emergence of political, social, and economic order in human societies. Although differing in the details of the transformation processes involved, both described social and political dynamics as processes in nature, reaching a final state of equilibrium when the social and political form of a polis (harmonious society) is realized. It is worthwhile to note that the very notion of equilibrium arose among Greek philosophers some 25 centuries ago.

Thomas Hobbes (1588–1679), impressed by the new mechanistic view of nature introduced by Galileo and Descartes, projected a mechanistic model of society and state, breaking for the first time with the obsolete traditional metaphysics. Hobbes tried to transfer the laws of movement from mechanics to social systems, assuming that human beings were driven by emotions (human impulses) as physical bodies are by mechanical impulses. His figure of an absolute sovereign (Leviathan), legitimized to rule society in order to keep it in perfect equilibrium and defined as the sum of all individuals, added to the prevailing notion of equilibrium a new aspect borrowed from the physical sciences: that of linearity, or the mechanical principle of superposition.

This mechanistic view of an economy, strongly impregnated with the notions of equilibrium and linearity, was definitively entrenched in economic thought by the French scientist François de Quesnay (1694–1774), the founder of the Physiocratic School in economics. An economic system was modeled as a mechanistic input/output clockwork, a sequentially (linearly) operating system with programmed functions, in the purest framework of Cartesian mechanics.

Later in the 18th century, Adam Smith (1723–1790) introduced the Newtonian method to ethics and economics. In Cartesian mechanics, all physical events had been explained through the contact effects of interacting elements, like cogwheels in a clock or impulses between balls. Newton introduced a new vision of the universe: a system of moving material bodies in dynamic equilibrium governed by the universal law of gravitation, an invisible force resulting from the quantity of matter contained in each constituent of the system. Now a macrostate of equilibrium was explained as the result of all microeffects between bodies, controlled by the invisible hand of gravitational forces. For Smith, the interactions of several microinterests achieve the common macroeffect of general welfare through the mechanism of the market, a picture of an invisible hand directing microinterests and competition among individuals, but resulting in a final state of equilibrium between supply and demand.


The Newtonian paradigm continued to profoundly influence economic thought in the following centuries. Not only were the pervasive notions of equilibrium and linearity always present, but also the notion that any disturbance of this equilibrium had to have an exogenous cause. A trenchant example of this vision can be seen in a famous quotation of John Stuart Mill (1806–1873) [1]: "The disturbing causes have their laws, as the causes which are thereby disturbed have theirs; and from the laws of the disturbing causes, the nature and amount of the disturbance can be predicted a priori. ... The effect of the special causes is then to be added to, or subtracted from, the effect of the general ones." Embodied in this statement is also the conception that small causes have small effects and big causes have big effects, another way of expressing the linear relationship between cause and effect.

Mathematical economists of the 19th century were conscious of the complexity involved in modeling economic systems, but this complexity was viewed only in terms of the enormous variety of variables and entities to be considered, and not as a consequence of any nonlinear relation connecting cause and effect. The Laplacian spirit of classical physics remained present in economic models, assuming that perfect calculations of forecasts by economic laws were possible if initial conditions could be completely determined. Following this assumption, mathematical modeling of economics tried to isolate the elements of the complex social system from its environment. The characteristics not considered as elements for economic models were specified as exogenous parameters. Important elements of social dynamics, such as radical innovations or oscillations in population, were viewed as exogenous factors.

Entering the 20th century, physical linear models continued to influence economic theorists, who increasingly borrowed their vocabulary from physics: for instance, stability, movement, equilibrium, inflation, expansion, contraction, elasticity, flow, pressure, force, potential, reaction, friction, and resistance, not to mention the usage of even more intangible notions like order, entropy, or temperature. This trend remained unchanged even though physics itself underwent a dramatic change in the first quarter of the 20th century with the birth of quantum mechanics, marking a sudden break with classical physics. It is noteworthy that the quantum formalism of Schrödinger, Planck, and Heisenberg is also linear, but the nonrelevance of the quantum-mechanical approach to the reality of economic and social systems introduced a vacuum in the applicability of modern physics to economic models. We can say that in the 20th century economists gradually gave up the exacerbated physicalism of previous centuries and tried to develop their own mathematical formalism, without, however, abandoning linear approaches.

After the economic crash of 1929, economists began to speak of an inherent instability in the capitalist system and its incapability of self-regulation. John Maynard Keynes (1883–1946) suggested then that the economic system could be stabilized from outside by particular instruments of policy, such as fiscal measures. Followers of the Keynesian doctrine began to criticize the linear framework of equilibrium theories, but were unaware of mathematical nonlinear methods. Even modern economists, like Paul Samuelson, could not explain large-scale observed oscillations in the economy except as the result of exogenous shocks. Nevertheless, although distanced from purely mechanistic approaches, from a methodological point of view mainstream economics remained inspired by models from linear mathematics, thermodynamics, classical mechanics, and, to some extent, Darwinian evolutionary theory.

2. A paradigm shift

The computer pioneer John von Neumann (1903–1957), who introduced the Theory of Games into mathematical economics, believed in the possibility of precise long-term weather forecasts, maintaining that, given sufficient data about global weather and a supercomputer, he could provide accurate predictions of weather over time and space. But von Neumann did not recognize the pervasiveness of nonlinearity. At this time economists were also unaware that economics suffered from the same shortcoming as meteorology: very complex systems cannot be explained by linear mathematics.

A new epoch could have started in 1963, when the MIT meteorologist Edward Lorenz published his seminal opus "Deterministic Nonperiodic Flow" [2], proposing a nonlinear dynamic model for atmospheric flow. Lorenz had trained as a mathematician and after WWII became a meteorologist. As such, he published his paper in the obscure venue of the Journal of the Atmospheric Sciences. Meteorologists, who were either nonmathematical or versed only in traditional mathematics, did not know what to do with his set of equations, whose solutions exhibited such bizarre and apparently nonsensical behavior. It was necessary to wait a decade until physicists discovered the true significance of his work.

Since the 1930s, mathematicians and physicists had parted ways and were no longer speaking the same language. Topology, with its manifold mappings and differential geometry, did not find a broad range of application in physical problems. During this divorce period, physicists, chemists, astronomers, and biologists independently accumulated a series of phenomena not describable by conventional sets of differential equations. These phenomena mainly concerned dynamical, repeating but nonperiodic systems and sudden phase transformations. The brilliant French mathematician René Thom marked the turning point in this trend in 1972 with his "Stabilité Structurelle et Morphogénèse," showing that mathematicians had something to say in describing real-world phenomena. Following this, the scientific language of the time overflowed with new and revolutionary concepts like catastrophes, synergetics, dynamical systems, dissipative structures, far-from-equilibrium behavior, and self-organization, mainly linked to the names of René Thom, Hermann Haken, Stephen Smale, and the Nobel laureate Ilya Prigogine.

In this fertile ground grew the seeds of a new branch of physics, most commonly referred to as chaos theory. Through the seminal papers of Tien-Yien Li and James Yorke [3] and Robert May [4], in 1975 and 1976, respectively, which rediscovered Lorenz and introduced into science the notion of deterministic disorder, it was shown that chaos is ubiquitous, stable, and structured. Scientists began to understand that complex systems, traditionally modeled through difficult continuous differential equations, could be described in terms of easy discrete maps, undoubtedly one of the most important scientific achievements of the last three decades. The definitive thrust establishing this new discipline in the scientific scenario was made by a group of young physics graduate students at the University of California, Santa Cruz, who founded the Dynamical Systems Collective. Their names, Robert Shaw, Doyne Farmer, Norman Packard, and James Crutchfield, are today definitively linked to the birth of chaos theory. In this concentration of new ideas and concepts of the 1970s surfaced the bewitching world of fractals, introduced by Benoit Mandelbrot, compounded by the notion of scaling developed by Mitchell Feigenbaum. The common denominator in all these developments was the fact that the complexity of the world around us is a surface manifestation of a deeper simplicity.

By the 1980s, the concepts of deterministic chaos and nonlinear dynamics were already entrenched in many fields of science, including some branches of medicine and biology, with a few excursions into economics and the social sciences. The paradigm had shifted, and scientists developed an awareness that the Newtonian paradigm is best fitted to a static or slowly changing world of stability and continuity, not one of evolution, instability, and structural change. Thoughtful individuals began to become aware of the very broad range of applications of deterministic chaos. Robert May, a mathematician-biologist, realized that the astonishing structures he was exploring were not just intrinsic properties of biological systems. May, in his famous 1976 review paper for Nature [4], was the first to put forth a more embracing view, stating: "Not only in research, but also in the everyday world of politics and economics, we would all be better off if more people realized that simple nonlinear systems do not necessarily possess simple dynamical properties." Perhaps few people are acquainted with the interesting fact that, at the time May was developing this thought, Mandelbrot was constructing his images of fractals and symmetry through scales by looking at cotton price time series going back to 1900, a tumultuous period that saw a deep depression and two World Wars. This very insightful example stresses that the same quality underlies biology and economics: the signature of order.

3. Patterns in technoeconomic evolution

The economy of the 20th century was characterized by growth interrupted by two spectacular collapses, the Great Depression of the 1930s and the Oil Crisis of the 1970s. Both occasions gave way to long decades of economic decline. Historically, the period around the 1930s inspired intense discussion and research activity on business cycles, with special attention given to the work of the Russian Marxist economist Nikolai Kondratieff [5], which had surfaced in the mid-1920s. Kondratieff was not the first to support the idea of long-term economic cycles of about half-century duration, but he was the first to organize substantial empirical evidence for the idea and to spark a sustainable debate on the topic.

Neoclassical economic models, strongly based on linear mathematical modeling and equilibrium-seeking systems, have been less than successful in handling long-term trends and explaining long-term oscillations in economics, requiring the agency of exogenous shocks to explain these irregularities. Accordingly, actual economic systems are modeled as regular equilibrium-seeking systems with superimposed stochastic exogenous shocks that have to be explained by appropriate economic assumptions. With regard to the dramatic social and political consequences of economic catastrophes, several policy measures have been discussed in the framework of Keynesianism or Neo-Keynesianism, for instance compensatory fiscal policy. But practical experience has shown that there is little hope of damping oscillations or holding the degree of employment constant. Good regulatory measures need considerable time for the collection of data and the analysis of results, with the consequence that any policy may be outdated by the time it is enacted, even becoming counterproductive when applied with some time delay.

We can say that concrete steps toward explaining long-term economic oscillations as a result of endogenous mechanisms of the capitalistic form of production began in the late 1930s at Harvard with Joseph Schumpeter (1883–1950), who described economic evolution as technological progress in the form of "swarms" arising during phases of economic depression. These "swarms" consisted of clusters of technological innovations that originate new economic activities and new industrial branches (leading sectors) that pull the next phase of economic expansion, until wear and tear, overproduction, and saturation become manifest, causing a new depression. Schumpeter was the first to call the 50–60-year pattern the long wave or Kondratieff cycle. Schumpeter also introduced the concepts of gales of creative destruction and the entrepreneurial motivation of new men as endogenous social driving mechanisms of the long-wave phenomenon.

The German economist Gerhard Mensch, at the end of the 1970s, advanced a decisive contribution to the understanding of the underlying dynamics of long waves in socioeconomic development. Mensch [6] has proven exhaustively that a real cluster of basic innovations is evident at each downswing phase of the long wave, and proposed a metamorphosis model, whose key concept is that of a technology stalemate out of which an economy is ultimately eased by clusters of innovations taken from a reservoir of investment opportunities formed by a continuous flow of scientific discoveries. As old activities are disused and replaced by revolutionary new ones, a structural metamorphosis takes place, a phenomenon for which Devezas [7] and Devezas and Corredine [8] proposed the name of succeeding technospheres, considering that at the start of each wave a dramatic shift toward a mesoscale technological transformation is observed. This transformation, pulsating regularly each half-century, is characterized by the synergistic combination of new technologies, the creation of new economic activities and industrial branches, and a deep transformation of society as a whole. In summary, a new technological environment, a new technosphere, emerges, grows, and diffuses in the body of society, killing or reviving previous technologies, until a leveling off is reached. Today's youthful, growing global information and communication system is a living testimony to this argument: we are witnessing the blossoming of a new technosphere and are about to ride up a new long wave of the world economy.

Mensch put forth the concept that the economy has evolved through a series of intermittent innovative pulses that take the form of successive S-shaped life cycles, in phase with the succession of long-term wavy economic fluctuations, or Kondratieff waves (K waves for short). A pictorial representation of this succession of S-shaped curves and K waves is presented in Fig. 1, inspired by Mensch's original graph.

Fig. 1. Schematic presentation of the succession of K waves (a) and intermittent innovative pulses (b), inspired by Mensch's [6] metamorphosis model.

Under this conceptual framework we can say that the economic long wave is the result of a swarming of basic innovations that forms an organic technoeconomic system (technosphere), whose unfolding is well described by the life-cycle template. The concept of the life cycle is a metaphor borrowed from biology and describes the growth of a process, the coming into and out of existence. Its template, represented in Fig. 2, describes the unfolding of the rate of growth as a function of time, crossing the different stages of birth, growth, maturity, decline, and death.

Fig. 2. Life-cycle template.

Economic indicators (through which K waves were identified and quantitatively measured), like prices, inflation, production, etc., are a manifestation of this deeper underlying cyclical innovative behavior, which evolves with time following the same patterned behavior. The life-cycle template is translated mathematically by the first-order differential Verhulst equation

\frac{dN}{dt} = \delta N (M - N) \qquad (1)

where N(t) is the measured growing quantity, M is the upper limit of N(t), and \delta is a rate parameter that expresses the system's growth capacity. The solution of Eq. (1), describing the cumulative S-shaped growth of the quantity N(t), is the well-known logistic equation:

N = \frac{M}{1 + e^{-\delta (t - t_0)}} \qquad (2)
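
To make Eqs. (1) and (2) concrete, the short sketch below (an illustrative aid added here, not part of the original analysis; the parameter values M, delta, and t0 are arbitrary choices) computes the logistic curve of Eq. (2) together with the growth rate of Eq. (1), whose bell shape is the form attributed to the long wave in the discussion that follows.

```python
import numpy as np

# Arbitrary illustrative parameters (not values taken from the paper)
M, delta, t0 = 1.0, 0.25, 50.0   # niche size, rate parameter, midpoint

t = np.linspace(0.0, 100.0, 1001)

# Eq. (2): cumulative S-shaped (logistic) growth
N = M / (1.0 + np.exp(-delta * (t - t0)))

# Eq. (1): growth rate dN/dt = delta * N * (M - N), the bell-shaped pulse
dN_dt = delta * N * (M - N)

print(f"N at the midpoint t0: {N[500]:.3f} (= M/2)")
print(f"peak growth rate: {dN_dt.max():.4f}, reached at t = {t[dN_dt.argmax()]:.1f}")
```

Run as is, the peak of dN/dt falls at t0, the inflection point of the S-curve, which mirrors the relation between the S-shaped pulses and the bell-shaped waves sketched in Fig. 1.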

Thus, we can state mathematically that the bell-shaped form of the long wave is the first derivative of the logistic pulses of human innovative behavior. Searching, then, for an understanding of the underlying dynamics of long waves in socioeconomic development, we must concentrate our attention on the deep significance of this interpretation.

Marchetti [9,10], through a series of publications in the 1980s, contributed further insight to this conceptualization via the study of physical rather than econometric variables. He could show that almost anything, innovations, social moods, infrastructures, energy consumption, etc., pulsates with a periodicity of about half a century, and put forth the concept of society operating like a learning system governed by the logistic equation, demonstrating that invention, innovation, and entrepreneurship, generally perceived as the freest of human activities, are actually governed by natural rules. His figure showing the secular trend is striking. The patterned logistic behavior intrinsic to inventions, innovations, infrastructures, and primary energy sources crystallizes as the revelation of an unsuspected order underlying societal evolution. Moreover, some authors [11–14] have suggested that these long-term oscillations of societal processes are not a recent phenomenon characteristic of industrial society, but have been manifest since ancient times, merely becoming easier to infer from measurable parameters once trustworthy data became available with the invention of the statistical sciences. Modelski and Thompson [14], using historic empirical data, could show a series of S-shaped surges throughout the millennium.

The patterned behavior depicted in Fig. 1 strongly suggests that an ordered but nonlinear dynamic is subjacent to the phenomenon. The ubiquitous logistic differential equation provides a useful mathematical tool to describe the unfolding of the long wave, but it is not sufficient to explain the causality. In the present paper, it is suggested that a good insight into the discussion on the causality of long waves may be attained by looking at the informational entropy content of an evolving technosphere, described as the logistic diffusion throughout society of a new technoeconomic structure.

4. Setting the problem

As we have seen, there is already robust evidence that there are patterns in the evolution of technoeconomic systems. Expressed more exactly, we are facing a recurrent phenomenon that has been labeled in terms of "cycles" and is most commonly referred to as "long cycles" or "long waves" in technoeconomic or socioeconomic development. But an inherent difficulty in modeling this phenomenon has been reaching a conclusion about its "periodicity." Cyclical approaches, based either on patterns with a fixed periodicity or on the broader concept of a periodicity that is not precisely fixed (that is, with a limited range of variation), have been the target of severe criticism. Most of this criticism arises from the difficulty of formulating a rigorous, empirically testable social science definition of a cycle. "Cycle" is a concept borrowed from the physical sciences and essentially denotes an exactly repeating phenomenon. How, then, to justify its use in describing a class of phenomena that apparently is not strictly cyclical?

There is actually a reasonable consensus that social systems are dynamical nonlinear systems whose evolution mimics the evolution of living systems, that is, dissipative systems with positive feedback and increasing complexity. If this is the case, the application of chaos theory can provide a formal framework with which to enhance both the methodological and the theoretical discussion of "cycles" in social systems. Chaos theory represents the most recent effort by economists and social scientists to incorporate theory and method from the natural sciences and apply them to social systems. This approach, while in its infancy, is a very useful heuristic device for gaining a deeper understanding of societal dynamics. The questions to be answered are then: what kind of cycles are we facing in the evolution of technoeconomic systems? Are they strict cycles (patterned repetition with fixed periodicity), or are we dealing with dynamical systems with chaotic behavior, in continual evolution that may show glimpses of pattern but is never perfectly repeated? Or perhaps a blend of both? And if not strict cycles, why not?

It is far from our purposes here to give definitive answers to the above questions. Our intention is only to shed light on some aspects of the chaotic dynamics of social systems. At this point it is very important to stress that, although our analysis will be centered on the phenomenon of long waves (or Kondratieff cycles) in technoeconomic systemic evolution, other research results on the long-term evolution of social systems strongly suggest the existence of other invariant structures inside the complex flux of things we perceive in the social realm. Among these invariant structures are cycles in the evolutionary world system and in global politics, as suggested by Modelski [15–17]. In his many publications Modelski put forth the concept that world system change is an array of coevolving evolutionary processes exhibiting fractal structure, that is, self-similarity or symmetry across scales. Richards [18], in a very interesting and insightful article, presented a chaotic model of power concentration in the international system. Using the already mentioned historic empirical data from Modelski and Thompson [14], she could show that the evolution of systemic power concentration since 1494 is not strictly cyclic, but is chaotic and determined by structural constraints on feasible paths and outcomes. This finding confirms the suspicion that there is indeed some regularity in the observed historic patterns, but that their evolution is chaotic and thus sensitive to small random or microlevel effects, allowing contingent rather than effective long-term prediction.

The true richness of our world comes exactly from this noisy and apparently random behavior, which imposes limits on forecasting the unfolding of social phenomena. But chaos scientists have succeeded in extracting some order out of randomness and have developed a set of criteria to characterize the dynamics of chaotic systems, allowing one to separate the predictable and ordered from the unpredictable and random in their behavior. Not all complex systems are chaotic, and nonlinearity is a necessary, but not sufficient, condition for deterministic chaos. The diagnosis of chaos is not a simple task, because a number of different determinations are necessary. There is no one measurement or calculation that can establish the existence or absence of chaos. For a system to be technically chaotic, certain specific conditions must be met, which include [19,20]:

1. the system must be nonlinear and its time series should be irregular;
2. randomness must be present;
3. the system must be sensitive to initial conditions;
4. the system must have strange attractors, that is, fractal dimensions;
5. if the system is dissipative, the Kolmogorov entropy should be positive;
6. last and most important, there must exist positive Lyapunov coefficients.
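
As a concrete illustration of criterion 6 (ours, not a test performed in this paper), the sketch below estimates the largest Lyapunov exponent of the discrete logistic map, the same recursion that appears later as Eq. (3); a positive value signals the sensitive dependence on initial conditions of criterion 3 as well.

```python
import math

def lyapunov_logistic(k: float, n_iter: int = 100_000, x0: float = 0.3) -> float:
    """Estimate the Lyapunov exponent of x -> k*x*(1-x) by averaging
    log|f'(x)| = log|k*(1 - 2x)| along a long trajectory."""
    x, acc = x0, 0.0
    for _ in range(n_iter):
        acc += math.log(abs(k * (1.0 - 2.0 * x)) + 1e-300)  # guard against log(0)
        x = k * x * (1.0 - x)
    return acc / n_iter

for k in (2.8, 3.5, 4.0):
    lam = lyapunov_logistic(k)
    print(f"k = {k}: lambda = {lam:+.3f} ({'chaotic' if lam > 0 else 'regular'})")
```

For k = 4.0 the estimate approaches ln 2 (about +0.693), the textbook value for the fully chaotic logistic map, while the periodic regimes at k = 2.8 and k = 3.5 yield negative exponents.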

Moreover, adherence to this set of criteria does not exhaust the discussion on the identification of a chaotic system. There are two other major issues to be considered. The first, related to point 5 above, is a conclusion about whether the system is dissipative or not. If it is dissipative, and thus involves self-organization, considerations about entropy change are paramount. The second issue is whether the stochastic component is the randomness inherent to chaos, or whether it is noise, or even a blend of both. Knowledge of the application of these criteria to social systems and their subsystems is as yet at its very beginning, and most of the published work in this field addresses the first four points listed above. In the case of economic systems, an effort has been observed to distinguish between noise and chaotic behavior in economic time series. Much of the limitation in applying these criteria to the chaos diagnosis of social systems arises from the absence of the necessary trustworthy long time series.

The objective of the present paper is not the application of techniques to test for chaos in the long-wave behavior of socioeconomic systems. Berry [21,23,24] and Berry and Kim [22] have already considered the essentials of this kind of discussion in a series of publications over the last 10 years. Berry could successfully show the clear existence of patterns in the first return maps of different economic time series, and provided evidence that the movements of prices and economic growth have been chaotic, exhibiting systematic rhythms with lots of noise during the last 200 years [21,22]. More recently, he complemented this study by applying digital spectral analysis techniques to the same set of economic time series [23,24], and succeeded in confirming that very noisy inflation and economic growth movements share very regular short-, middle-, and long-wave rhythms.

Our discussion will be focused on the aspects related to point 5 above, not yet well explored in the long-wave literature. It is our belief that an adequate knowledge of the entropy-related endogenous logistic growth dynamics of technoeconomic systems will allow future developments in the practical application of the long-wave concept as a powerful forecasting tool. These developments will help to achieve mathematical models that may be simulated by computer experiments, aiming to predict at least their short-run evolution.

5. A cybernetic framework

The discussion put forth in the following sections intends to introduce a cybernetics-based framework for the interpretation of the long-wave phenomenon that should lead to its more efficient further modeling and clearer comprehension. For this purpose, we must first discuss some aspects related to the established explanation and description of long waves of socioeconomic growth and development. As we have seen, the succession of long-term wavy fluctuations in economic life can be interpreted as the result of a series of S-shaped life cycles of human innovative behavior, translating the coming into and out of existence of a dominating technoeconomic system or technosphere. S-shaped or logistic curves have been widely and successfully used since the 1970s to describe processes of technology substitution and technology diffusion in society. Cesare Marchetti and some of his followers, like Theodore Modis and Arnulf Grübler, contributed the valuable insight of society operating like a learning system governed by logistic equations, raising the logistic function to the status of a natural law of technological progress. The usage of logistic curves put forth by Marchetti, Modis, Grübler, and many others refers to the cumulative counting of some discrete physical quantity, like innovations, discoveries, inventions, energy production, consumables, etc.; in short, things that are perused and used by human beings and whose consumption grows, filling a given niche. The analogy to population growth is clear and is translated by the differential Verhulst equation (Eq. (1)), which expresses the fact that the rate of growth is proportional to the size of the population and to the size of the niche remaining to be filled.

In a recently published paper, Devezas and Corredine [8] advanced a Generational-Learning Model for the long wave. In this model the technoeconomic system is conceived as an evolving, learning, dissipative structure, representing a new technoeconomic environment that is created, used, and exhausted, following the path of an overall logistic curve, and dragging within it not only new technologies and industries, but also new ways of life, new occupations, and new forms of organization, not only in business but also in politics and the social order. This overall logistic growth is the envelope curve encompassing two logistic structural cycles, whose unfolding is reproduced here in Fig. 3.

Fig. 3. The Generational-Learning Model of long waves. For a detailed description of the model, see Ref. [8].

The first logistic structural growth curve, corresponding to the downswing of the long wave, is triggered during the ceiling (saturation) of the previous technoeconomic system, a period of high instability, economic stagnation, and recession, soon followed by a deep economic depression. It corresponds to a phase of renovation, mutation, and selection, during which basic innovations accumulate and interact synergistically, and new ideas emerge apparently out of nowhere. Once the necessary knowledge is established and accumulated, and the new technological environment is reasonably entrenched, economic expansion starts again. This is the upswing of the long wave, corresponding to the second logistic structural growth curve, a phase of consolidation during which the established technosphere matures and reaches its ceiling, giving way to the blossoming of the next technosphere. For a detailed discussion of the model, see Ref. [8].

The points of interest here are that we are speaking of structures that are self-created, grow, mature, become enfeebled, and are replaced, and that we are using logistic curves to describe their unfolding in this ever-repeating process. The entire structure forming the organic technoeconomic system is represented by the elegant sigmoid curve. Three main questions may then be addressed: (1) What is this structure we are talking about? (2) What is the discrete growth variable? and (3) Why does a given structure disappear and get replaced? The possible answers to these three questions will be discussed in the next sections, wherein we will establish a new theoretical framework, which, as we will see, is a cybernetic one.

Cybernetics is a relatively new scientific discipline, established by Norbert Wiener (1894–1964) in his seminal opus "Cybernetics: or Control and Communication in the Animal and the Machine," first published in 1948. Its original scope was the study of the relations between structure, interactions, and behavior in the living organism and in the man-made machine, their common features, and their description. While the main consequence of Wiener's work has been the development of mathematical models of behavior and interaction utilizing engineering methods of general systems control, the understanding of what cybernetics is has expanded significantly. We can say that today cybernetics has become the science of relationships, control, and organization for all types of objects, describing physicochemical, biological, social, and technological phenomena with equal success. Wiener demonstrated the relationship between physics and information theory, now applicable across several scientific fields. It is in this realm, with the additional consideration of deterministic chaos concepts, that we will begin to introduce our theoretical framework.

6. Order and control parameters

Let us begin with the first question above. In the general scheme depicted in the model presented in Fig. 3, we are dealing with structural change within the world system and with the effects of this change on constituent subsystems. Within this context we are considering an evolving macrostructure that is at once sociotechnical, technoeconomic, and macropsychological (collective-cognitive). This evolving process is the wearing out and exhaustion of given macrostructures and their replacement by new ones that are better fitted to the new evolutionary situation.

Evolving systems show feedback between macroscopic (collective) structures and events of individual interactions at the microlevel. Macrostructures emergent from the microlevel, in turn, modify the individual interactions at each stage of the irreversible evolution, which means that we are dealing with a dissipative process. An order parameter is such an outgrowth of micro-macro interactions; it is a macrostructure or macrovariable that emerges along with a reduction of the degrees of freedom of the system. The evolving technosphere is our order parameter. The long wave is our perception of it, functioning like a pattern recognition process. In the framework of complex systems, the behavior of human collectivities is explained by the evolution of macroscopic order parameters that are caused by nonlinear microscopic interactions of humans or human subgroups (nations, states, institutions, etc.). Social or economic orders may be interpreted as attractors of structural change. Using the language of synergetics, we may say that at the microscopic level the stable modes of the old states are dominated by new unstable modes (the "slaving principle"). The emergence of a new structure results from the fact that the nucleating unstable modes serve as an order parameter determining ("enslaving") the macroscopic behavior of the system. The rate of change from old to new is codetermined by control parameters related to the type and intensity of the interactions involved.

At this point, it would be a good illustrative exercise to observe how, in past long waves and corresponding technospheres, some leading technological innovations, or leading sectors, played their role as order parameters; in other words, how emergent unstable modes (leading basic innovations) acted in "enslaving" all other human activities and, in a feedback effect, affected human interactions at all levels. Probably the best example is the one we all are now experiencing with the new and pervasive global information and communication system, led in the last dozen years by the Internet and personal computers. Who doubts that all other economic sectors and technologies are being "enslaved" by cyberspace, which in turn is profoundly affecting microlevel human interactions? As we stated in Ref. [8], we are still experiencing the first stage of this process, the innovation phase, but in the coming decades, in the consolidation phase, this emergent technosphere or order parameter will be definitively entrenched.

In order to continue following our line of reasoning, it is important to stress that in the cybernetic framework systems are defined in terms of interactions based on information exchange among system elements; or better, system description parameters must be resident only in the interactions within the system and nowhere in the components. Carvalho-Rodrigues and Dockery [25] demonstrated the validity of this approach, showing that removing or seriously damaging system components decreases the system's capacity to communicate information between components and reduces the cohesiveness of the system. On this basis, components become interactors, which signal each other through the exchange of information mediated by some interaction mechanism. Two closely linked concepts were then brought into our cybernetic framework: structure and information. Both form the basis upon which things unfold. Structure arises from dynamic information exchange. To describe the dynamics of the unfolding process two other concepts must be considered: learning, as the counterpart of information exchange, and entropy, as the counterpart of information. The former is discussed below, and the second will be discussed in the next section.


The process of information transfer is fundamentally a learning process. The rate of information transfer is initially low; then, overcoming the inertial resistance of the system, it grows, reaches a maximum, slows down, and reaches a ceiling, following a typical logistic growth pattern. What is being transferred are discrete inputs and outputs, or bits of information. In our previous publication [8], we discussed in depth the very important and universal significance of the learning-programmed evolutionary path of living systems. In the evolutionary process, a system self-organizes and learns, configuring and reconfiguring itself toward greater and greater efficiency, and in this manner, with each iteration, performs some activity better. Each stage corresponds to a given structure that encompasses previous self-organization, learning, and the current limitations. This is to say that self-organization and learning are embodied in the system's structure, and the learning rate is an overall system property.

One of the fundamentals of the Generational-Learning Model [8] is the very important role played by the learning rate as a codeterminant of the timing of a long wave. Analyzing the mathematical relationship between the differential (continuous) and discrete logistic equations, we demonstrated that the rate parameter \delta of Eqs. (1) and (2) and the gain-determining constant k of the recursive discrete logistic equation

x_{n+1} = k x_n (1 - x_n) \qquad (3)

are closely related through the expression k = \delta t_G, where \delta, the diffusion-learning rate, is the cognitive biological determinant, the rate at which humankind learns to deal with new environments, and t_G, mathematically known as the characteristic time of the logistic function, is the effective generational determinant, consisting of biologically based rhythms. Further discussed were the possible values assumed by the coupling \delta t_G in the light of deterministic chaos theory, concluding that sustainable growth and evolution require 3 < \delta t_G < 4, granting the necessary oscillations or chaotic behavior. The system is then said to be chaotic within predictable boundaries. Social systems are complex adaptive systems exhibiting manifold stability, and [\delta, t_G] are the biological control parameters of the long-wave behavior, determining the rate of change of the whole process.
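
A minimal sketch of this point (an added illustration; the particular k values are arbitrary picks within the ranges discussed) shows how the behavior of the recursion in Eq. (3) changes with the coupling k = \delta t_G: below 3 the iterates settle to a fixed point, while between 3 and 4 they sustain the oscillatory, and eventually chaotic, behavior the model requires.

```python
def iterate_logistic(k: float, x0: float = 0.2, n_transient: int = 500, n_keep: int = 8):
    """Iterate Eq. (3), x_{n+1} = k * x_n * (1 - x_n), discard the transient,
    and return the following n_keep states."""
    x = x0
    for _ in range(n_transient):
        x = k * x * (1.0 - x)
    tail = []
    for _ in range(n_keep):
        x = k * x * (1.0 - x)
        tail.append(round(x, 4))
    return tail

# k < 3: decay to equilibrium; 3 < k < 4: period doubling, then chaos
for k in (2.5, 3.2, 3.5, 3.9):
    print(f"k = {k}: {iterate_logistic(k)}")
```

The printed tails make the regimes visible at a glance: a single repeated value for k = 2.5, a 2-cycle for k = 3.2, a 4-cycle for k = 3.5, and a nonrepeating sequence for k = 3.9.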

7. The many perceptions of entropy

Inasmuch as we are speaking of transformation and change, nothing is more appropriate than to speak about entropy. Since its introduction in 1850 by Rudolf Clausius (1822–1888), the concept of entropy, from the Greek en- (in) + tropē (change), has expanded significantly, transcending the limits of the physical sciences and assuming many different personalities. The expansion of scientific knowledge is normally accompanied by a corresponding expansion of scientific language, and some useful concepts are used across disciplinary boundaries. This is the case with the two Greek-rooted and inextricably linked concepts of energy and entropy. Both are mental and intangible constructs that humans have developed to describe a real world that cannot be directly accessed. The human perception of energy is through its multiple transformation processes. It was in the effort to quantitatively measure energy being transformed that physicists perceived that some amount of energy gets irreversibly lost. Clausius coined the name entropy to designate this lost, no longer useful energy in the context of thermal transformations. Entropy is then something that we cannot use, cannot feel, cannot measure, and cannot precisely define. It is nothing more than a difference between useful forms of energy, but what a powerful difference: the difference between reversible and irreversible processes, the difference that introduced into the laws of physics the notion of time-direction.

The real thrust toward the full significance of the embracing concept of entropy originated with Ludwig Boltzmann (1844–1906), in his determination to reconcile the reversible and time-invariant character of Newtonian mechanics with the time-dependent second law of thermodynamics. Boltzmann introduced the notion of time-direction at the microscopic level of particle interactions and developed the concept of statistical entropy. Compared to the original definition, statistical entropy is much less restrictive. Clausius entropy was defined as the ability to reach thermal equilibrium, with the entropy function reaching a maximum at the state of complete equilibrium. Boltzmann entropy is quite general because it is not based on any particular particle kinetics and does not require microscopic equilibrium. In statistical mechanics, a macrostate is the result of the particle interactions ("collisions") at the microlevel, and since one cannot describe each of the possible microstates existing in a macrostate, physicists developed the criterion of degeneracy to designate the number of equivalent microstates in a macrostate. Thus, the degeneracy (W) of a macrostate is proportional to the probability p_i of finding the system in that state, and denotes the number of microstates accessible to the system. Boltzmann postulated that the entropy of a macrostate is proportional to its degeneracy and may be expressed by the equation:

S = k \ln W = -k \sum_i p_i \ln p_i \qquad (4)

where k is the Boltzmann constant and \sum_i denotes summation over all the microstates of a macrostate. Therefore, the greater the degeneracy of a macrostate, the greater its entropy. Since real physical systems have extremely large numbers of components (typically of the order of 10^23 or more), it is impossible for the system to be in any state other than its most probable, or most degenerate, one, that is, the one with the highest entropy. Systems evolve over time from nonequilibrium to equilibrium, seeking the state with maximum W and S. The highest-entropy state has a uniform distribution of particles in space, and hence the highest randomness. This is why statistical entropy can be used as a measure of disorder or randomness.

All later incarnations of the entropy concept were firmly bound to this notion of the order/disorder of systems. We are limited creatures that cannot recognize entropy inherently; our senses do not allow it. But as intelligent creatures, we can make abstract judgements, like the degree of order or disorder of a system, and hence can comprehend structural changes of the type less-ordered to more-ordered or vice versa.
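
A small numerical illustration of Eq. (4) (ours, not the authors'; the Boltzmann constant is set to 1 for simplicity): entropy is maximal when the probabilities over microstates are uniform, which is the sense in which the most degenerate macrostate is the most probable one.

```python
import math

def statistical_entropy(p, k: float = 1.0) -> float:
    """Eq. (4) with the Boltzmann constant set to 1: S = -k * sum(p_i * ln p_i)."""
    return -k * sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # four equally likely microstates
peaked = [0.85, 0.05, 0.05, 0.05]    # one microstate dominates

print(f"S(uniform) = {statistical_entropy(uniform):.3f} (maximum: ln 4 = {math.log(4):.3f})")
print(f"S(peaked)  = {statistical_entropy(peaked):.3f}")
```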

As a result of scientists' effort to quantify and measure the degree of order/disorder or randomness of systems, the concept of information was brought into the entropy discussion and has become a powerful descriptor of system changes. Entropy approaches can be classified in three more-or-less overlapping groups [19]. On the macrolevel we have the macroscopic or phenomenological approaches, which include the original Clausius definition and the more recent extension to open systems proposed by the Nobel laureate Ilya Prigogine. Accounting for the interactions within systems, we find the statistical approaches, which include the above-described Boltzmann postulate and the insightful definition of information entropy proposed by the communications engineer Claude Shannon (1916–2001). Finally, there are dynamic or geometrical approaches, which include the classical interpretation of "time's arrow" introduced by Sir Arthur Eddington (1882–1944) and the very recent generalization to complex dynamical systems introduced by the Russian physicist Andrei Kolmogorov. Nevertheless, the equation for the Kolmogorov entropy contains statistical entropy, as well as informational interpretations. The common feature of all these approaches is the notion of randomness as a measuring stick of entropy. Entropy may then be defined in several different ways, and as Çambel [19] pointed out, "there is no one best entropy definition." We should use the one best suited for our purposes. For our present purpose of establishing a framework to discuss long-term socioeconomic change, Shannon's information entropy is the best suited, but some incursions into Prigogine's and Kolmogorov's entropy concepts will also be necessary.

8. Information entropy

James Clerk Maxwell (1831–1879), with his famous thought experiment illustrating the limitations of the Second Law of thermodynamics, was the first to link information and entropy, probably without realizing it. Maxwell in 1871 imagined a vessel filled with air and divided into two compartments, A and B, by a wall with a small hole. A being ("Maxwell's demon," as it came to be known) positioned inside this vessel, who can observe all the individual molecules, opens and closes this hole so as to allow only the swifter molecules to pass from A to B, and only the slower ones to pass from B to A. Thus, A would grow progressively cooler and B progressively hotter. This experiment seems to contradict the Second Law, which predicts that temperature differences cannot arise in an isolated system at uniform temperature. Maxwell concluded that the Second Law might not be applicable to the more delicate observations and experiments made by one who could perceive and handle the individual molecules. Only in 1929 was this apparent paradox explained by Leo Szilard (1898–1964), who observed that Maxwell's demon causes the entropy of the gas to decrease by obtaining information about its microscopic state. The argument here is that any device, animate or inanimate, that does this must generate more entropy in the process of gaining the information than it causes the gas to lose. Thus, the combined entropy of the gas and the demon (or device) must increase with time. Szilard had thus demonstrated that the process of acquiring information generates entropy.


The action of generating entropy is the same as increasing the actual randomness or disorder of the system. Thus, we can define information as a measure of the departure from maximum entropy, i.e., from complete randomness, by writing:

I = S_{max} - S \qquad (5)

where I is the numerical measure of information, S stands for the actual entropy or randomness of the system, given by Eq. (4) (Boltzmann's postulate), and S_max stands for the greatest possible value of S under the given circumstances. The difference expressed by Eq. (5) is the entropy gap, corresponding to some degree of order present in the system consistent with some constraints. Information is then the amount by which the entropy of a macrostate falls short of its maximum possible value. Possible corollaries of Eq. (5) are: processes that decrease entropy generate information; processes that increase entropy destroy information.

The corollaries above and Szilard's explanation of Maxwell's demon lead us to the notion of entropy as a measure of our ignorance of the precise details of a process, an idea developed successfully in the subject called information theory, proposed by Claude Shannon [26] in the late 1940s. Shannon detached Boltzmann's postulate from its original physical realm and applied it to the problem of encoding and sending messages, helping to elucidate the structure of information itself. Shannon, at that time an engineer at Bell Telephone Laboratories, was primarily concerned with designing efficient ways to deal with messages, which include the actions of encoding, transmitting, selecting, and decoding. A message is an abstract but orderly structure, formed by strings of letters or other symbols, susceptible to a variety of concrete interpretations. It can be encoded in the binary system ("digitized") as a string of 0's and 1's, and each such digit is called a "bit" (binary digit). The amount of information in a message is the number of bits required to represent it, that is, its length. An important characteristic in transmitting messages is that, in this context, information cannot be created in the transmission, only destroyed. If you know how much information goes into a message, you also know the most you can get out of it. But when you receive an encoded message and must decode it, you do not know a priori its content and do not know if some information was lost. That is to say, there is no a priori assurance that the information content sent will be received. From this follows the probabilistic or stochastic nature of information. You are dealing with choices, all choices being equally probable, and hence information is related to freedom of choice. But if there is freedom of choice, uncertainty exists, and you must quantify it in order to measure the information content of a message. Shannon proposed to call this uncertainty "information entropy," expressed by

S = -\sum_i p_i \log_2 p_i \quad \text{bits} \qquad (6)

where \log_2 stands for the use of binary units and p_i represents the probability of a given outcome. The resemblance of Shannon's equation to statistical entropy is striking, and arises from the fact that both move in the same realm: randomness and uncertainty.


A short example here will be useful for our further examples and reasoning. Let us imagine that we are measuring some unknown quantity and know only that it ranges between 0 and 7. Using the binary digit system, we know that the quantities can be represented by three-bit numbers: 0 as [000], 1 as [001], 2 as [010], 3 as [011], ..., and 7 as [111]. Because we do not know the exact value, we must assume that each bit can be 0 or 1 with equal probability 1/2. The probability of any three-place number is p_i = 1/2 × 1/2 × 1/2 = (1/2)^3 = 1/8. We then have \log_2 p_i = \log_2 (1/8) = -3. Eq. (6) then gives the total uncertainty S = -\sum_{i=1}^{8} (1/8)(-3) = 8 × (1/8) × 3 = 3 bits. Suppose we then find that the looked-for value of the quantity is, say, 4, or [100]. After the measurement we have p_4 = 1 and p_i = 0 for all other values, and Eq. (6) gives S = 0. This result means that through the measurement we have gained 3 bits of information; in other words, the measurement reduced our uncertainty (or information entropy) from 3 bits to 0. The importance of the above example lies in the fact that through measurement, information is gained by a given amount and the entropy is reduced. Entropy and uncertainty are the same, and information is the reduction of uncertainty; hence it is the reduction of entropy that is expressed by Eq. (5).

Shannon's conceptualization of information as expressed above has found a broad range of applications and has gained some philosophical stature. Its appeal comes perhaps from the choice of the bewitching word entropy to designate uncertainty, a fact to be attributed to John von Neumann, who is reported to have persuaded him: "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage" [27].
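
The arithmetic of this example is easy to verify in a few lines of code (our illustration; the shannon_entropy helper below simply implements Eq. (6)):

```python
import math

def shannon_entropy(p) -> float:
    """Eq. (6): S = -sum(p_i * log2 p_i) in bits, taking 0 * log 0 as 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Before the measurement: eight equally likely three-bit values, 0..7
before = [1 / 8] * 8
print(f"uncertainty before measurement: {shannon_entropy(before):.1f} bits")  # 3.0

# After measuring the value (say, 4 = [100]): p_4 = 1 and all other p_i = 0
after = [0, 0, 0, 0, 1, 0, 0, 0]
print(f"uncertainty after measurement:  {shannon_entropy(after):.1f} bits")   # 0.0

# The information gained is the reduction of entropy: 3 bits, as in Eq. (5)
```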

9. Information, structure, and knowledge

So far in constructing our cybernetic framework we have spoken about the interplay of four closely related concepts: structure, information, learning, and entropy. Structure and information form the basis of our framework, considering that system structure arises from information exchange. Learning and entropy give us the tools for studying the cybernetic mechanism, the system dynamics, with learning related to information transfer and entropy the essence of the variable related to the uncertainty in the information content. To complete our framework, we must speak about meaning and knowledge. From a purely technical point of view, in Shannon's information theory, information does not imply "meaning." The semantic aspects of communication are irrelevant to transmitting messages. But transmission is just one manifestation of the information process, which also includes acquisition, storage, and computation, and in these matters "meaning" is relevant. Through acquisition, storage, and computation, information becomes knowledge, which can be well defined as information integrated and fitted into a context. It is knowledge that embodies structure. To best illustrate our reasoning we must focus our attention on language, the original and most common means of imparting information. As Haken [28] stated, languages are the order parameters slaving the subsystems that are human beings, and they change only little over the life span of an individual. After birth, an individual learns a language, i.e., he is slaved by its structure, and for his lifetime contributes to its survival.

The English language has 26 letters and a space for separating words, a total of 27 symbols. If all were equally likely to occur, we would have for each printed symbol an information content of log_2 27 = 4.76 bits. This number expresses the maximum entropy of a string of symbols, with no meaning. But a stream of data in ordinary language is less than random; each new bit is constrained by the bits before and by a certain set of rules. Everyone will agree that letters are not equally likely to occur, so that their p_i are not all the same. In English, every word must have a vowel, implying that vowels occur with greater frequency than consonants. While p_i for some letters is greater than 1/27 = 0.037 (e.g., p_e ≈ 0.1), the p_i of others are lower (e.g., p_b ≈ 0.01). This results in a lowered entropy and hence in a lowered information content for any message. In English, this results in an average entropy value of 4.0 bits per letter. An important remark is that the universal characteristics of languages are reflected by the fact that this entropy value is nearly the same for all languages [29]. Portuguese, which has p_i values different from English (e.g., p_e ≈ 0.14), also has S ≈ 4 bits per letter. Thus, we have a difference ΔS_ms = 0.76 bits (0.64 bits for Portuguese, because of the dropping of k, w, and y, but with ç added, thus working with 25 symbols including the space) that accounts for a minimum structure of the system, resulting from a necessary nonuniformity.

The different probability of occurrence of letters is just one aspect of the departure from maximum entropy. Three other aspects contribute to the further lowering of the maximum value. S ≈ 4 bits assumes that all letters are independent, that is, that there is no correlation between their occurrences. The first aspect is that, due to rules of spelling, certain combinations are more probable than others. For example, if the letters were independent, the combinations "eb" and "be" would be equally likely, with probability p_eb = p_be = 0.1 × 0.01 = 0.001. Instead, because "be" is a common root and a word with a meaning, the actual values are p_be = 0.03 and p_eb = 0.0001. The second and very important aspect is that there are higher order levels of coordination in language. These are the rules of spelling and grammar (which must also include the symbols for punctuation, in addition to the 27 symbols), necessary to interpret what is read or written. Only with these rules can a message have a meaning. The third aspect is redundancy, which exists to facilitate the smoothness of our understanding (the case, for instance, of the article "the," which is nearly always superfluous). Thus, correlation, grammar, and redundancies represent a further lowering of the maximum entropy, which in English results in an average information content of 1.5 bits per letter. The reduction ΔS_ss = 2.5 bits represents the strength of the structure, which is necessary for comprehension and interpretation of any message. This result is quite similar for different languages. In Portuguese, we found a value of 1.52 bits in analyzing the information content of a poem [30].
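
To make the letter-frequency effect concrete, the following sketch (ours; the sample string is an arbitrary stand-in for a real English corpus, so the exact figure is only illustrative) compares the 27-symbol maximum with the first-order entropy of an actual character stream:

```python
import math
from collections import Counter

def entropy_bits_per_symbol(text):
    """First-order Shannon entropy of a character stream: S = sum_i p_i log2(1/p_i)."""
    counts = Counter(text)
    total = len(text)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

sample = "a stream of data in ordinary language is less than random"

print(round(math.log2(27), 2))                    # ~4.76 bits: 27 equiprobable symbols
print(round(entropy_bits_per_symbol(sample), 2))  # lower: unequal frequencies alone reduce S
```

Note that this captures only the nonuniform p_i (the ΔS_ms component); the correlations, grammar, and redundancy that take English down to about 1.5 bits per letter would require higher-order statistics.
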
The quantity ΔS_si = ΔS_ms + ΔS_ss is the total information stored in the structure, and in the case of language it represents the knowledge base of the language. The remaining 1.5 bits is a measure of the variability, the true information, and accounts also for differences of writing style and creativity. It is the room for poetry and innovation. We can speak of I_a = S_max − ΔS_si as the active information, or, making use of our available 1.5 bits, the "leeway" information, in the sense that this available information functions as the "margin of maneuver" for mutation and change. The main point of this discussion is that the departure from maximum entropy as expressed previously by Eq. (5) contains two kinds of information: the stored information, fixing knowledge and granting the structure strength, and the active information, through which evolution unfolds. The general scheme of this information distribution is depicted in Fig. 4 and will serve us as the cybernetic framework for our discussion of long-term socioeconomic cycles.

Fig. 4. Cybernetic framework representing the departure from maximum entropy of a system due to information components.

10. Entropy and learning dynamics

With regard to the three questions posed previously, the first one (which structure) has already been discussed. We are talking about structures arising from information exchange, which form our cybernetic framework. Now we shall discuss the system dynamics that will allow us to address the two remaining questions. The schematic represented in Fig. 4 has a universal feature and can be applied to any system as a thought tool of general systems theory. Structure, information, learning, and entropy are concepts pertinent to all classes of things. One can argue that this is an oversimplification, because systems in the real world are far more complex. Indeed, because a social system is a highly complex structure, we could think in terms of several structuring levels,
with several kinds of relationships between levels, and so on. But we must consider that the most complex structure we know, the universe itself, is commonly represented as an expanding circumference on a sheet of paper, filled with different sorts of objects. Such illustrations abound in physics books. The objective is simple: to draw some basic conclusions about a macrostructure resulting from microlevel interactions, within which islands of increasing order are surrounded by a sea of increasing disorder and entropy. Simple representations allow speculations about very basic principles underlying and driving complex phenomena. "Complexification" comes later, when we need to understand and describe pertinent details. The objective of our current discussion is not very different, and the present framework will help us to speculate about some general principles. It is a good basis for reasoning about and interpreting socioeconomic systems.

The object of our analysis is the existing worldwide socioeconomic system, with its nonuniformity and strength contributing to a given established structure, translated by a stored information ΔS_si that represents the departure from maximum entropy. Let us call the gap I_a (active information) the process field or "leeway" field, where interactions at the microlevel take place and information is continuously processed. Although the representation of Fig. 4 may seem a static picture, it is not. Through information exchange at the microlevel, knowledge is being generated and added to the level above, contributing to a further strengthening of the structure. This happens at the expense of destroying information in the process field, which in turn implies a further departure from maximum entropy of the given structure. If the system were a closed one, we would have a single process, which would finish when all available I_a was exhausted. But considered as an open dissipative system, we have a dynamic process, through which active information is simultaneously created and destroyed. We can think then of a moving picture in which the gap I_a is not constant, but changes with time. Let us call this rate dI_a/dt, resulting from the balance between (dI_a/dt)_C and (dI_a/dt)_D, where the indexes C and D stand for creation and destruction. Thus (Eq. (7)):

$$\frac{dI_a}{dt} = \left(\frac{dI_a}{dt}\right)_C - \left(\frac{dI_a}{dt}\right)_D. \qquad (7)$$

If there were a perfect balance between the creation and destruction of information, the result would be a static picture, with dI_a/dt = 0. This would be the case when observing language as the order parameter, which, as we have mentioned, changes only little over the life span of an individual. In a dynamic process, we can have (dI_a/dt)_C > (dI_a/dt)_D or (dI_a/dt)_C < (dI_a/dt)_D, resulting in dI_a/dt > 0 or dI_a/dt < 0, the first case corresponding to increasing order (and decreasing entropy), and the second corresponding to decreasing order (and increasing entropy). Such an entropy dynamic could be visualized as a moving picture in which the gap I_a appears to oscillate in its height.

Open dissipative systems exhibit such oscillatory behavior, the nature of which may be limit-cycle or chaotic.


As Hilborn [31] pointed out, two apparently contradictory statements can be found in the literature about the behavior of chaotic systems: chaotic systems create information, and chaotic systems destroy information. As we shall see in the following sections, both statements are correct. As we know from chaos theory, a chaotic system is characterized by the exponential divergence of initially close trajectories in phase space. This divergence implies an increasing uncertainty and loss of information about the system, which is said to have positive Kolmogorov entropy, that is, a positive production rate of entropy within the system. Chaotic attractors are said to be generators of information in the sense that they generate uncertainty. But generating information within the system means a loss of information for the observer outside the system. This balance between losing and gaining information, or better, between increasing and decreasing entropy, will help us to interpret the unfolding of things in the process field, but it is not enough for understanding the dynamics. To complete this analysis, we must further consider the rates of the processes taking place at the microlevel.

Let us look then at Fig. 4 as representing a snapshot of the information entropy distribution of a given technoeconomic system at a given time. The microlevel interactions taking place in the process field are constantly generating new knowledge, which, before being incorporated into the level above (strength of the structure), acts as an order parameter (unstable modes), gradually enslaving the remaining stable modes and generating a new structure. This order parameter is an emerging technoeconomic system, or technosphere, which, as we will demonstrate, is chaotic at its beginning. At this stage, information entropy increases within the system, corresponding to the innovation phase of the emerging technosphere. As we shall see in the next section, as time progresses and this new technoeconomic system evolves, the rate of entropy production changes sign, whereupon an ordered growth process starts. As time progresses further, the process reaches its ceiling and a new technosphere emerges, in an ever-repeating oscillatory process. In other words, during the entire process, the information about the system as expressed by I_a will oscillate, first decreasing and then increasing, completing an entire life cycle. To demonstrate this, and to describe the unfolding and duration of this life-cycle and oscillatory phenomenon, we must look at the rates of entropy production and information exchange in the active information field.

To respond then to the second question, "What is the discrete logistic growth variable?," we must seek a suitable variable describing a socioeconomic system, and the answer is quite simple: people. The development of the logistic function expressed by Eq. (2) goes back to 1846, when the Belgian mathematician Pierre François Verhulst (1804–1849) questioned Malthus' exponential growth equation and proposed that, because of limited resources, the rate of growth of a population could not continue to be exponential. This concept was further developed for the rate equations of competing species in 1925 by the American biophysicist Alfred J. Lotka (1880–1949) (dealing with autocatalytic processes) and in 1926 by the Italian mathematician Vito Volterra (1860–1940) (regarding fish catches in the Adriatic Sea). The American zoologist Raymond Pearl (1879–1940) definitively established the usage of the logistic equation for population studies.
Individuals are the "interactors" processing and exchanging information in the active information field, dissipating energy, generating knowledge, and learning to deal with the new environment corresponding to an emerging technoeconomic system. A simple and
suitable macroscopic variable to describe the system is then the increasing number N of individuals economically participating in the new system, consuming and using the new set of basic innovations, and being integrated into the emerging production system. This increasing number of individuals is not due simply to a growing world population. In the time frame of a long wave, comprising approximately two effective generations, the rate of this increase (presently ≈ 1.4%/year), although contributory, is irrelevant compared to the rate at which people adopt the innovations and enter the new economic structure. It will be useful for further development in this paper to distinguish between N+ (individuals entering the new system, by birth or adoption) and N− (individuals leaving the old system, by death or abandonment). We can then consider a transfer rate of people abandoning the previous technoeconomic system and entering the new one, being incorporated into new jobs and new forms of organization. The change (N+, N−) is then a cooperative effect.

As a first approach, let us consider the variable N as the increasing population filling the niche M of a new technoeconomic system or technosphere, resulting from people who leave the previous system plus youthful individuals entering the production and consumption system for the first time. More precisely, we will consider the fraction f = N/M as our growing variable, translating the expansion of the new technosphere and carrying within it all other possible economic or physical variables. As already shown by Devezas and Corredine [8] and commented on earlier in this paper, this growth is driven by the product of the control parameters d·t_G, where d, the diffusion-learning rate, is the cognitive biological determinant, the rate at which individuals collectively learn to deal with the new environment, and t_G is the effective generational determinant, consisting of biologically based rhythms. We have further demonstrated that there is strong empirical evidence that d functions as an aggregate learning rate for basic innovations and leading technologies on the order of 0.13–0.16/year.

The parallel with language and the use of the general scheme of Fig. 4 serves in our analysis of the socioeconomic system. If all individuals were completely independent and/or completely equivalent, we would have a maximum entropy S_max = ln M (we drop the base 2 logarithms, since we are not working in a binary system, and work instead with the more universal natural logarithms used in Boltzmann statistical thermodynamics). Due to the very human and natural nonuniformity (the simplest being leaders and followers), social differences, and a series of structuring factors (organizations, nations, social and economic rules, etc.), a departure from maximum entropy can be subsumed as a total stored information ΔS_si, which leaves a "leeway" information field that humankind has for interaction, innovation, and evolution. Noteworthy in this framework is that, as in the case of our analysis of language, what is relevant is not the individual characteristics, but the mutual interactions and the rules and constraints driving them. As Feynman et al. [32] once insightfully stated, order is a memory of the conditions when things started. This statement can well be applied to an emerging technoeconomic system soon after its initial entrenching.
The natural one-way irreversible path toward the future is that of decreasing order, which in our present case can be seen as the result of the increasing number of "interactors" contributing to increase the "degeneracy" of the system, or, in other words, to increase the probability of microstates accessible to the system. As we have seen, this
probability increase is a measure of uncertainty and loss of information about the system. Considering that we have a growth process filling a finite niche, the key question, related to the third question posed previously, is then: Why does a new technoeconomic system emerge even before the previous system has reached its ceiling? To address this last question, we must combine the logistic growth of the number of individuals with the statistically distributed entropy change of the system.

11. Logistic growth and entropy production

As already pointed out, an increasing number of individuals ("interactors") means an increase in the probability of microstates (degeneracy) accessible to the system. This implies an increase in Shannon's uncertainty, or information entropy, for each individual entering the system, which in turn leads to a decrease of our information about the system following Eq. (5). Thus, our reference frame is that of an observer gaining or losing information about the system. Each individual entering the system contributes to the entropy by S_i = −ln f_i, where f_i = N_i/M expresses the probability of each entrance. Hence, the entropy for the whole system at a given size is given by:

$$S(f) = \langle S(f_i) \rangle = -f \ln f = f \ln(1/f) \qquad (8)$$

where the brackets mean that we average the sum over all individuals and f is taken from Eq. (2) using:

$$f = \frac{N}{M} = \frac{1}{1 + e^{-d(t - t_0)}}. \qquad (9)$$

For the sake of simplicity, we will rewrite Eq. (9) as:

$$f = \frac{1}{1 + b\,e^{-dt}} \qquad (10)$$

where b = e^{dt_0} can be determined from the initial value f(0) = 1/(1 + b) at t = 0. A hypothetical S-shaped long wave for a typical growth rate d = 0.16 is depicted in Fig. 5. The entropy function expressed by Eq. (8) and appearing in Fig. 6a has the important geometrical property of presenting a maximum at f = 1/e. Fig. 6b presents the first derivative of this function, where we can note that dS/df > 0 for f < 1/e and dS/df < 0 for f > 1/e. This derivative expresses the rate of entropy change at each entrance.
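
A minimal sketch of Eq. (10) (ours; the initial fraction f(0) = 0.01 is an arbitrary assumption used only to fix b) shows how the growth rate d = 0.16 stretches the S-curve over roughly the five to six decades of a long wave:

```python
import math

def logistic_fraction(t, d=0.16, f0=0.01):
    """Eq. (10): f(t) = 1 / (1 + b e^{-d t}), with b fixed by the initial value f(0) = f0."""
    b = (1.0 - f0) / f0                 # from f(0) = 1 / (1 + b)
    return 1.0 / (1.0 + b * math.exp(-d * t))

for t in range(0, 61, 10):              # t in years
    print(t, round(logistic_fraction(t), 3))
```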


Fig. 5. Hypothetical S-shaped long wave for a typical growth rate d = 0.16.

Now let us examine the rate of entropy change as a function of time, a mathematical function that leads us to the concept of the Kolmogorov entropy, that is, the rate of change of entropy as the system evolves. As we have seen, information is lost as entropy increases. The Kolmogorov entropy is the measure of information change in phase space. To perform its calculation, the phase space is divided into a large number of cells of size L, and measurements are taken at time increments τ. The simplest definition of the Kolmogorov entropy is [31] (Eq. (11)):

$$K_n = \frac{1}{\tau}\,(S_{n+1} - S_n) \qquad (11)$$

that is, K_n is the rate of change in going from t = nτ to t = (n + 1)τ. When working in phase space, the Kolmogorov entropy K is determined by calculating the limit of this function for τ → 0 and L → 0, taken over all cells in phase space. This approach permits us to determine whether initially close trajectories diverge in phase space as the system evolves. A positive K signals that the trajectories diverge exponentially, which is, as previously analyzed, a condition of chaotic behavior. For our present purposes, we can consider the increment S_n → S_{n+1} as the individual (discrete) contribution to entropy and average the sum of K_n over all individuals, writing (Eq. (12)):

$$K = \left\langle \frac{dS(f)}{dt} \right\rangle. \qquad (12)$$
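
The discrete form of Eq. (11) can be exercised directly on the logistic entropy of Eq. (8); in this sketch (ours, with an illustrative τ of one year and the same assumed b as above), the sign change of K_n on crossing f = 1/e is already visible:

```python
import math

def f_logistic(t, d=0.16, b=99.0):
    """Eq. (10) with an illustrative b = 99, i.e., f(0) = 0.01."""
    return 1.0 / (1.0 + b * math.exp(-d * t))

def S(f):
    """Eq. (8): entropy per 'interactor', S(f) = f ln(1/f)."""
    return f * math.log(1.0 / f)

tau = 1.0                                                     # one-year measurement increment
for t in range(0, 60, 10):
    K_n = (S(f_logistic(t + tau)) - S(f_logistic(t))) / tau   # Eq. (11)
    print(t, round(f_logistic(t), 2), round(K_n, 4))          # K_n > 0 early, < 0 past f = 1/e
```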


Fig. 6. (a) Dependence of the entropy function (normalized) on the value of the probability f, as expressed by Eq. (8). (b) First derivative of the entropy function represented in (a), which expresses the rate of entropy change as a function of the probability f.


Fig. 7. Rate of entropy change (entropy production) as a function of the logistically growing probability f for different values of the growth parameter d.

This derivative (Eq. (13)) can be obtained from Eq. (8):

$$\frac{d}{dt}\left[f \ln(1/f)\right] = -\frac{df}{dt}\,\left[1 + \ln f\right]. \qquad (13)$$

For the logistic growth we have, from the Verhulst equation (Eq. (1)), df/dt = d f (1 − f), and hence:

$$K = -d\,f\,(1 - f)\,(1 + \ln f). \qquad (14)$$

The behavior of K for different values of d is depicted in Fig. 7 as a function of f. As we can see, K > 0 for f < 1/e and K < 0 for f > 1/e. Moreover, the value of d influences the amplitude of the function, that is, it amplifies or reduces the intensity of the rate of entropy change (entropy production). Before drawing conclusions based on this result, let us examine the behavior of S(f) and K as functions of time, that is, their temporal evolution. Combining Eqs. (8) and (10), we obtain Eq. (15):

$$S(t) = \frac{\ln\!\left(1 + b\,e^{-dt}\right)}{1 + b\,e^{-dt}}. \qquad (15)$$


Fig. 8. Temporal evolution of the information about the system as expressed by Eq. (15) for the hypothetical long wave represented in Fig. 5. The shadow gradient represents the gradual transition from an ordered (dark, lower entropy) state to a disordered (light, higher entropy) state.

Combining Eqs. (14) and (10), we have for the entropy production:

$$K = \frac{d\,b\,e^{-dt}}{\left(1 + b\,e^{-dt}\right)^{2}}\,\left[\ln\!\left(1 + b\,e^{-dt}\right) - 1\right]. \qquad (16)$$

Assuming that I_a = ln M − ΔS_si = −S(t), we obtain for the temporal evolution of the information about the system the curve shown in Fig. 8 (normalized), using d = 0.16; that is, the unfolding of I_a for the hypothetical S-shaped long wave represented in Fig. 5. Fig. 9 shows the temporal evolution of K for the same hypothetical long wave. It is noteworthy that all these results concern the behavior of the entropy function and entropy production of a logistic growth process, as a consequence of imposing the logistic equation as the probabilistic function. Two main points stand out from this set of equations and graphs, arising from the mathematical properties of the probabilistic function (Eq. (8)) depicted in Fig. 6a:

1. there is a clear turning point in the entropy growth and in the rate of entropy change at f = 1/e ≈ 37%;
2. the rate of entropy change is positive for f < 1/e, peaks at f ≈ 12%, and turns negative for f > 1/e, peaking again at f ≈ 72%.
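
Both landmark values can be checked numerically from Eq. (14) alone; the following sketch (ours) scans f over (0, 1) for the zero and the two extrema of K:

```python
import math

def K(f, d=0.16):
    """Eq. (14): K = -d f (1 - f)(1 + ln f)."""
    return -d * f * (1.0 - f) * (1.0 + math.log(f))

fs = [i / 10000.0 for i in range(1, 10000)]
zero = min(fs, key=lambda f: abs(1.0 + math.log(f)))   # K = 0 exactly where ln f = -1
peak_pos = max(fs, key=K)                              # strongest positive entropy production
peak_neg = min(fs, key=K)                              # strongest negative entropy production

print(round(zero, 3), round(peak_pos, 3), round(peak_neg, 3))
# ~0.368 (= 1/e), ~0.122, ~0.715 -- i.e., roughly the 37%, 12%, and 72% marks
```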


Fig. 9. Temporal evolution of the rate of entropy change (entropy production) as expressed by Eq. (16) for the hypothetical long wave represented in Fig. 5. The shadow gradient here represents the gradual transition from an entropic region (positive entropy production) to a negentropic region (negative entropy production).

Based on these two observations, let us examine and speculate about the possible dynamics of the unfolding of a long wave, keeping in mind that our framework is that of an existing "leeway" field of available information, within which information is generated or destroyed as a logistically growing number of individuals enters the technoeconomic system.

12. The nonlinear dynamics of a long wave

From the first point above, we see that the process of entropy change is not one of steady growth, but includes a turn to diminution after peaking at f = 1/e. This strongly suggests the existence of a threshold value in the number of "interactors," above which widespread cooperation emerges, giving rise to new kinds of structures or substructures within the system. It is well known from mathematical games that the existence of such threshold values
is related to the ratio between the number of entities and the links among them, as in the case of the "random graphs" described by Kauffman [33]. A random graph consists of a set of dots, or nodes, connected at random by a set of lines, or edges. One begins by choosing two nodes at random and connecting them with a line, and continues progressively to do this. At first, one will certainly connect nodes that were not connected before. After a while, however, one is likely to pick a pair in which one node is already connected, and clusters of three interconnected nodes begin to form. In short, as one continues to choose random pairs of nodes to connect with a line, the nodes start becoming interconnected into larger clusters. As the ratio of lines to nodes continues to increase, the size of these clusters tends to grow, and obviously, as clusters get larger, they begin to become cross-connected. The important feature of random graphs is their very regular statistical behavior as one tunes the ratio of lines to nodes. In particular, a kind of "phase transition" occurs when this ratio passes 0.5, the point at which a "giant cluster" suddenly forms (a simulation sketch of this transition is given below). This simple example is clear evidence that with an increasing number of "interactors" (that is, nodes plus lines), a threshold exists above which a new structure emerges. Kauffman further points out that the curve of the "size of the largest cluster" versus the "ratio of lines to nodes" is an S-shaped curve, and states that this is the kind of phase transition that led to the origin of life [33].

In our case, this can be interpreted as follows. At the beginning of the process, as new individuals enter the system, entropy increases, implying a progressive information loss about the system, as translated in Fig. 8. After overcoming a threshold value corresponding to f ≈ 37%, emergent unstable structures enslave the remaining stable modes and consolidate, giving rise to a widespread new structure, which will grow, thereby decreasing uncertainty and diminishing the overall entropy. Overcoming this point corresponds to entering the quasilinear stretch of the sigmoid curve. The point f = 1/e means a threshold of stability, which, as we will see, acts as a transitory point attractor for the whole system.

But it is from the second point, regarding the behavior of K, that the most interesting conclusions and details about the underlying dynamics can be drawn. As we have seen through Figs. 7 and 9, the rate of entropy change presents a "wavy" aspect, being positive for f < 1/e, peaking at f ≈ 12%, then becoming negative for f > 1/e, peaking again at f ≈ 72%. This implies that the entropy production during a logistic growth process presents a four-phased behavior. The positive portion of the function K means that the logistic growth process is chaotic at the beginning, with a maximum at 12% of the complete growth. The process then leaves the chaotic domain at 37% of complete growth and smooths out, corresponding to the quasilinear portion of the logistic curve, and after 72% of the complete growth it again reaches an instability point, seeking the direction of the chaotic domain but not reaching it, due to the depletion of the growth process.
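
Returning to Kauffman's random-graph picture, the promised simulation sketch (ours, a toy Monte Carlo using a union-find structure, not Kauffman's own code) reproduces the sudden appearance of the giant cluster near a lines-to-nodes ratio of 0.5:

```python
import random

def largest_cluster_fraction(n_nodes, n_edges, seed=1):
    """Grow a random graph edge by edge; return the largest cluster's share of the nodes."""
    rng = random.Random(seed)
    parent = list(range(n_nodes))

    def find(x):                                  # union-find root with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for _ in range(n_edges):                      # connect randomly chosen pairs of nodes
        parent[find(rng.randrange(n_nodes))] = find(rng.randrange(n_nodes))

    sizes = {}
    for x in range(n_nodes):
        root = find(x)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) / n_nodes

n = 20000
for ratio in (0.2, 0.4, 0.5, 0.6, 0.8, 1.0):      # ratio of lines (edges) to nodes
    print(ratio, round(largest_cluster_fraction(n, int(ratio * n)), 3))
# the largest cluster stays marginal below ~0.5 and then grows in an S-shaped fashion
```
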
This result matches in some measure the observations of Modis and Debecker [34], who found, putting the solution of the logistic growth curve into a discrete form, that chaos-like states associated with instabilities in population growth arise at the beginning and at the end of the growth process. The four-phased behavior pointed out above had already been suggested by De Greene [35], who used a similar "information" approach, but without a mathematical demonstration.


He stated: "The overall Kondratiev system would evolve in the following manner: from chaotic attractor to limit-cycle attractor to point attractor to limit-cycle attractor to chaotic attractor, with successive growth of exhaustion of information and innovation." This statement matches as well the behavior of the entropy function and entropy production depicted in the graphs of Figs. 7, 8, and 9, as we will try to describe below.

It is important to point out that most descriptions of long waves analyze their unfolding by considering the overlapping of two succeeding waves. We did so when introducing the "Generational-Learning Model" in Ref. [8], considering that the conditions to innovate and to start a new wave originate during the recession-depression phase of the preceding and exhausted technoeconomic system. The same overlapping approach was used by De Greene [35]: "It is tempting, therefore, to divide the Kondratiev cycle/structure into four epochs that are offset somewhat from the four phases, with chaotic attractors associated with innovation in the late depression and recovery phases, limit-cycle attractors and oscillatory behavior associated with both the early prosperity and late recession phases, and point attractors and equilibrium-seeking behaviors characterizing the late prosperity and early recession phases around the inflection point of the logistic curve." What is new in the present discussion is that we do not need to consider the overlapping. The specifics of the unfolding process, with the threshold limits and the "necessity for renovation," are completely contained in the presented mathematical approach, using the proposed cybernetic framework.

Fig. 10. (a) Information life cycle for the hypothetical long wave depicted in Fig. 5. (b) Four-phased behavior during the two-cycle unfolding of the hypothetical long wave depicted in Fig. 5.



We can then combine this framework with the "Generational-Learning Model" represented in Fig. 3 and De Greene's description, dividing the entire process into two cycles, each of which is divided into two phases:

• Innovation cycle: Corresponding to K > 0 and peaking at f ≈ 12%, presenting two phases. Phase I (for f < 12%), with a chaotic attractor and increasing entropy production (and hence an increasing rate of information loss), the system exhibiting wide fluctuations, with the appearance of innovators and of technological and social innovations. This is the period of the very emergence of a new technoeconomic system, pushed by the "unlucky generation" of new knowledge innovators (see Ref. [8] for a detailed analysis of the generational succession). Making use of Schumpeter's conception, we can say that this initial increase in the number of "interactors" translates as the emergence, from outside the established structure, of the "new men." Then follows Phase II (for 12% < f < 37%), with a limit-cycle attractor and decreasing entropy production, the system "selecting" the innovations and innovators that will prevail. Fluctuations diminish in intensity as the "selected" innovations and innovators, generating the unstable modes, begin to develop new structures and enslave the remaining stable structures. The system then crosses a point attractor, corresponding to the threshold point f = 1/e ≈ 37% (not exactly matching the inflection point of the logistic curve as suggested by De Greene, but shortly before), and starts a consolidation cycle.
• Consolidation cycle: Corresponding to K < 0 and peaking at f ≈ 72%, also presenting two phases. Phase III (37% < f < 72%), with a limit-cycle attractor and a continuing decrease of entropy production. This is the period of the real entrenching of the new technoeconomic system and the new prosperity, led now by the "lucky generation" of knowledge consolidators [8], who succeed the innovators' generation. Disorder and entropy are decreasing, and hence information about the system is increasing, as the new cognitive space is explored. Knowledge is being incorporated into the system's structure, gradually reducing the "leeway" field of available information. The point f = 72% marks a new threshold, when the entropy production ceases to diminish and reverses direction, beginning to grow again. Phase IV (f > 72%) starts with a new kind of instability, signaling the depletion of information in the "leeway" field and, most important, signaling the necessity of renewal, of regenerating the "leeway" information necessary for the maintenance and evolution of the system. The number of "interactors" has reached saturation in filling the finite niche, whose ceiling is imposed by the finite quantity of available information.
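
The two cycles and four phases above reduce to simple inequalities on the completed fraction f; a compact lookup (ours, with the thresholds taken from the analysis above) makes the classification explicit:

```python
import math

F_STABILITY = 1.0 / math.e     # ~0.368: the transitory point-attractor threshold

def long_wave_phase(f):
    """Classify the completed fraction f of the logistic growth into the four phases."""
    if f < 0.12:
        return "Phase I: chaotic attractor, rising entropy production (innovation)"
    if f < F_STABILITY:
        return "Phase II: limit cycle, falling but still positive entropy production (selection)"
    if f < 0.72:
        return "Phase III: limit cycle, negative entropy production (consolidation)"
    return "Phase IV: instability, depletion of the 'leeway' field (renewal)"

for f in (0.05, 0.25, 0.55, 0.90):
    print(f, "->", long_wave_phase(f))
```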

The answer to the third question posed previously ("Why does a given structure disappear and get replaced?") is then the need for replacement of available information, the natural raw material "interactors" need to continue their collective evolutionary path. Despite being a region of instability, the attractor in action is a point attractor: the depletion of the process. De Greene's chaotic attractor, or Modis' chaos-like behavior at the end of growth, is not due to the intrinsic depletion of the growth process, but to the emergence of a new technoeconomic system, bringing the necessary fresh information (through innovations and innovators). Fig. 10a and b summarizes the general scheme described above for the same hypothetical long wave depicted in Fig. 5.

To finalize this analysis, it is important to consider the role played by the logistic rate constant d in the behavior of the entropy production rate K, shown in Fig. 11. This figure illustrates for comparison the behavior of K for long waves with four different values of d, namely 0.3, 0.2, 0.16 (our hypothetical long wave), and 0.10. We have seen in Fig. 7 that the value of d influences the amplitude of the function, amplifying or reducing the rate of change K. As we can see from Fig. 11, the value of d (in its action on Eq. (16)) not only influences the amplitude, but also shortens or stretches the length of the curve.


Fig. 11. Temporal evolution of the rate of entropy change (entropy production) as expressed by Eq. (16) for different values of the control parameter d.

As already discussed, through the relationship between the continuous and discrete logistic equations, we have shown [8] that the product of the control parameters d (the diffusion-learning rate or cognitive determinant) and t_G (the effective generational determinant) must be kept in the range 3 < d·t_G < 4, the deterministic chaos domain, in order to guarantee the necessary oscillations and the survival of the system. From the pure mathematical behavior of a logistic curve, we know that the numerical value of the rate constant d coincides with the value of the real growth rate in the region 10% < f < 12%, that is, the region at the maximum of chaotic behavior. In other words, d represents numerically the value of the real growth rate when the system is under the action of the chaotic attractor. Considering that t_G is about 25 years, the diffusion-learning rate must range between 13% and 16% to match the interval 3 < d·t_G < 4. Observing the curves in Fig. 11, we see that d = 0.16 is the correct value for a long wave with an overall duration of 55–60 years. This is further evidence that the biological determinants d and t_G are the real control parameters of the whole process. As t_G is a human biological rhythm of about 25 years, the rate d cannot be very different from 0.16. If greater (see the curves for 0.2 and 0.3), this would imply an overshoot in entropy production and reaching the threshold point (f = 37%) in a time shorter than a generation interval. If smaller (curve for 0.1), the intensity of entropy production is very low and the time to reach the threshold point would be much longer than a generation interval. Applying the rate parameter k = d·t_G to the logistic recursive equation (Eq. (3)): if d > 0.16, we have an overshoot of the chaos upper limit k = 4, implying breakdown of the system; if d is very small, allowing values of k < 3, we have convergence to a fixed point and then collapse of the system due to a paralyzing equilibrium.


A value of d of about 0.16 establishes the time to reach the threshold point as coincident with the length of a generation. This is the biologically controlled rate of transfer of information among humans at the aggregate level, which matches well the empirical data for the diffusion rate of basic innovations in the body of society, as already pointed out by Devezas and Corredine [8].

Concluding, the existence of the long-wave phenomenon can be explained through the life cycle of information being created and destroyed by an increasing number of human "interactors" exchanging and processing information in a finite "leeway" field. The timing and duration of the phenomenon are determined by biological control parameters acting at the micro- and macrolevels: at the microlevel, by driving the rate of exchange of information among humans, and at the macrolevel, by constraining the rate of transfer of information between successive generations.
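
The constraint 3 < d·t_G < 4 can be seen directly by iterating the logistic recursive map (Eq. (3)); in this sketch (ours) x_{n+1} = k x_n(1 − x_n) is run for three values of k = d·t_G, showing the fixed point, the oscillation, and the chaos referred to above:

```python
def late_orbit(k, x0=0.3, transient=500, keep=6):
    """Iterate x_{n+1} = k x_n (1 - x_n); discard transients and return a few late iterates."""
    x = x0
    for _ in range(transient):
        x = k * x * (1.0 - x)
    orbit = []
    for _ in range(keep):
        x = k * x * (1.0 - x)
        orbit.append(round(x, 3))
    return orbit

for k in (2.8, 3.2, 3.9):   # k = d * tG; with tG ~ 25 years, d = 0.13-0.16 keeps k inside (3, 4)
    print(k, late_orbit(k))
# k = 2.8: convergence to a fixed point (paralyzing equilibrium);
# k = 3.2: a stable 2-cycle (oscillation); k = 3.9: chaotic wandering
```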

13. Further considerations

Our cybernetic framework is intended as a consistent theoretical basis for the discussion and further modeling of recurrent phenomena in the social realm. More concretely, we chose one of these recurrent phenomena, namely long waves in socioeconomic growth, commonly labeled K waves, and propose to describe their unfolding by looking at the time-dependent rate of information entropy change due to an increasing number of interacting individuals entering the technoeconomic system. A technoeconomic system, or technosphere, is to be understood as an order parameter consisting of a given "swarm" of basic technological innovations that emerges in society at a given time, originating new activities, new jobs, and new industrial branches, and introducing new habits in society as a whole. The time evolution of such a technoeconomic system can be described discretely by the logistically increasing number of "interactors" adopting this emerging set of new things: innovations, jobs, activities, habits, etc. Using the logistic function as the probabilistic distribution of individuals exchanging and processing information in a finite niche of available information, it is possible to demonstrate that the rate of information entropy change exhibits a "wavy" aspect evidenced by a four-phased behavior denoting the unfolding of a complete long wave. This four-phased behavior contains a chaotic regime at the beginning, a limit-cycle-to-point-attractor-to-limit-cycle regime in the middle part, and finishes as a point attractor consisting of the exhaustion of available information, all comprising an entire information life cycle. This dynamic process can be divided into two cycles, an innovation cycle and a consolidation cycle, separated by a clear threshold point marking the transition from a disordered (growing information entropy) regime to an ordered (decreasing information entropy) regime. This transition coincides with the end of the recession-depression phase and the start of the recovery phase of a Kondratieff wave. Our mathematical result seems to match very well other models and descriptions of long waves found in the literature, as well as the empirical evidence accumulated in the observation of this phenomenon.

Another conclusion to be drawn from the present theoretical framework is that the technoeconomic system is not a purely chaotic system.


The chaos-like behavior appears only in the first phase of the process, where an increasing entropy production is observed. The resultant of the dynamic four-phased behavior of a long wave is a limit-cycle behavior driven by the exhaustion, and the necessity for replacement, of available information. Two biological control parameters, one cognitive and one generational, set the pace of information exchange and information replacement, resulting in a strict cycle behavior with a period of 50–60 years. The extreme stability of the patterned regularity observed for the last two and a half centuries (or throughout the millennium, as suggested by Modelski and Thompson [14] and Modelski [15–17]) is strong evidence that we are indeed confronting a limit-cycle behavior.

In the present mathematical handling of the logistic function expressed by Eq. (2), we consider only the increasing number N of individuals economically participating in the new system. An alternative, and a possible further improvement of this approach, would be to consider a rate of transfer of people abandoning the previous technoeconomic system and adopting the new one. The old system could then be described by a decreasing number N− of people leaving it by death, retirement, or adoption of the new system, and the new system could be described by the increasing number N+ of people entering it as a first adoption (young people) or entering it after changing sides. With regard to the description of the old system losing people, it is worthwhile to mention the interesting results obtained by Carvalho-Rodrigues [36] and Carvalho-Rodrigues et al. [37] in assessing a system's degradation due to fatalities in combat and in epidemics, respectively. Carvalho-Rodrigues [36] was the first to make use of the asymmetric property of Shannon's entropy function (as depicted in Fig. 6a), but the other way around, looking at the loss of cohesion of an army due to fatalities and at the loss of social cohesion due to epidemics. Making use of the concept of cohesive entropy, he proposes a cohesion function complementary to informational entropy, with which it is shown that there is a breakdown in the system's cohesion when the fatalities reach the fraction f = 1/e ≈ 37% of the initial number of components. In both publications, the authors point to a broad agreement with historical data from past battle experience and from mortality in plagues in medieval Europe. We believe that the same approach could be used in analyzing the loss of cohesion of a technoeconomic system due to a diminishing number of members simultaneously with the emergence of a new technoeconomic system with an increasing number of members. Such an approach could better describe the overlapping of two successive waves.

Perhaps the most important question to be addressed in this context is: why are things so? Why the observed patterned behavior in human affairs, as depicted in Fig. 1? Or why the successive depletion and replacement of information, repeating at regular intervals? We are most probably dealing with the grand scheme of things, and perhaps we can shed some light in these terms. At the present stage of its evolution, the universe (or, perhaps better, our present knowledge about it) is an expanding system with increasing overall entropy. An outlook shared by modern cosmologists, as pointed out by Layzer [38], is that the ultimate cause of order in the universe is the cosmic expansion, which gives rise to two distinct kinds of order, chemical order and structural order.
But attaining order is a very complex process that demands efficient strategies to realize it in such a way that the overall entropy continues to grow. One of these strategies is to stop entropy growth temporarily in some restricted areas, within which energy is dissipated at higher rates. This is the efficiency strategy. Local order generates great disorder in the environment.


Dissipative structure theory considers energy dissipation as the driving force for the evolution of open systems; that is to say, open systems evolve and maintain structure by exporting entropy to their environment. This is the fundamental strategy of the universe that led to the origin of life: replicating ordered structures is the most efficient form of increasing the entropy of the surroundings, since they are high-negentropy consumers. In this corner of the known universe, the human brain is the most efficient negentropic machine. Social systems, which represent the collective efforts of human beings, mimic organic systems in the strategy of stopping entropy growth locally and temporarily. They can only do so, however, by hastening the entropy of the natural environment at large. Prigogine's entropy balance equation can help here to understand our reasoning [39]. Whenever there is a flow, some kind of change occurs: physical, chemical, or socioeconomic. These changes are described by the so-called equations of change, or balance equations. For a system exchanging entropy with its surroundings, the net entropy production dS/dt within the system may be expressed by:

$$\frac{dS}{dt} = \frac{dS_I}{dt} + \nabla \cdot \bar{J}_s \qquad (17)$$

where dS_I denotes the entropy change within the system and $\bar{J}_s$ is the entropy flux, that is, the entropy crossing the boundaries per unit time (it is a vector quantity, and hence its dot product with the vector operator nabla yields a scalar quantity). $\nabla \cdot \bar{J}_s$ may be positive, negative, or zero: positive if entropy enters the system, negative if entropy is discharged to the outside, and zero if the system is isolated (or reversible). As we can see, in order to arrive at $\nabla \cdot \bar{J}_s < 0$, dS/dt must be negative and |dS/dt| > |dS_I/dt|. This means that in order to effectively discharge entropy to the surroundings, the net entropy change inside the system must be negative. It is important to remember that Eq. (17) is a thermodynamic equation and does not have a direct equivalence to the information entropy equations we treated before. But the comparison is valid, and we can use it to make dS/dt above correspond to our K. Following the reasoning above, we can say that K < 0 is the most effective way to discharge entropy into the environment, and in fact, observing Figs. 7 and 9, we see that this happens during almost two thirds of the entire process. But, on the other hand, K cannot be negative all the time, because a finite amount of information is being "consumed" and incorporated into the system's structure. Hence renewal is necessary: new information must be "generated," and this happens at the expense of having K > 0 and possibly a lower $\nabla \cdot \bar{J}_s$. This cyclic behavior is part of nature's efficiency strategy.
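
As a small consistency check (our rearrangement of Eq. (17), not a step taken in the original), solving for the flux divergence states the discharge condition explicitly:

$$\nabla \cdot \bar{J}_s = \frac{dS}{dt} - \frac{dS_I}{dt} < 0 \quad \Longleftrightarrow \quad \frac{dS}{dt} < \frac{dS_I}{dt},$$

so that, with the internal production dS_I/dt nonnegative by the second law, a negative net change dS/dt inside the system is indeed sufficient to guarantee that entropy is being exported to the surroundings.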

14. Conclusions

Considering a technoeconomic system as an order parameter consisting of a given swarm of basic technological innovations, and describing its time evolution as a logistically growing
number of interacting individuals entering the system and exchanging and processing information in an active field of available information, it is possible to demonstrate that the rate of entropy change within the system is a four-phased dynamic process. The entire unfolding process can be divided into two cycles, an innovation cycle and a consolidation cycle, separated by a clear threshold point at a fraction f = 37% of the complete growth. This threshold point may be interpreted as a transition from a disordered to an ordered and cooperative growing process, and corresponds to the transition between the depression phase and the recovery phase in the classic descriptions of long waves. The four-phased behavior contains a chaotic process at the beginning, a limit-cycle-to-point-attractor-to-limit-cycle regime in the middle part, and finishes as a point attractor consisting of the exhaustion of available information, all comprising an entire information life cycle. Another important threshold point appears at f = 72% of complete growth, signaling the depletion of information in the active field and the necessity for renewal by replenishing the information necessary for the maintenance and evolution of the system. At this point, the conditions for the emergence of a new technoeconomic system are created, bringing forth the necessary fresh information in the form of a new set of basic innovations.

As a whole, the present theoretical analysis suggests that the technoeconomic system is not a purely chaotic process, but exhibits a limit-cycle behavior, manifest in the phenomenon of long waves, whose basic mechanism is the periodical deployment and filling of information in the active field or "leeway" field. The pace of the process, and hence the duration of the long wave, is determined by two biological control parameters, one cognitive, driving the rate of exchanging and processing information at the microlevel, and the other generational, constraining the rate of transfer of knowledge (information integrated into a context) between successive generations at the macrolevel. Moreover, it is speculated that social systems mimic living systems as efficient negentropic machines, and, making use of Prigogine's entropy balance equation for open systems, it is suggested that their cyclical behavior is probably the best way to follow nature's efficiency strategy.

The proposed cybernetic approach constitutes a purely theoretical framework, and it is our hope that it will serve as a basis for further modeling and empirical work in the research of recurrent phenomena in the social realm. The understanding of the underlying mechanism of long waves, and their mathematical description, as presented here through the rate of entropy change, is a new contribution and a necessary condition for using such periodic social phenomena as efficient forecasting tools for futures studies and for the comprehension of past historical facts.

Acknowledgments

Tessaleno Devezas thanks the Portuguese Fundação de Ciência e Tecnologia, which partially supported (through the Unit of R&D No. 202) the present investigation, and the Fundação Luso-Americana for a travel grant to present a résumé of the present paper at the 21st International Symposium on Forecasting in Pine Mountain, GA, USA, June 17–20, 2001. Both authors thank physicist Nick La Rocca for his personal help and counsel.


References

[1] J.S. Mill, On the definition of political economy, in: J.M. Robson (Ed.), Collected Works. Essays on Economy and Society, vol. 4, University of Toronto Press, Toronto, 1967.
[2] E.N. Lorenz, Deterministic nonperiodic flow, J. Atmos. Sci. 20 (1963) 130–141.
[3] T.Y. Li, J.A. Yorke, Period three implies chaos, Am. Math. Mon. 82 (1975) 985–992.
[4] R.M. May, Simple mathematical models with very complicated dynamics, Nature 261 (1976) 459–471.
[5] N.D. Kondratieff, Die langen Wellen der Konjunktur, Arch. Sozialwiss. Sozialpolitik 56 (3) (1926) 573–609.
[6] G. Mensch, Stalemate in Technology, Ballinger, Cambridge, MA, 1979.
[7] T.C. Devezas, Kondratiev waves revisited, Global Futures Bull. 41 (1997) 5–7.
[8] T.C. Devezas, J.T. Corredine, The biological determinants of long wave behavior in socioeconomic growth and development, Technol. Forecast. Soc. Change 68 (2001) 1–57.
[9] C. Marchetti, Society as a learning system: Discovery, invention, and innovations cycles revisited, Technol. Forecast. Soc. Change 18 (1980) 257–282.
[10] C. Marchetti, Fifty years pulsation in human affairs, Futures 17 (3) (1986) 376–388.
[11] R. Metz, Long waves in English and German economic historical series from the middle of the sixteenth century to the middle of the twentieth century, in: R. Fremdling, P.K. O'Brien (Eds.), Productivity in the Economics of Europe, Klett-Cotta, Stuttgart, 1983, pp. 175–219.
[12] C.H.A. Dassbach, N. Davuytan, J. Dong, B. Fay, Long waves prior to 1790: A modest contribution to the study of long waves, Review XVIII (2) (1995) 305–326.
[13] J.S. Goldstein, Long Cycles: Prosperity and War in the Modern Age, Yale Univ. Press, New Haven, CT, 1988.
[14] G. Modelski, W. Thompson, Leading Sectors and World Powers: The Coevolution of Global Economics and Politics, University of South Carolina Press, Columbia, SC, 1996.
[15] G. Modelski, Evolutionary paradigm for global politics, Int. Stud. Q. 40 (1996) 321–342.
[16] G. Modelski, Democratization in long perspective, Technol. Forecast. Soc. Change 39 (1991) 22–34.
[17] G. Modelski, World system evolution, in: R.A. Denemark, J. Friedman, B.K. Gills, G. Modelski (Eds.), World System History—The Social Science of Long-Term Change, Routledge, New York, 2000, pp. 24–53.
[18] D. Richards, A chaotic model of power concentration in the international system, Int. Stud. Q. 37 (1993) 55–72.
[19] A.B. Çambel, Applied Chaos Theory: A Paradigm for Complexity, Academic Press, San Diego, CA, 1993.
[20] T. Brown, Measuring chaos using the Lyapunov exponent, chapter 3, in: L.D. Kiel, E. Elliot (Eds.), Chaos Theory in the Social Sciences, University of Michigan Press, Ann Arbor, MI, 1996, pp. 53–66.
[21] B.J.L. Berry, Long-Wave Rhythms in Economic Development and Political Behavior, Johns Hopkins Univ. Press, Baltimore, MD, 1991.
[22] B.J.L. Berry, H. Kim, Long waves 1790–1990: Intermittency, chaos, and control, chapter 10, in: L.D. Kiel, E. Elliot (Eds.), Chaos Theory in the Social Sciences, University of Michigan Press, Ann Arbor, MI, 1996, pp. 215–236.
[23] B.J.L. Berry, A pacemaker for the long wave, Technol. Forecast. Soc. Change 63 (2000) 1–23.
[24] B.J.L. Berry, Low-frequency waves of inflation and economic growth: Digital spectral analysis, Technol. Forecast. Soc. Change 68 (2001) 63–73.
[25] F. Carvalho-Rodrigues, J. Dockery, Defining systems based on information exchange: Structure from dynamics, BioSystems 38 (1996) 229–234.
[26] C.E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27 (1948) 379–423, 623–656.
[27] M. Tribus, E.C. McIrvine, Energy and information, Sci. Am. 224 (3) (1971) 179–190.
[28] H. Haken, Synergetics—An Introduction, Springer-Verlag, Berlin, 1978.
[29] J. Singh, Information Theory, Language and Cybernetics, Dover, New York, 1966.
[30] C. Duarte, The informational content in the evolution of design, PhD thesis, in preparation.
[31] R.C. Hilborn, Chaos and Nonlinear Dynamics—An Introduction for Scientists and Engineers, Oxford Univ. Press, New York, 1994.


[32] R.P. Feynman, R.B. Leighton, M. Sands, Lectures on Physics, vol. I: Mechanics, Radiation and Heat, Addison-Wesley, Reading, MA, 1966.
[33] S. Kauffman, At Home in the Universe, Oxford Univ. Press, New York, 1995.
[34] T. Modis, A. Debecker, Chaoslike states can be expected before and after logistic growth, Technol. Forecast. Soc. Change 41 (1992) 111–120.
[35] K.B. De Greene, Field-theoretic framework for the interpretation of the evolution, instability, structural change, and management of complex systems, chapter 12, in: L.D. Kiel, E. Elliot (Eds.), Chaos Theory in the Social Sciences, University of Michigan Press, Ann Arbor, MI, 1996, pp. 273–294.
[36] F. Carvalho-Rodrigues, A proposed entropy measure for assessing combat degradation, J. Oper. Res. Soc. 40 (1989) 789–793.
[37] F. Carvalho-Rodrigues, J. Dockery, T. Rodrigues, Entropy of plagues: A measure for assessing the loss of social cohesion due to epidemics, Eur. J. Oper. Res. 71 (1993) 45–60.
[38] D. Layzer, Cosmogenesis: The Growth of Order in the Universe, Oxford Univ. Press, New York, 1990.
[39] I. Prigogine, I. Stengers, Order Out of Chaos, Bantam Books, Toronto, 1984.