Brownian models of stochastic flow in networks




Abstracts


Two types of models have been developed to make such analysis possible. One type, sometimes called an adjustment process (or message exchange process), focuses on the informational aspects of the economic phenomena and is specified by the following entities: the individual message spaces used by the participants, their response functions, and the outcome function. (The latter translates terminal or equilibrium messages into actions to be taken and/or physical outcomes such as resource allocations.) The second type of model treats the behavior patterns of the participants as determined by rules or norms specifying admissible behavior (strategy domains) and the consequences of actions undertaken by the participants (again, the outcome function). Thus in this approach an economic mechanism is synonymous with what is called a game form (i.e., strategy domains and an outcome function). Given a game form, the observed behavior is determined according to certain postulates concerning choices of actions under uncertainty with regard to the actions of other participants. The most frequently used postulate is that of a noncooperative game, with Nash equilibrium (or one of its refinements) as a solution. While much has been learned about mechanisms of the two types, there remains a gap between the description of the workings of an economy in terms of a mechanism (say, in the sense of a game form) and in terms of institutional structures. The changes mentioned above are formulated in terms more general than those defining a mechanism, while also taking into account issues such as enforcement. Thus we are faced with the problem of how to formulate the notion of an institution (in the sense of a set of forms or rules, not of an organizational entity), and how to relate it to the concept of a mechanism. The present work is intended as a step in this direction.

Brownian Models of Stochastic Flow in Networks. J. Michael Harrison, Graduate School of Business, Stanford University, Stanford, CA 94305-5015, U.S.A.

Beginning with the work of J.R. Jackson around 1960, queueing network theory has developed rapidly over the past 30 years. It is now one of the central topics in stochastic systems theory and may even be comparable to linear programming and multiple regression in the scope and importance of its applications. The motivation for Jackson’s original work was a desire to better understand congestion and delay in job-shop manufacturing. Applications of the theory to the analysis of computer and communication systems rose to prominence in the 1970s, largely at the instigation of L. Kleinrock, and those applications motivated new and important theoretical developments. In the last five years there has been renewed interest in manufacturing systems applications, and it seems likely that new theoretical developments will be spurred by the distinctive characteristics of automated manufacturing. The most influential and most highly developed portion of queueing network theory is that dealing with what are called product-form network models. Roughly speaking, these are network models built from Poisson flows, and their steady-state performance characteristics are known in the form of explicit formulas. For any


given network the formulas in question are derived from the stationary distribution of an underlying Markov chain, a stationary distribution that has a multiplicatively separable form; hence the appellation ‘product-form network’. The simplest examples of product-form networks are those with Poisson input processes and exponential service time distributions. One can still get a product-form solution with more general service time distributions, but only when they occur in conjunction with certain special queue disciplines like processor sharing. The effect of such a discipline is that only the mean of the associated service time distribution is relevant for network performance analysis. Thus, as it turns out, the familiar and widely used theory of product-form networks is a one-moment theory, in which each input source and each processor or server is characterized by a single parameter, an average input rate or average processing rate as the case may be. This means that conventional queueing network theory fails to capture a phenomenon of central importance in many applied contexts: the impact of variability on congestion and delay. To understand the role of variability, and more specifically to separate the effect of variability from the effect of overall load level, one needs a two-moment theory. Over the last 20 years a class of queueing network models has been developed that is built from Brownian motions rather than Poisson processes. In these alternative models, which were originally motivated or justified as heavy-traffic limits of conventional queueing models, each input source and each processor is represented by an associated Brownian motion, and hence each is characterized by both a mean parameter and a variance parameter.
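The multiplicatively separable stationary distribution described above can be illustrated with a small sketch. Assuming the simplest case, an open Jackson network that behaves like independent M/M/1 stations, the stationary probability of a joint queue-length vector factors into per-station geometric terms; the function name and parameters below are illustrative, not from the abstract.

```python
def product_form_probability(queue_lengths, arrival_rates, service_rates):
    """Stationary probability of the joint state (n_1, ..., n_K) in an
    open Jackson network of independent M/M/1 stations:
        pi(n_1, ..., n_K) = prod_i (1 - rho_i) * rho_i ** n_i,
    where rho_i = lambda_i / mu_i is the traffic intensity at station i.
    Note that only one parameter per source and per server (a rate)
    enters the formula -- the 'one-moment' property of the theory.
    """
    p = 1.0
    for n, lam, mu in zip(queue_lengths, arrival_rates, service_rates):
        rho = lam / mu
        assert 0 <= rho < 1, "each station must be stable (rho < 1)"
        p *= (1 - rho) * rho ** n
    return p
```

For example, with two stations each having arrival rate 1 and service rate 2 (so rho = 0.5 at both), the probability that the network is empty is (1 - 0.5)(1 - 0.5) = 0.25.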
The queue length and workload processes of ultimate interest are modeled as multidimensional reflected Brownian motions, also called regulated Brownian motions or simply RBMs, and the problem of network performance analysis reduces to that of computing the stationary distribution for an appropriate RBM. For certain network structures the stationary distribution has a product form, and rapid progress is being made in the development of practical computational procedures for use when such special structure is absent. Even without explicit computation, however, Brownian network theory yields conceptual insights that have real practical importance. Moreover, multidimensional RBM has proved to be fertile ground for probabilists; there is now a substantial mathematical literature dealing with questions of existence and uniqueness, semimartingale representation, analytical characterization via Itô stochastic calculus, and the characterization and computation of stationary distributions.
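The reflection idea is easiest to see in one dimension (the multidimensional theory the abstract refers to is much harder). The following sketch, with illustrative names not taken from the abstract, simulates a one-dimensional RBM on [0, ∞) by applying the Skorokhod map Z(t) = X(t) − min(0, inf_{s≤t} X(s)) to an Euler discretization of Brownian motion with drift:

```python
import random

def rbm_path(mu, sigma, dt, n_steps, seed=0):
    """Simulate one-dimensional reflected Brownian motion started at 0,
    with drift mu and variance parameter sigma**2, via the Skorokhod map:
        Z(t) = X(t) - min(0, inf_{s <= t} X(s)),
    where X is the unreflected (free) Brownian motion with drift.
    Returns the list of reflected values Z at each time step.
    """
    rng = random.Random(seed)
    x = 0.0            # free Brownian motion with drift
    running_min = 0.0  # inf of the free process so far
    path = []
    for _ in range(n_steps):
        x += mu * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        running_min = min(running_min, x)
        path.append(x - min(0.0, running_min))  # reflected value, always >= 0
    return path
```

For negative drift (mu < 0) the one-dimensional RBM has an exponential stationary distribution with mean sigma**2 / (2 * |mu|), so a long simulated path with mu = −1, sigma = 1 should have a time-average near 0.5 (the Euler discretization introduces a small upward bias at the boundary).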

Chaos and J. Michael UCLA School of Medicine, University of California, Los Angeles, CA 90024, U.S.A. That simple, deterministic, non-linear dynamic systems can display complex and unpredictable behavior is the essence of chaos theory; that such systems are pervasive in nature has revolutionary implications for the sciences. If the flap of a but-