Chapter 2
Modeling and Simulation of the Coastal System
V. CHRIS LAKHAN
Department of Geography, University of Windsor, Windsor, Ontario, Canada N9B 3P4
2.1 General Concepts and Terminology
To be consistent with the systems approach methodology discussed in Chapter One, the field of modeling and simulation should form an interface between real-world processes and the field of General Systems Theory (Spriet and Vansteenkiste, 1982). Early investigators attempting to simulate the behavior of various types of systems, however, were unclear in their definitions of simulation (Tocher, 1963; Naylor et al., 1966). Even at present, the precise meaning of the word "simulation" is a point of debate (Spriet and Vansteenkiste, 1982, p. 4). According to Oren (1984), there are more than twenty definitions of the term. Although there is no generally accepted definition, it can nevertheless be stated that "the phrase modeling and simulation designates the complex of activities associated with constructing models of real world systems, and simulating them on the computer" (Zeigler, 1976, p. 3). Within this context, a simulation model can be considered a mathematical-logical representation of a system, with which experiments are then conducted on a digital computer. Modeling is, therefore, the integrated development of mathematical equations, logical rules and constraints, and a computer program embodying the equations, the logical rules, and the solutions to them (see Ingels, 1985). Simulation, on the other hand, is the experimental manipulation of the model on a digital computer.
2.2 Why Simulation?
The principal reason for advocating the use of simulation models to understand and predict the behavior of the coastal system lies in the fact that real world systems are too complex to permit viable models to be evaluated analytically. One of the major advantages of simulation models over analytic models is that analytic models usually require many simplifying assumptions to make them mathematically tractable, whereas simulation models have no such restrictions. With analytic models, the analyst usually can compute only a limited
number of system performance measures. With simulation models, the data generated can be used to estimate any conceivable performance measure (Schmidt and Taylor, 1970). A simulation model has several other advantages (Adkins and Pooch, 1977):
(a) It permits controlled experimentation. A simulation experiment can be run a number of times with varying input parameters to test the behavior of the system under a variety of situations and conditions.
(b) It permits time compression. Operation of the system over extended periods of time can be simulated in only minutes with ultrafast computers.
(c) It permits sensitivity analysis by manipulation of input variables.
(d) It does not disturb the real system.
One of the most attractive features of coastal simulation models is that they permit, in minutes of computer time, manipulations that would take years in the prototype. Of the several types of simulation models which exist (see Law and Kelton, 1982), the discrete-event and continuous simulation models can be successfully used to study the coastal system.
2.3 The Simulation Process
From the theoretical and applied literature on simulation and modeling, it is apparent that various disciplines apply the same basic set of fundamental rules when formulating simulation models. Variations exist only in the terminology used, and in the emphasis placed on each rule and procedure. The procedures and rules which are generally pursued in developing a simulation model, together with various other simulation techniques, have been outlined and discussed in a number of comprehensive studies (for example, Gordon, 1969; Sworder, 1971; Mihram, 1972; Rivett, 1972; Fishman, 1973, 1978; Kleijnen, 1974; Ord-Smith and Stephenson, 1975; Shannon, 1975; Karplus, 1976; Zeigler, 1976; Franta, 1977; Lehman, 1977; Graybeal and Pooch, 1980; Cellier, 1982; Law and Kelton, 1982; Spriet and Vansteenkiste, 1982; Banks and Carson, 1984; Morgan, 1984; Zeigler, 1984; Ingels, 1985; Bratley et al., 1987). Without elaborating on the stages of model development and simulation techniques discussed by the aforementioned authors, it is apparent that there is scope for the development of a universal simulation design process. System investigators must begin to look at the process of simulation in a more exact and systematic way. To develop a credible simulation model of the coastal system it is, however, recommended that the simulationist or modeler (hereafter referred to as simulationist) carefully consider each of the following:
(1) Problem and Objective Specification
(2) System Definition
(3) Model Conceptualization and Formulation
(4) Data Collection and Preparation
(5) Model Translation
(6) Program Verification
(7) Model Validation
(8) Experimental Design
(9) Model Results and Interpretation
Although there are other stages and approaches of simulation model development for real or artificial systems (see Mihram, 1972; Shannon, 1975; U.S. General Accounting Office, 1979; Nance, 1981; Banks and Carson, 1984; Umphress, 1987), a credible simulation model of the coastal system, or of any other natural system, can be developed and implemented when the above nine steps are carefully followed. As a complete discussion of each of these steps is beyond the scope of this chapter, the reader will be provided with only the necessary terminology, background material and relevant literature for each step. Discussions on each step will be limited to applications in coastal modeling and simulation. Despite the fact that each step will be discussed separately, verification and validation must be performed throughout the modeling process (see U.S. General Accounting Office, 1979; Boehm, 1981).
(1) Problem and Objective Specification
A model must not be devised and then fitted to an operation; this is putting the cart before the horse (Morse, 1977). Once the decision has been made to develop a simulation model, the problem to be investigated and the objectives to be attained must be specified. Zeigler (1984) correctly pointed out that the components to be included in the model, and the abstractions made of them, are governed by the objectives. Several problems can be identified in coastal system investigations. For instance, it has been recognized that there is widespread beach and coastal retreat (see Kaufman and Pilkey, 1983). Coastal researchers, however, have not been able to solve the problem of predicting spatial and temporal beach and coastal changes.
Given that the problem of predicting coastal changes is unresolved, and knowing that short-term empirical studies and laboratory and theoretical investigations have been unsuccessful in providing answers to simple questions on coastal changes, a simulation model with stochastic properties can be formulated to solve problems pertaining to coastal changes. The fundamental objective of a dynamic simulation model will be to enhance our understanding, and facilitate prediction, of spatial and temporal changes occurring in the coastal system. To develop the model it is absolutely necessary to understand the nature and characteristics of the coastal system.
(2) System Definition
The coastal system is a complex, decomposable, large-scale system. As a real world system it is a composition of interacting component systems and associated subsystems. The coast, with its component systems, will display vast differences in morphology and states (Wright et al., 1985), and also various types of equilibria (Orme, 1982). For modeling purposes, caution must be taken in demarcating and specifying the boundary conditions of the coastal
system. Since it is unlikely that the whole coastal system will be modeled, the general boundary conditions must be partitioned to delimit the segment of interest. To do this, one must understand the system environment and states. It must be recognized that at any point in time, a large number of coastal entities interact with a large variety of macro and micro morphological states to produce an apparently random spatial and temporal pattern. Changes in the state of the coastal system are also affected by a set of poorly understood entities, the components of the system: the processes that interact over time, in this case the waves, tides, winds, currents, etc. Each of the entities of the coastal system has particular attributes, these being, for example, the height and period of waves, the range of tides, the size and shape of material, and the velocity of currents.
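The entity-attribute terminology above maps naturally onto code. The following sketch assumes a Python implementation; the attribute names and units are illustrative choices, not taken from the source, but they show how a coastal state at one instant can be recorded as interacting entities with attributes:

```python
from dataclasses import dataclass

# Each entity of the coastal system carries its own attributes.
@dataclass
class Waves:
    height_m: float      # wave height (metres)
    period_s: float      # wave period (seconds)

@dataclass
class Tides:
    range_m: float       # tidal range (metres)

@dataclass
class Currents:
    velocity_ms: float   # longshore current velocity (m/s)

@dataclass
class Material:
    grain_size_mm: float # characteristic sediment size (mm)

@dataclass
class CoastalState:
    """State of one coastal segment at a point in time."""
    waves: Waves
    tides: Tides
    currents: Currents
    material: Material

state = CoastalState(Waves(1.5, 7.0), Tides(2.1), Currents(0.3), Material(0.25))
```

A simulation would advance such a state through time by sampling new attribute values from the entities' governing distributions.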
(3) Model Conceptualization and Formulation
The simulationist must first identify which properties of the real system are essential, and how many of them are required to give, in agreement with the model's objective, a satisfactory description of the system (Nihoul, 1975). For explanation purposes, let us assume that the principal objective of the simulationist is to formulate a model to predict spatial and temporal changes occurring to a segment of the coastal system. The rationale for this objective is based on the fact that coastal changes, unpredictable in magnitude and duration, cannot be satisfactorily explained by small-scale experimental studies and short-term empirical observations. Before the model is developed, the simulationist must take cognizance of the fact that, like most natural open systems, the coastal system is governed by a large number of uncontrolled and interdependent entities and their attributes, and is complicated by the presence of several feedback mechanisms, some positive and some negative, in complex feedback loops. Hence, it is difficult to identify and separate the independent and dependent entities which control a portion or the whole of the coastal system. Here it should be noted that, although complex systems and their environments are objective (i.e., they exist), they are also subjective in that the particular selection of the elements (entities) to be included or excluded, and their configuration, are dictated by the researcher (Shannon, 1975). It is obviously the aim of most simulationists to build a model that has high face validity. Realism must be incorporated into the model, and this requires the use of reasonable assumptions and accurate data (Naylor and Finger, 1967). In simulation terminology, there are structural assumptions, which deal specifically with how the coastal system operates. The data assumptions refer to the choice of theoretical distributions, data reliability, and parameter estimates. From the study by
the U.S. General Accounting Office (1979), it is known that at this stage of the modeling process it is also crucial to adopt supporting theories, for example on wave motion, wave breaking, sediment transport mechanics, etc. Without doubt, most models are formulated based on the modeler's perceptions and understanding of the real system. Hence, to prevent the incorporation of incorrect assumptions, theories, etc. into the model, the modeler should seek the advice of professionals familiar with the real system. This should be done before the model is flowcharted and coded, because it is a waste of time and effort to develop a model which is based on unsound operating assumptions, theories, and philosophy. This author concurs with Bratley et al. (1987) that it is best to start with simple, modularized models. For example, a simple conceptual model, which can eventually be developed into a highly detailed model to simulate spatial and temporal coastal changes, is presented in figure 2.1. Additional entities and their attributes have not been selected at this stage of model development, in order to avoid Bonini's paradox, which Dutton and Starbuck (1971, p. 14) describe as follows: "a model is built in order to achieve understanding of an observed causal process, and the model is stated as a simulation program in order that the assumptions and functional relations may be as complex and realistic as possible. The resulting program produces outputs resembling those observed in the real world, and inspires confidence that the real causal process has been accurately represented. However, because the assumptions incorporated in the model are complex and their mutual interdependencies are obscure, the simulation program is no easier to understand than the real process was." With this contention in mind, the model presented in figure 2.1 has only the important functional entities and attributes of the coastal system. As stated by Hall and Day (1977, p.
8), a model "cannot have all attributes, or it would not be a model; it would be the real system." Coastal geomorphologists, coastal engineers, and allied scientists will, no doubt, argue that the entities specified as independent and dependent in figure 2.1 are not so in reality. This may be partially true, but it should be emphasized that field researchers have not been able to determine and delineate the independent and dependent entities governing the coastal system. Although considerable progress has been made in the area of coastal system dynamics it must, nevertheless, be admitted that the results of field studies are inadequate (Short, 1979) and subject to uncertainties (Hallermeier, 1984; Le Méhauté and Wang, 1984). Simulation modelers who attempt to parameterize models have also found field results to be "loose". Lehman (1977) emphasized the "looseness" of verbal work by stating that a loose verbal theory is frequently riddled with gaps in logic, contradictions, relationships left unspecified, unknown parameters, and a host of other difficulties. Moreover, Novosad (1982) claimed that all verbal languages tend to be ambiguous and indistinct in terms of presenting complex ideas or descriptions.
[Figure 2.1 appears here. The schematic links model input entities and their activity over time to spatial and temporal coastal change. Independent entities and attributes: waves (height and period: random variations, Rayleigh distributed, autoregressive series); currents (velocity: exponentially distributed); tides (range: normally distributed); material characteristics (size: log-normally distributed); coastal geometry (slope, width, length: concave power functions, and any three-dimensional bathymetric topography). Dependent entities: suspended load (quantity: random variations) and longshore transport (quantity: cumulative amount exponentially distributed).]
Figure 2.1: Schematic framework of simple conceptual model for simulating coastal changes.
(4) Data Collection and Preparation
Input data provide the driving force for a simulation model (Banks and Carson, 1984, p. 332). One of the principal tasks in the simulation process is therefore to gain adequate information and data on the entities and attributes which govern the coastal system. The data must be collected over a continuous and long period of time, so that the time series provides accurate insights into the nature of the entities which affect the coastal system. The data must also be homogeneous and free from measurement errors. The actual collected data are not used directly in the simulation model, for reasons which have been put forward by Shannon (1975, p. 27). These reasons are:
’
(a) First, the use of raw empirical data implies that all one is doing is simulating the past. The use of data from last year would effectively be replicating only the performance of that year; the only events possible are those that have occurred. It is also one thing to assume that the basic form of the distribution will remain unchanged with time, and quite another to assume that the idiosyncrasies of a particular time period will be repeated.
(b) It is more efficient in computer time and storage requirements to use a theoretical probability distribution.
(c) It is much easier to change the parameters of a theoretical distribution generator to perform sensitivity tests or to ask "what if" questions.
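These three points can be illustrated with a short sketch. Everything here is hypothetical (the velocity record, the exponential choice, and all parameter values are invented for illustration); the point is the contrast between resampling raw data, which can only replay observed values, and a fitted theoretical generator, whose parameters are trivial to perturb:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical sample standing in for an observed record of
# longshore current velocities (m/s).
observed = stats.expon.rvs(scale=0.4, size=200, random_state=rng)

# (a) Replaying the past: resampling raw data can only reproduce
# values that were actually observed.
replay = rng.choice(observed, size=1000)

# (b), (c) A fitted theoretical distribution generalizes beyond the
# record, and its parameters are easily perturbed for "what if" runs.
loc, scale = stats.expon.fit(observed, floc=0)
baseline = stats.expon.rvs(loc=loc, scale=scale, size=1000, random_state=rng)
stressed = stats.expon.rvs(loc=loc, scale=1.5 * scale, size=1000,
                           random_state=rng)  # sensitivity experiment
```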
With these conditions in mind, it is imperative to analyze the empirical data, and identify the probability distributions of those attributes (see Fig. 2.1) which are to be incorporated in the model. The model will then use the data indirectly by drawing variates from the selected theoretical distribution (see Franta, 1977). In the continuous case, identification of the underlying theoretical distribution begins by developing a frequency distribution or histogram of the raw empirical data. The histogram or frequency distribution is then compared visually with as many theoretical distributional forms as possible. Several graphical representations of theoretical distributions can be found in Schmidt and Taylor (1970). In addition, various studies provide information on, and representations of, each of the various theoretical distributions. Discussions on continuous distributions (for example, exponential, beta, gamma, lambda, Erlang-K, Cauchy, Laplace, hyperexponential, chi-square, F-distribution, t-distribution, Gumbel, Weibull, log-normal and triangular) and discrete distributions (for example, Poisson, binomial, geometric, hypergeometric, Bernoulli and Pascal) can be found in several sources (for example, Johnson and Kotz, 1970; Derman et al., 1973; Haan, 1977; Hines and Montgomery, 1980; Ross, 1981; Law and Kelton, 1982; Bratley et al., 1987; International Mathematical and Statistical Library, 1987). With proper judgment, the heuristic decision can be made that the empirical data conform to at least one of the known theoretical distributions. If this is done, a distributional
assumption will have to be made, because the visual comparison procedure does not provide sufficient justification to accept a specific theoretical distribution. It is, therefore, necessary to establish the null hypothesis, and test the deviation of the empirical distribution from the selected theoretical distribution. This is done by testing the distributional assumption and the associated parameter estimates for goodness-of-fit. Several goodness-of-fit tests (for example, Anderson-Darling, Cramér-von Mises, Moments, chi-square and Kolmogorov-Smirnov) have been put forward. However, "it is doubtful that a single ideal goodness-of-fit statistic exists" (Fotheringham and Knudsen, 1987, p. 46). The goodness-of-fit test that is chosen to test how well the selected distribution "fits" the observed data depends on several considerations, among them sample size, type of data, and the nature of the distributional assumptions. Lakhan (1984) found that the chi-square test is valid for large sample sizes, while the Kolmogorov-Smirnov test is appropriate for small sample sizes and when parameters have not been estimated from the data. The Cramér-von Mises test is more sensitive than the Kolmogorov-Smirnov test. Hypotheses are formed once the decision has been made to use a particular goodness-of-fit test, and the simulationist has selected a theoretical distribution based on the visual comparison procedure. For example, assuming that the Rayleigh distribution form appears to fit long-term significant wave height data, the hypotheses to be tested are:
H0: the random variable is Rayleigh distributed;
H1: the random variable is not Rayleigh distributed.
With a large data set, the chi-square goodness-of-fit test can be used to find out whether the Rayleigh distribution is in agreement with the observed significant wave height data. If the data are rejected as being Rayleigh distributed, then the simulationist has to make a different distributional assumption in order to "fit" a theoretical distribution form (for example, Weibull, exponential or log-normal) to the observed data. This procedure is repeated until a "fit" between the selected distributional form and the collected data is obtained. Following this "trial-and-error" procedure is obviously very time consuming. Hence, a common procedure is to form a short list of theoretical distributions which can possibly fit the data, and then select, with the aid of a goodness-of-fit computer program, the best fit out of the short list. Lakhan (1984) presented a FORTRAN 77 computer program which provides the capabilities of running chi-square, Kolmogorov-Smirnov, Cramér-von Mises, and Moments goodness-of-fit tests to check whether a set of empirical observations follows or conforms to any one of the following theoretical distributions: (1) Cauchy (2) Chi-square (3) Erlang-K (4) Exponential (5) Gamma (6) Gumbel (8) Normal (9) Poisson (10) Rayleigh (11) Triangular (12) Uniform (13) Weibull.
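A short-list selection of this kind might be sketched as follows, assuming Python with scipy.stats rather than the FORTRAN 77 program cited. The wave height record and the candidate list are hypothetical, and note that Kolmogorov-Smirnov p-values computed with parameters fitted from the same data are only approximate:

```python
import numpy as np
from scipy import stats

# Hypothetical long-term significant wave height record (metres); a real
# study would read gauge observations here.
heights = stats.rayleigh.rvs(scale=1.2, size=500,
                             random_state=np.random.default_rng(42))

# Short list of candidate theoretical distributions.
candidates = ["rayleigh", "weibull_min", "expon", "lognorm"]

results = {}
for name in candidates:
    dist = getattr(stats, name)
    params = dist.fit(heights)              # maximum-likelihood estimates
    stat, p = stats.kstest(heights, name, args=params)
    results[name] = (stat, p)

best = min(results, key=lambda n: results[n][0])  # smallest K-S statistic
```

The distribution with the smallest test statistic (here, expected to be a Rayleigh-compatible form) would then parameterize the model's input generator.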
With the level of significance (α) = 0.05, Lakhan (1984) found that with the chi-square goodness-of-fit test, long-term mean wave height and wave period data are Rayleigh distributed; tidal range is normally distributed; longshore current velocity is exponentially distributed; and sediment size is log-normally distributed. Hence, a model to predict spatial and temporal coastal changes can be parameterized with these distributions (see Fig. 2.1). If it is found that at a set level of significance the empirical coastal data cannot be fitted to any of the known continuous or discrete distributions, then it is possible that the data are not independent and uncorrelated. To determine whether this is the case, the simulationist must use the data to compute the autocorrelation and power spectral density functions. The principal application of an autocorrelation function measurement of physical data is to establish the influence of values at any time over values at a future time. The power spectral density function measurement establishes the frequency composition of the data which, in turn, bears important relationships to the basic characteristics of the physical system, in this case the coastal system (Bendat and Piersol, 1971, pp. 22 and 24). The mathematical derivation and uses of these two techniques have been described in a number of studies, including Blackman and Tukey (1958), Brown (1962), Jenkins and Watts (1969), Anderson (1971), Koopmans (1974), Kashyap and Rao (1976), Box and Jenkins (1976), Gottman (1981), Priestley (1981) and Chatfield (1984). For computational efficiency it is recommended that the autocorrelation coefficient $r_k$ of order $k$ of the time series $x_t, t = 1, \ldots, N$ be calculated with:

$$r_k = \frac{\sum_{t=1}^{N-k} (x_t - \bar{x}_t)(x_{t+k} - \bar{x}_{t+k})}{\left[\sum_{t=1}^{N-k} (x_t - \bar{x}_t)^2\right]^{1/2} \left[\sum_{t=1}^{N-k} (x_{t+k} - \bar{x}_{t+k})^2\right]^{1/2}}$$
where $\bar{x}_t$ is the mean of the first $N-k$ values $x_1, \ldots, x_{N-k}$; $\bar{x}_{t+k}$ is the mean of the last $N-k$ values $x_{k+1}, \ldots, x_N$; $k$ is the lag of the autocorrelation coefficient; $N$ is the number of events forming the time series; and $r_k$ is the $k$th-order autocorrelation coefficient. This equation gives $r_k = 1$ for $k = 0$, so the correlogram always starts at unity at the origin. In general $-1 \le r_k \le +1$ (Salas et al., 1980, p. 39). The power density spectrum can be computed with the formula:

$$p(f_i) = 2\left[1 + 2\sum_{K=1}^{L} g(K)\, r_K \cos\!\left(\frac{\pi K i}{L}\right)\right], \qquad i = 0, 1, \ldots, L$$
where $p(f_i)$ is the power density function and $L$ is called the truncation point. As $L$ increases, the variance of the sample spectral density can also increase. The lag window $g(K)$ is used to reduce the variance of the sample spectral density function, with the different types of lag windows (see types in Blackman and Tukey, 1959; Parzen, 1963) having wide-ranging reduction efficiencies. According to Bolch and Huang (1974), the Parzen (1963) window, which will never lead to negative estimates of the spectral density function, seems to be better than those proposed by other investigators. The expression for the Parzen window is:

$$g(K) = \begin{cases} 1 - 6\left(\dfrac{K}{L}\right)^2 + 6\left(\dfrac{K}{L}\right)^3, & 0 \le K \le L/2 \\[4pt] 2\left(1 - \dfrac{K}{L}\right)^3, & L/2 < K \le L \\[4pt] 0, & K > L \end{cases}$$
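Assuming a Python implementation, the Parzen lag window and the smoothed spectral estimate might be sketched as follows (function names are illustrative, and the autocorrelation coefficients are taken as given):

```python
import numpy as np

def parzen(K, L):
    """Parzen lag window g(K); never yields negative spectral estimates."""
    r = K / L
    if K <= L / 2:
        return 1 - 6 * r**2 + 6 * r**3
    if K <= L:
        return 2 * (1 - r) ** 3
    return 0.0

def power_spectrum(acf, L):
    """Smoothed power density estimates p(f_i), i = 0..L, from
    autocorrelation coefficients acf[0..L] (with acf[0] == 1)."""
    return [
        2 * (1 + 2 * sum(parzen(K, L) * acf[K] * np.cos(np.pi * K * i / L)
                         for K in range(1, L + 1)))
        for i in range(L + 1)
    ]
```

For an uncorrelated (white noise) series, where all coefficients beyond lag zero vanish, the estimate is flat, as expected of a flat spectrum.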
Figure 2.2: Correlogram of significant wave heights.
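A minimal Python sketch of the autocorrelation calculation described above (function names are illustrative; the lag cap reflects a common rule of thumb):

```python
import numpy as np

def autocorrelation(x, k):
    """k-th order autocorrelation r_k, using separate means for the
    first and last N-k values (after Salas et al., 1980)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    a, b = x[: n - k], x[k:]
    num = np.sum((a - a.mean()) * (b - b.mean()))
    den = np.sqrt(np.sum((a - a.mean()) ** 2) * np.sum((b - b.mean()) ** 2))
    return num / den

def correlogram(x, max_lag=None):
    """Coefficients r_0..r_K; the maximum lag K is capped at N/5,
    a common rule of thumb for correlogram estimation."""
    n = len(x)
    if max_lag is None:
        max_lag = n // 5
    return [1.0 if k == 0 else autocorrelation(x, k)
            for k in range(max_lag + 1)]
```

Plotting the returned coefficients against lag reproduces correlograms of the kind shown in figures 2.2 and 2.3.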
It is necessary to specify the number of lags K to be used in the calculation of the autocorrelation function. The rule of thumb is that the maximum value of K (the number of lags) should not exceed one fifth of the value of N (the number of observations). By using the above procedures, Lakhan and LaValle (in press) found that Harrison et al.'s (1968) significant wave height and significant wave period data, collected from Virginia Beach, Virginia, are highly autocorrelated. The correlograms for significant wave height (Fig. 2.2) and significant wave period (Fig. 2.3) show that the values of the collected data are not independent of each other. To account for this autocorrelation, a parameterization scheme can be designed whereby the model uses specially designed versions of autoregressive, autoregressive moving average, autoregressive integrated moving average, parsimonious periodic autoregressive shifting level, or fractional Gaussian noise procedures. The decision to use a particular parameterization procedure depends principally on the length of the simulation run, and on the nature and characteristics of the entities and their corresponding attributes which govern any particular coastal system. If the analyzed coastal data are found to be autocorrelated, and also at a specified level of significance fit a selected theoretical distribution, then the model can benefit from being parameterized with autocorrelated distributed variates. Lakhan (1981) presented an easily implemented and computationally efficient method for the generation of autocorrelated
Figure 2.3: Correlogram of significant wave periods.

pseudo-random numbers with specific continuous distributions. The method essentially follows the sequence: uniform to normal; linear transform on normals to introduce correlation; transform back to correlated uniforms; final transform to the desired distribution. When the data are poorly correlated, and also cannot be "fitted" to any of the known theoretical distributions, then the only option will be to use the empirical form of the distribution. The collected data can be used to construct the cumulative distribution function. The use of an empirical distribution function is, however, generally not desirable in a stochastic simulation model of the coastal system. In the absence of coastal data, theoretical principles governing the entities can be used for developing meaningful theoretical distributions. For example, Copeiro (1979) claimed that some investigators are placing less emphasis on the use of goodness-of-fit tests and are adopting strictly theoretical principles for describing wave heights. This is a possible approach in the absence of wave data, because it has been claimed that the Rayleigh distribution can be used to describe wave amplitudes of all band widths. The Rayleigh distribution, which is based on the assumption that ocean waves are the result of the superposition of many sinusoidal components within a narrow band of frequencies but of random phase, can be used for describing the probability of wave height occurrence (see Longuet-Higgins, 1952; Tayfun, 1977). It should, however, be pointed out that there is disagreement on the appropriateness of the Rayleigh distribution for describing long-term wave data (Harris, 1972; Khanna and Andru, 1974; Ou and Tang, 1974).
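A simplified Python sketch in the spirit of the normal-to-correlated-uniform sequence described above is given below. It is not Lakhan's (1981) algorithm itself: the achieved lag-1 correlation of the final variates only approximates the target, because the transforms distort correlation, and any corrective adjustments in the original method are not reproduced here:

```python
import numpy as np
from scipy import stats

def autocorrelated_variates(n, rho, dist, seed=None):
    """Generate n variates with the marginal of `dist` (a frozen
    scipy.stats distribution) and approximate lag-1 autocorrelation rho:
    normals -> AR(1) transform -> correlated uniforms -> inverse CDF."""
    rng = np.random.default_rng(seed)
    z = np.empty(n)
    z[0] = rng.standard_normal()
    for t in range(1, n):
        # AR(1) on normals; innovation scaled to keep unit variance
        z[t] = rho * z[t - 1] + np.sqrt(1.0 - rho**2) * rng.standard_normal()
    u = stats.norm.cdf(z)      # correlated uniforms on (0, 1)
    return dist.ppf(u)         # variates with the desired marginal

# e.g., autocorrelated Rayleigh-distributed wave heights
waves = autocorrelated_variates(2000, 0.6, stats.rayleigh(scale=1.0), seed=1)
```

The same generator serves any continuous marginal for which an inverse CDF is available, which is why the sequence is attractive for parameterizing coastal inputs.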
(5) Model Translation
Once the coastal system model is developed with the appropriate equations, theoretical distributions, data specifications, logic rules, and constraints, a computer program must be written to implement the model. The computer program will embody the necessary logic rules, the equations and their solutions, etc. The model must be programmed for a digital computer using an appropriate computer language. Sargent (1984) advocates the use of higher level languages or special simulation languages in coding the simulation to reduce coding errors. The choice of an appropriate simulation language for coastal studies is difficult, because there are a large number of computer languages available for discrete and continuous simulations. Kreutzer (1976) reviewed several of the early simulation languages (for example, GPSS V, SIMSCRIPT, SOL, SIMULA, CSL and GASP). Moreover, in the last decade there has been a proliferation of simulation languages, as is evident from the numerous titles published in the catalogues of the Society for Computer Simulation (1984, 1985). The overall benefit of using special purpose simulation languages lies in the fact that they shorten the development of the computer program, and also allow for faster coding and verification. In addition, special purpose simulation languages incorporate common simulation functions (for example, for the management and advancement of simulation time, the generation of random variates, and event executions) which can enhance computational efficiency. Although it is apparent that the use of special purpose simulation languages will be of considerable benefit to coastal simulation investigations, it is unfortunate that only a very small percentage of these languages are available on computer installations easily accessible to coastal researchers.
Hopefully, the advanced special purpose languages like PROLOG (see Colmerauer et al., 1981; Adelsberger, 1984; Burnham and Hall, 1986; Vaucher and Lapalme, 1987), Ada (see U.S. Department of Defence, 1982; Booch, 1983; Pyle, 1985), C (see Ritchie, 1980; Traister, 1984; Gehani, 1985), Modula-2 (see Wirth, 1982, 1985; Beidler and Jackowitz, 1986), ModSim (see Mullarney and West, 1987) and ROSS (McArthur et al., 1984) will become as common and popular as the general purpose languages (for example, ALGOL, BASIC, FORTRAN, PL/1, and PASCAL) which are currently used for coastal and other kinds of simulation models. The applicability of the advanced special purpose languages for simulation modeling has been demonstrated in several recent studies. For instance, Adelsberger and Neumann (1985) discussed goal oriented simulation using PROLOG, and Unger (1986) provided an account of object oriented simulation using Ada, C++, and Simula.
(6) Program Verification
With complex models, it is difficult, if not impossible, to code a model successfully in its entirety without a good deal of debugging (Banks and Carson, 1984, p. 14). It is therefore necessary to verify the written computer program. Verification is essentially the process of checking the subpieces of the program when they come into being. It moves the process of
checking forward in time, and allows mistakes and oversights to be caught early. Testing is a part of verification (Fox, 1982). Several studies have recommended the use of software engineering techniques for writing, and for testing the accuracy of, computer programs (see Yeh, 1977; Myers, 1979; Sommerville, 1982; Wiener and Sincovec, 1984; Poncet, 1987). From a software engineering point of view, a program must work according to its specifications. In addition, the manner in which the program is designed and written is as important as whether the program works. The objective of software testing is to detect program errors. Debugging is part of the testing process (Wiener and Sincovec, 1984). Sargent (1982) summarized the static and dynamic approaches for testing the accuracy of computer programs. In static testing, the computer program of the computerized model is analyzed to determine if it is correct by using such techniques as correctness proofs, syntactic decomposition, and examination of the structural properties of the program. Dynamic testing, on the other hand, consists of different strategies, which include bottom-up testing, top-down testing, and mixed testing (a combination of top-down and bottom-up testing). In dynamic testing, the techniques which are often applied are internal consistency checks, traces, and investigations of input-output relations. Although it is beneficial for the coastal simulationist to make use of techniques espoused for software development and testing, simple and innovative verification techniques should not be neglected. To make certain that the computer simulation program accurately represents the formulated model, a simple verification technique can be employed to manually check the logic and solution techniques of the program. This is best done using what this author refers to as the Network of Collaborating Colleagues (N.C.C.), whereby colleagues who have expertise in computer science, mathematics, statistics, programming, etc.
are called upon to verify the accuracy of specific program modules. While this is being done, the simulationist should check whether equations, distributions, integration subroutines, etc., have been programmed correctly by solving cases which can be compared against known solutions. For example, the simulationist can manually compute the wave celerities and wavelengths at every 10 metres of depth for a wave with, say, a period of 5 seconds, propagating shoreward over a uniform bottom from a depth of 100 metres to a depth of 10 metres. The results should then be compared against known solutions. Using the same input values as in the manual case, the program which has been checked by the N.C.C. can then be executed for a few "spot" iterations. The computer-generated and manually obtained results can then be compared for reasonableness. When there are no major discrepancies between the two results, the simulationist can confidently use other computer verification techniques. Another recommended computer program verification technique, referred to as stress testing, is based on the notion that "bugs" which are both embarrassing and difficult to locate may not manifest themselves immediately or directly, but may show up under stress. With stress testing the parameters of the model are set to unusually high
or low values, and the model is then checked to see whether it "blows up" in an understandable manner (see Bratley et al., 1987). Once it is ascertained that the computer simulation program is performing exactly as intended, the major task of validating the simulation model can be undertaken. Before validation techniques are discussed it should be emphasized that "verification and validation, although conceptually distinct, are usually conducted simultaneously" (Banks and Carson, 1984, p. 383).
(7) Model Validation
The problem of validating computer simulation models is indeed a difficult one because it involves a host of practical, theoretical, statistical, and even philosophical complexities (Naylor et al., 1966, p. 39). This problem arises because all simulation models contain both simplifications and abstractions of the real world system, and as such no model is absolutely correct in the sense of a one-to-one correspondence between itself and real life (Shannon, 1975). Simulation models therefore have various degrees of correctness and validity, and scientists and philosophers have not been able to develop a standard set of procedures for validating computer simulation models. Balci and Sargent (1984) provide a lengthy bibliography of studies which present various approaches and statistical techniques for the validation of simulation models. Banks et al. (1986a, 1986b) proposed a methodology for verification and validation of simulation models, and also discussed various unresolved issues. Without reviewing the various validation approaches put forward in the simulation literature, it must be acknowledged that the philosophy of science teaches that models cannot be validated in the strict sense, because doing so would imply that the model represents "truth" (see Popper, 1968). Setting aside the strict definitional sense of the word "valid", the coastal simulationist can develop a valid model by making certain that there is very close correspondence between the model and the coastal system. This is done by building a model that has high face validity.
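Returning briefly to the manual verification check described above: the shoaling computation for a 5 second wave can be sketched in a few lines. This is a minimal sketch assuming linear (Airy) wave theory; the function name and output format are illustrative, not from the chapter:

```python
import math

def wavelength(T, d, g=9.81, tol=1e-6):
    """Wavelength L (m) from the linear dispersion relation
    L = (g T^2 / 2 pi) tanh(2 pi d / L), solved by fixed-point iteration."""
    L0 = g * T * T / (2.0 * math.pi)   # deep-water wavelength
    L = L0
    for _ in range(200):
        L_new = L0 * math.tanh(2.0 * math.pi * d / L)
        if abs(L_new - L) < tol:
            return L_new
        L = L_new
    return L

# Manual verification case: T = 5 s wave shoaling from 100 m to 10 m depth.
for d in range(100, 9, -10):
    L = wavelength(5.0, d)
    print(f"depth {d:3d} m   L = {L:6.2f} m   C = {L / 5.0:5.2f} m/s")
```

The deep-water wavelength for a 5 second wave is about 39 m, and the iteration shortens it as the depth decreases, which is the behavior the "spot" iterations of the checked program should reproduce.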
The model will be valid if it accurately predicts the performance of the coastal system. If this is not achieved it is necessary to examine, and change if necessary, the structural and data assumptions of the model. With repeated adjustments to these assumptions, and further verification of the computer program, the model should be able to generate data similar to those of the natural coastal system. Several types of subjective and objective tests have been put forward to assess the validity of a simulation model. Examples of subjective validation techniques are the Turing Test (Turing, 1963), historical methods (Naylor and Finger, 1967), predictive validation (Emshoff and Sisson, 1970), field tests (Van Horn, 1971), Schellenberger's Criteria (Schellenberger, 1974), sensitivity analysis (Miller, 1974; Shannon, 1975) and multistage validation (Law and Kelton, 1982). Although the applicability of each subjective validation technique depends on the type of system being simulated, many of these techniques can be used for coastal system investigations. For instance, the Turing Test or the historical data validation test can be used without difficulty. To apply the Turing Test, the simulationist presents output data from the simulation model, together with collected coastal data, to coastal experts without telling them beforehand which data set is which. If the "experts" can determine which data set is the output from the simulation model and which has been collected from the field, and can also provide rational reasons for being able to do so, then the simulation model leaves scope for improvement. Another of the subjective validation techniques involves the use of some historical coastal data in the simulation model. The simulation is executed, and the generated output data are compared with the results which have been historically produced by the coastal system operating under similar conditions. Given that data are available from the coastal system, it is possible to employ objective validation techniques to compare the coastal simulation output data with those collected from a corresponding coastal system. This requires the use of statistical techniques. Among the statistical validation techniques which have been used to compare simulated data with real-world data are analysis of variance (Naylor and Finger, 1967), nonparametric goodness-of-fit tests (Gafarian and Walsh, 1969), correlation and spectral analysis (Watts, 1969), multivariate analysis of variance (Garratt, 1974), confidence intervals and nonparametric tests of means (Shannon, 1975), Theil's Inequality Coefficient (Theil, 1961; Kheir and Holmes, 1978) and Hotelling's T² tests (Balci and Sargent, 1982). In performing statistical hypothesis testing, the emphasis should be on reducing type II error, the probability of accepting the null hypothesis when it is false, rather than type I error, the probability of rejecting the null hypothesis when it is true. This is because one wants to avoid accepting a model that is not valid (Sargent, 1982, p. 167).
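One of the objective checks listed above, a confidence interval for the difference of means (in the spirit of Shannon, 1975), can be sketched as follows. The sample data, function name, and significance level are illustrative assumptions, not from the chapter; the deliberately large alpha = 0.10 reflects the emphasis on reducing type II error:

```python
import math
import random
import statistics

def mean_diff_ci(sim, obs, z=1.645):
    """Normal-approximation 90% confidence interval (z = 1.645, alpha = 0.10)
    for the difference in means between simulated and observed data."""
    diff = statistics.mean(sim) - statistics.mean(obs)
    se = math.sqrt(statistics.variance(sim) / len(sim)
                   + statistics.variance(obs) / len(obs))
    return diff - z * se, diff + z * se

# Hypothetical wave-height samples (metres): simulated vs. field data.
random.seed(1)
sim = [random.gauss(1.50, 0.30) for _ in range(200)]
obs = [random.gauss(1.48, 0.30) for _ in range(200)]

lo, hi = mean_diff_ci(sim, obs)
# A wider rejection region (alpha = 0.10 rather than 0.05) lowers the
# type II risk of accepting an invalid model.
print("consistent with field data" if lo <= 0.0 <= hi else "model rejected")
```

The design choice here mirrors Sargent's point: the analyst accepts a higher chance of rejecting a valid model in exchange for a lower chance of accepting an invalid one.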
(8) Experimental Design
Several experiments can be conducted with a valid coastal system model to yield desired information on spatial and temporal changes. To experiment with the model, both the strategic and tactical aspects of simulation experimentation must be considered (see Shannon, 1975; Kleijnen, 1979). Technical details on the strategic and tactical aspects of simulation experimentation have been provided by Kleijnen (1982) and Fishman (1978) respectively. Simulation almost always allows greater control to be exercised over the experimental conditions than is normally possible with laboratory experiments. However, Shannon (1975) correctly pointed out that few simulation studies are free of resource limitations imposed upon them in terms of time, money, and computer availability. These constraints place very severe restrictions on the experimental design, and often override academic considerations. Hence, to obtain the desired information at minimal cost, and still be able to draw valid inferences from the experimentation, several tactical decisions must be made. These may include initialization and start-up conditions, number of iterations, the difference in time between iterations, and the length of the simulation run. To emphasize how difficult it is to make tactical decisions, let us assume that the simulationist is interested in simulating not only temporal coastal changes, but also steady state or equilibrium coastal conditions. Specifying the runlength of the simulation in this case becomes a major tactical problem, as it cannot be easily predetermined when a steady state condition will be attained by the simulation. For coastal simulations, it is therefore necessary to have prior knowledge of the overall physical dynamics and morphological states of the coastal system. If it is known that at a certain coastal location of interest waves arrive at the shore, on average, every 5 seconds, then the simulation experiment can be controlled so that the difference in time between each iteration is 5 seconds. For a one day period, the simulation will have 17280 iterations. Fixing the simulation runlength at 30 days will yield 518400 iterations. This will not only be extremely costly in terms of computing resources, but the generated output will be almost intractable. By changing the experimental design, the number of iterations can be reduced from 518400 to 618 if each iteration corresponds to 70 minutes of real time, or roughly one-twelfth of a tidal cycle. Using this approach, Lakhan and Jopling (1987) parameterized a model every 70 minutes with different deepwater wave height, wave period and tidal range values and successfully simulated both barred and nonbarred profile configurations during a simulation runlength of 1512 iterations, or 73.5 days. To simulate a long period of time, or geologic time, with limited computing resources, it is best to perform calculations over small time increments, and then extrapolate the results through predetermined time increments. Chapter 10 in this volume discusses how time is extrapolated to simulate delta formation over a period of 2000 years. If it is known that excessive output will be produced over the runlength of the simulation program, then from a strategic viewpoint it is advisable to utilize a statistical methodology for generalizing the results of the simulation experiment.
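The iteration counts used in this example follow from simple arithmetic, which can be checked with a short script (the variable names are illustrative):

```python
import math

wave_period_s = 5
iters_per_day = 24 * 60 * 60 // wave_period_s      # one iteration per wave
iters_30_days = 30 * iters_per_day

# Coarser design: one iteration per 70 minutes of real time,
# roughly one-twelfth of a (semidiurnal) tidal cycle.
step_min = 70
iters_coarse = math.ceil(30 * 24 * 60 / step_min)  # round up to cover 30 days

# Lakhan and Jopling (1987): 1512 iterations at 70 minutes each.
simulated_days = 1512 * step_min / (24 * 60)

print(iters_per_day, iters_30_days, iters_coarse, simulated_days)
```

The script reproduces the figures in the text: 17280 iterations per day, 518400 over 30 days, 618 at the coarser 70 minute step, and 73.5 simulated days for the 1512-iteration run.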
With excessive output, a model of the model (a metamodel) is normally needed to understand the simulation model. Kleijnen (1977) reviewed several types of metamodels, including analysis of variance and regression metamodels.
(9) Model Results and Interpretation
After execution of the simulation model on a digital computer, the generated results or simulation outputs must be analyzed and interpreted. The interpretation of the results involves a translation from "model space" to "prototype space". In this process the simulationist goes, in reverse, through the chain of approximations and simplifications made during the modeling process (Jacoby and Kowalik, 1980). The simulationist should consider the fact that most simulations have data and parameter inaccuracies, solution method limitations, and many mathematical errors relating to round-off and cancellation. If the simulation outputs are used to draw inferences and to test hypotheses, the results cannot be interpreted without the use of statistical tests. Since stochastic variables are used in a simulation, the simulation output data are themselves random. Lakhan and LaValle (1987) found that coastal simulation output data are subject not only to random fluctuations, but also to various degrees of correlation. Therefore, to interpret the simulation outputs,
and make inferences on how the coastal or any other type of system operates, it is necessary to have both theoretical and practical knowledge of the statistical aspects of simulation experimentation (initial conditions, data translation, replications, runlength, etc.) and of the output data. Law and Kelton (1982) presented some statistical techniques which are useful for analyzing simulation output data, and also stressed that there are several output analysis problems for which there are no completely accepted solutions. Interpretation problems can also arise depending on the degree of face validity of the model. Assuming that a model of high face validity has been executed for several long simulation runs, it can produce results which provide insights on the behavior of the simulated system at different time periods. When this is so, it becomes a challenge to interpret the results, especially when the results pertain to conditions and events which cannot be easily corroborated by observations or historical data.
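Because simulation output series are typically autocorrelated, a quick lag-1 autocorrelation check is a sensible preliminary before applying classical tests that assume independent observations. The sketch below uses an AR(1)-style series as a stand-in for simulation output; the coefficient, seed, and names are illustrative assumptions:

```python
import random
import statistics

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a simulation output series."""
    m = statistics.mean(x)
    num = sum((a - m) * (b - m) for a, b in zip(x, x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

# AR(1)-like output: each value depends on the last, much as successive
# simulated profiles depend on the preceding profile.
random.seed(7)
y = [0.0]
for _ in range(999):
    y.append(0.8 * y[-1] + random.gauss(0.0, 1.0))

r1 = lag1_autocorr(y)
# An r1 well above zero signals that tests assuming independent
# observations would overstate the information in this output.
print(round(r1, 2))
```

A value of r1 near the generating coefficient (here 0.8) confirms strong serial dependence, pointing the analyst toward output-analysis methods such as those in Law and Kelton (1982) rather than naive i.i.d. tests.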
References
Adelsberger, H.H., 1984. PROLOG as a simulation language. Proc. 1984 Winter Simulation Conf.: 501-504.
Adelsberger, H.H., and Neumann, G., 1985. Goal oriented simulation modeling using Prolog. Proc. Conf. on Modeling and Simulation on Microcomputers. SCS, San Diego, California: 42-47.
Adkins, G., and Pooch, U.W., 1977. Computer simulation: A tutorial. Computer, 10: 12-17.
Anderson, T.W., 1971. The Statistical Analysis of Time Series. John Wiley and Sons, Inc., New York.
Balci, O., and Sargent, R.G., 1984. A bibliography on the credibility assessment and validation of simulation and mathematical models. Simuletter, 15: 15-27.
Banks, J., and Carson, J.S., II, 1984. Discrete-event System Simulation. Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Banks, J., Gerstein, D.M., and Searles, S.P., 1986a. The verification and validation of simulation models: A methodology. Tech. Report, School of Industrial and Systems Eng., Georgia Tech., Atlanta, Georgia.
Banks, J., Gerstein, D.M., and Searles, S.P., 1986b. The verification and validation of simulation models: Unresolved issues. Tech. Report, School of Industrial and Systems Eng., Georgia Tech., Atlanta, Georgia.
Beidler, J., and Jackowitz, P., 1986. Modula-2. PWS Publishers, Boston, Massachusetts.
Bendat, J.S., and Piersol, A.G., 1971. Random Data: Analysis and Measurement Procedures. Wiley Interscience Publ., New York.
Blackman, R.B., and Tukey, J.W., 1958. The measurement of power spectra from the point of view of communications engineering. Bell Systems Tech. Jour., 37: 185-282, 485-569.
Blackman, R.B., and Tukey, J.W., 1959. The Measurement of Power Spectra. Dover Publ., New York.
Boehm, B.W., 1981. Software Engineering Economics. Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Bolch, B.W., and Huang, C., 1974. Multivariate Statistical Methods for Business and Economics. Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Booch, G., 1983. Software Engineering with Ada. Benjamin/Cummings, New York.
Box, G.E.P., and Jenkins, G.M., 1976. Time Series Analysis, Forecasting and Control. Holden-Day Inc., California.
Bratley, P., Fox, B.L., and Schrage, L.E., 1987. A Guide to Simulation. 2nd Ed., Springer-Verlag, New York.
Brown, R.G., 1962. Smoothing, Forecasting and Prediction of Discrete Time Series. Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Burnham, W.D., and Hall, A.R., 1986. PROLOG Programming and Applications. MacMillan, London.
Cellier, F.E., 1982. Progress in Modelling and Simulation. Academic Press, Inc., London.
Chatfield, C., 1984. The Analysis of Time Series: An Introduction. 3rd Ed., J.W. Arrowsmith Ltd., Bristol, England.
Colmerauer, A., Kanoui, H., and Van Caneghem, M., 1981. Last Steps Toward an Ultimate PROLOG. IJCAI-81, Vancouver, Canada: 947-948.
Copeiro, E., 1979. Extremal prediction of significant wave height. Proc. 16th Int. Conf. Coastal Eng., ASCE: 284-304.
Derman, C., Gleser, L.J., and Olkin, I., 1973. A Guide to Probability Theory and Application. Holt, Rinehart and Winston, Inc., New York.
Dutton, J.M., and Starbuck, W.H., 1971. Computer Simulation of Human Behaviour. Academic Press, Inc., New York.
Emshoff, J.R., and Sisson, R.L., 1970. Design and Use of Computer Simulation Models. Macmillan and Sons, New York.
Fishman, G.S., 1973. Concepts and Methods in Discrete Event Digital Simulation. John Wiley and Sons, Inc., New York.
Fishman, G.S., 1978. Principles of Discrete Event Digital Simulation. John Wiley and Sons, Inc., New York.
Fotheringham, A.S., and Knudsen, D.C., 1987. Goodness-of-Fit Statistics. Geo Books, Norwich, England.
Fox, J.M., 1982. Software and its Development. Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Franta, W.R., 1977. The Process View of Simulation. Elsevier North-Holland, Inc., New York.
Gafarian, A.V., and Walsh, J.E., 1969. Statistical approach for validating simulation models by comparison with operational systems. In: Proc. 4th Int. Conf. Operations Research. John Wiley and Sons, New York: 702-705.
Garratt, M., 1974. Statistical validation of simulation models. In: Proc. 1974 Summer Computer Simulation Conf. Simulation Councils, La Jolla, California: 915-926.
Gehani, N., 1985. Advanced C: Food for the Educated Palate. Computer Science Press, Rockville, Maryland.
Gordon, G., 1969. System Simulation. Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Gottman, J.M., 1981. Time Series Analysis: A Comprehensive Introduction for Social Scientists. Cambridge Univ. Press, Cambridge, England.
Graybeal, W.J., and Pooch, U.W., 1980. Simulation: Principles and Methods. Winthrop Publishers Inc., Cambridge, Massachusetts.
Haan, C.T., 1977. Statistical Methods in Hydrology. Iowa State Univ. Press, Iowa.
Hall, C.A.S., and Day, J.W., 1977. Systems and models: Terms and basic principles. In: C.A.S. Hall, and J.W. Day (Editors), Ecosystem Modeling in Theory and Practice. Wiley Interscience Publ., New York: 6-36.
Hallermeier, R., 1984. Added evidence on new scale law for coastal models. Proc. 19th Int. Conf. Coastal Eng., ASCE: 1227-1243.
Harris, D.L., 1972. Characteristics of wave records in the coastal zone. In: R.E. Meyer (Editor), Waves on Beaches and Resulting Sediment Transport. Academic Press, Inc., New York: 1-51.
Harrison, W., Rayfield, E.W., Boon, J.D., Reynolds, G., Grant, J.B., and Tyler, D., 1968. A time series from a beach environment. Tech. Mem. ERLTM-AOL 1, ESSA Res. Lab.
Hines, W.W., and Montgomery, D.C., 1980. Probability and Statistics in Engineering and Management Science. 2nd Ed., John Wiley and Sons, Inc., New York.
Ingels, D.M., 1985. What Every Engineer Should Know About Computer Modeling and Simulation. Marcel Dekker, Inc., New York.
International Mathematical and Statistical Library, 1987. IMSL Library Reference Manual, Vols. 1 and 2. IMSL Inc., Houston, Texas.
Jacoby, S.L.S., and Kowalik, J.S., 1980. Mathematical Modeling with Computers. Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Jenkins, G.M., and Watts, D.G., 1969. Spectral Analysis and its Applications. Holden-Day, San Francisco, California.
Johnson, N.L., and Kotz, S., 1970. Continuous Univariate Distributions, Vol. 1. Houghton Mifflin Co., Boston, Massachusetts.
Karplus, W., 1976. The spectrum of simulation. In: L. Dekker (Editor), International Congress on Simulation of Systems. North Holland Publ. Co., The Netherlands: 19-27.
Kashyap, R.L., and Rao, A.R., 1976. Dynamic Stochastic Models from Empirical Data. Academic Press, Inc., London.
Kaufman, W.C., and Pilkey, O.H., Jr., 1983. The Beaches Are Moving: The Drowning of America's Shoreline. Duke Univ. Press, North Carolina.
Khanna, J., and Andru, P., 1974. Lifetime wave height curve for Saint John Deep, Canada. Ocean Wave Measurement and Analysis, ASCE: 301-319.
Kheir, N.A., and Holmes, W.M., 1978. On validating simulation models of missile systems. Simulation, 30: 117-128.
Kleijnen, J.P.C., 1974. Statistical Techniques in Simulation. Part I. Marcel Dekker, Inc., New York.
Kleijnen, J.P.C., 1977. Generalizing simulation results through metamodels. Working Paper 77.070, Dept. Business and Economics, Katholieke Hogeschool, Tilburg, Netherlands.
Kleijnen, J.P.C., 1979. The role of statistical methodology in simulation. In: B. Zeigler, M.S. Elzas, G.J. Klir, and T.I. Oren (Editors), Methodology in Systems Modelling and Simulation. North-Holland Publ. Co., Amsterdam: 425-445.
Kleijnen, J.P.C., 1982. Experimentation with models: Statistical design and analysis techniques. In: F.E. Cellier (Editor), Progress in Modelling and Simulation. Academic Press, Inc., London, England.
Koopmans, L.H., 1974. The Spectral Analysis of Time Series. Academic Press, Inc., New York.
Kreutzer, W., 1976. Comparison and evaluation of discrete event simulation programming languages for management decision making. In: L. Dekker (Editor), Simulation of Systems. North-Holland Publ. Co., Amsterdam: 429-438.
Lakhan, V.C., 1981. Generating autocorrelated pseudo-random numbers with specific distributions. Jour. Statistical Computation and Simulation, 12: 303-309.
Lakhan, V.C., 1984. A Fortran '77 Goodness-of-Fit Program: Testing the Goodness-of-Fit of Probability Distribution Functions to Frequency Distributions. Tech. Publ. No. 2, Int. Computing Lab. Inc., Toronto, Ontario, Canada.
Lakhan, V.C., and Jopling, A., 1987. Simulating the effects of random waves on concave-shaped nearshore profiles. Geografiska Annaler, 69A: 251-269.
Lakhan, V.C., and LaValle, P.D., 1987. Modelling and simulating nearshore profile development. Presented at the CAGONT/ELDAAG Conf., October 16, 1987, Univ. Windsor, Ontario.
Lakhan, V.C., and LaValle, P.D., in press. Development and testing of a stochastic model to simulate nearshore profile changes. Studies in Marine and Coastal Geogr., No. 7, Saint Mary's University, Halifax.
Law, A.M., and Kelton, W.D., 1982. Simulation Modeling and Analysis. McGraw-Hill Co., New York.
Lehman, R.S., 1977. Computer Simulation and Modelling: An Introduction. John Wiley and Sons, Inc., New York.
Le Méhauté, B., and Wang, S., 1984. Effects of measurement error on long-term wave statistics. Proc. 19th Int. Conf. Coastal Eng., ASCE: 345-361.
Longuet-Higgins, M.S., 1952. On the statistical distribution of the height of sea waves. Jour. Marine Res., 11: 245-266.
McArthur, D., Klahr, P., and Narain, S., 1984. ROSS: An object-oriented language for constructing simulations. The Rand Corp., R-3160-AF.
Mihram, G.A., 1972. Simulation: Statistical Foundations and Methodology. Academic Press, Inc., New York.
Miller, D.R., 1974. Model validation through sensitivity analysis. Proc. 1974 Summer Computer Simulation Conf., Simulation Councils, La Jolla, California: 911-914.
Morgan, B.J.T., 1984. Elements of Simulation. Chapman and Hall, London.
Morse, P.M., 1977. ORSA twenty-five years later. Oper. Res., 25: 186-188.
Mullarney, A., and West, J., 1987. ModSim: A language for object-oriented simulation. Design specification. CACI Tech. Report, La Jolla, California.
Myers, G.J., 1979. The Art of Software Testing. John Wiley and Sons, Inc., New York.
Nance, R.E., 1981. Model representation in discrete event simulation: the conical methodology. Tech. Report CS81003-R, Virginia Tech., Blacksburg, Virginia.
Naylor, T.H., Balintfy, J.L., Burdick, D.S., and Chu, K., 1966. Computer Simulation Techniques. John Wiley and Sons, Inc., New York.
Naylor, T.H., and Finger, J.M., 1967. Verification of computer simulation models. Management Sci., 14: B92-B101.
Nihoul, J.C.J. (Editor), 1975. Modelling of Marine Systems. Elsevier Scientific Publ. Co., Amsterdam.
Novosad, J.P., 1982. Systems Modeling and Decision Making. Kendall/Hunt Publishing Co., Iowa.
Ord-Smith, R.J., and Stephenson, J., 1975. Computer Simulation of Continuous Systems. Cambridge Univ. Press, Cambridge, England.
Oren, T.I., 1984. Foreword. In: Zeigler, B.P., Multifacetted Modelling and Discrete Event Simulation. Academic Press, London.
Orme, A.R., 1982. Temporal variability of a summer shore zone. In: C.E. Thorn (Editor), Space and Time in Geomorphology. George Allen and Unwin, London: 285-313.
Ou, S.H., and Tang, F.L.W., 1974. Wave characteristics in the Taiwan Straits. Ocean Wave Measurement and Analysis, ASCE: 139-158.
Parzen, E., 1963. Notes on Fourier analysis and spectral windows. Tech. Report No. 48, Dept. Statistics, Stanford Univ. In: E. Parzen, Time Series Analysis Papers. Holden-Day, San Francisco, California.
Poncet, F., 1987. SADL: a software development environment for software specification, design and programming. In: G. Goos, and J. Hartmanis (Editors), Lecture Notes in Computer Science. Springer-Verlag, Berlin: 3-11.
Popper, K.R., 1968. The Logic of Scientific Discovery. Harper Torchbooks, New York.
Priestley, M.B., 1981. Spectral Analysis and Time Series. Vol. 1, Academic Press, Inc., New York.
Pyle, I.C., 1985. The Ada Programming Language: A Guide for Programmers. 2nd Ed., Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Ritchie, D.M., 1980. The C Programming Language Reference Manual. AT&T Bell Laboratories, Murray Hill, New Jersey.
Rivett, P., 1972. Principles of Model Building. John Wiley and Sons, Inc., London.
Ross, S.M., 1981. Introduction to Probability Models. 2nd Ed., Academic Press, Inc., New York.
Salas, J.D., Delleur, J.W., Yevjevich, V.M., and Lane, W.L., 1980. Applied Modeling of Hydrologic Time Series. Water Resources Publ., Littleton, Colorado.
Sargent, R.G., 1982. Verification and validation of simulation models. In: F.E. Cellier (Editor), Progress in Modelling and Simulation. Academic Press, Inc., London: 159-172.
Sargent, R.G., 1984. An expository on verification and validation of simulation models. Unpubl. update of a "Tutorial on verification and validation of simulation models". In: Proc. 1984 Winter Simulation Conf., IEEE: 573-577.
Schellenberger, R.E., 1974. Criteria for assessing model validity for managerial purposes. Decision Sci., 5: 644-653.
Schmidt, J.W., and Taylor, R.E., 1970. Simulation and Analysis of Industrial Systems. Richard D. Irwin, Inc., Homewood, Illinois.
Shannon, R.E., 1975. Systems Simulation: The Art and Science. Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Short, A.D., 1979. Wave power and beach stages: A global model. Proc. 16th Int. Conf. Coastal Eng., ASCE: 1145-1162.
Society for Computer Simulation (SCS), 1984. Catalog of simulation languages. Simulation, 43: 180-192.
Society for Computer Simulation (SCS), 1985. Additions to the catalog of simulation software. Simulation, 44: 106-108.
Sommerville, I., 1982. Software Engineering. Addison-Wesley, Reading, Massachusetts.
Spriet, J.A., and Vansteenkiste, G.C., 1982. Computer-aided Modelling and Simulation. Academic Press, Inc., London.
Sworder, D.D., 1971. Systems and simulation in the service of society. Simulation Councils Proc., 1: 149-163.
Tayfun, M.A., 1977. Linear random waves on water of non-uniform depth. Ocean Eng. Report 16, Part II, Dept. Civil Eng., Univ. Delaware, Newark, Delaware.
Theil, H., 1961. Economic Forecasts and Policy. North-Holland Publishing Co., Amsterdam, The Netherlands.
Tocher, K.D., 1963. The Art of Simulation. D. Van Nostrand Co., Inc., Princeton, New Jersey.
Traister, R.J., Sr., 1984. Programming in C for the Microcomputer. Prentice-Hall, New Jersey.
Turing, A.M., 1963. Computing machinery and intelligence. In: E.A. Feigenbaum, and J. Feldman (Editors), Computers and Thought. McGraw-Hill, New York: 11-15.
Umphress, D.A., 1987. Model execution in a goal-oriented discrete event simulation environment. Ph.D. Thesis, Texas A & M Univ., College Station, Texas.
Unger, B.W., 1986. Object oriented simulation: Ada, C++, Simula. Proc. 1986 Winter Simulation Conf., IEEE, Piscataway, New Jersey: 123-124.
U.S. Department of Defense, 1982. Reference Manual for the Ada Programming Language. Draft revised MIL-STD 1815, ACM AdaTEC Special Publ., U.S. Dept. Defense, Washington, D.C.
U.S. General Accounting Office, 1979. Guidelines for model evaluation. PAD-79-17, U.S. G.A.O., Washington, D.C.
Van Horn, R.L., 1971. Validation of simulation results. Management Sci., 17: 247-258.
Vaucher, J., and Lapalme, G., 1987. Process-oriented simulation in PROLOG. In: P.A. Luker, and G. Birtwistle (Editors), Simulation and AI. SCS, San Diego: 41-48.
Watts, D., 1969. Time series analysis. In: T.H. Naylor (Editor), The Design of Computer Simulation Experiments. Duke University Press, Durham, North Carolina: 165-179.
Wiener, R., and Sincovec, R., 1984. Software Engineering with Modula-2 and Ada. John Wiley and Sons, New York.
Wirth, N., 1982. Programming in Modula-2. 2nd (corrected) Ed., Springer-Verlag, Berlin.
Wirth, N., 1985. Programming in Modula-2. 2nd Ed., Springer-Verlag, New York.
Wright, L.D., Short, A.D., and Green, M.O., 1985. Short-term changes in the morphodynamic states of beach and surf zones: An empirical predictive model. Marine Geol., 62: 339-364.
Yeh, R.T., 1977. Current Trends in Programming Methodology. Vol. I: Software Specification and Design; Vol. II: Program Validation. Prentice-Hall, Inc., New York.
Zeigler, B.P., 1976. Theory of Modelling and Simulation. John Wiley and Sons, Inc., New York.
Zeigler, B.P., 1984. Multifacetted Modelling and Discrete Event Simulation. Academic Press, London.