Computers & Educ. Vol. 14, No. 3, pp. 263-270, 1990. Printed in Great Britain
0360-1315/90 $3.00 + 0.00 Pergamon Press plc
COMPUTER SIMULATION OF LABORATORY EXPERIMENTS: AN UNREALIZED POTENTIAL

D. J. MAGIN¹ and J. A. REIZES²

¹Professional Development Centre and ²School of Mechanical and Industrial Engineering, University of New South Wales, Box 1, Kensington, NSW 2033, Australia

(Received 18 April 1989)
Abstract - In the early seventies there was a burgeoning interest in the development and use of computer simulated laboratory experiments as replacements for "hands-on" experimentation. Subsequent studies of computer simulated experiments have identified serious limitations inherent in their use. Within engineering, concern has been expressed about the effect of substituting empirically derived output data by values determined through analysis and the manipulation of equations. This has been seen as resulting in students losing sight of the fact that the performance characteristics of devices and the behaviour of complex dynamic processes often cannot be predicted accurately by analytic methods. The paper describes work currently being undertaken to develop a simulation package (of a heat exchanger) which is designed to overcome these limitations. Built into the software is a set of functions which transform the data output produced by the modelling of the heat exchange process into a form which mimics the vicissitudes of measurement and data acquisition which a student is likely to experience during an actual laboratory investigation. The argument is advanced that the new approach has the capability to develop specific experimentation skills previously unachieved either in "hands-on" experiments or in conventional computer simulated experiments; and to promote an enhanced understanding and appreciation of the role of experimentation in engineering.
INTRODUCTION
In 1971 interactive computing facilities became available for use in undergraduate instruction in mechanical engineering courses at the University of New South Wales. In addition, the general purpose programming language, APL, was introduced. These facilities led to the development of a suite of packages which simulated a range of engineering systems and devices[1]. This was seen as opening up the prospect of using simulated experiments as a supplement to and extension of laboratory investigations in undergraduate courses. Although evaluations carried out in 1972[2] and 1973[3] indicated the apparent success of these innovations, by the mid-eighties their use in supplementing laboratory course work had much diminished. Whilst the initial prospect of using the packages for developing students' experimentation skills remained largely unrealized, their application in two related areas grew: that of providing a functional understanding of dynamic processes through the manipulation of input parameters and observing their effect on output variables; and that of computer-aided design. The view that such packages should not be used as replacements for, or extensions of, actual laboratory investigation had been espoused by Tawney in 1976[4], and subsequently taken up in later years by a number of other investigators[5,6]. Recently, however, there has been a renewed interest in exploiting the potential of computer simulation packages in the service of developing students' experimentation skills. These skills were seen to include decision-making relating to procedures used; critical awareness of the methodologies employed and problems of validation; and the interpretation of results, particularly where experimental results are at variance with those predicted by analysis based on theory or modelling. In 1985 Barnard[7] suggested making simulated packages more akin to actual laboratory investigations by building in "system error".
Also, in 1988, Shacham and Cutlip[8] drew attention to the advantages of employing simulated experiment packages which embodied the kinds of variation in output data typically found during laboratory procedures. In their view, an "authoring system has to have capability of adding normally distributed random experimental error to the results if requested by the author" (p. 277). The programming language APL, used in mechanical engineering, has the authoring language capability sought by them. Even in 1971, when APL was first being employed on our own campus,
this capability was being exploited to provide transformation of data output to a form designed to mimic "normally distributed random experimental error" (as advocated in 1988 by Shacham and Cutlip). Indeed, one of the packages which was already in use in 1972 had built in output data variation of this kind. The publication by De Vahl Davis and Holmes[1] included description of a feature of the "Heat Exchanger" package in which a "small, varying amount of random, normally distributed error has been introduced into the results to simulate experimental scatter" (p. 7). This article describes our own work in the instructional design and software development of a package simulating a heat exchange device. The point of departure from previous approaches to simulated experiments is that the software has built in a set of functions which transforms the data output produced by the modelling of the heat exchange process into a form which mimics the vicissitudes of measurement and data acquisition which a student is likely to experience during an actual laboratory investigation. It is emphasized that the developments described reach beyond that of simply building in "experimental scatter" to simulate errors of calibration and measurement. The behaviour of heat exchangers (in common with many engineering devices based on complex dynamic processes) cannot be predicted accurately by theory and modelling. The vagaries of the heat exchange process itself result in output states which vary over time even though all other relevant parameters are kept constant. Further, individual heat exchangers, built to the same design specifications, often exhibit distinctly different performance characteristics.
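As a minimal sketch of the kind of transformation just described (written here in Python for illustration rather than the original APL; the one per cent scatter level is an assumed figure, not a value taken from the package):

```python
import random

def with_scatter(model_value, rel_sigma=0.01):
    """Perturb a modelled output with normally distributed
    'experimental scatter' of relative standard deviation rel_sigma."""
    return model_value * (1.0 + random.gauss(0.0, rel_sigma))
```

Applied to every value a simulated "run" reports, even this simple transform removes the exact repeatability that betrays a conventional simulation.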
The features which have been built in to the package have accordingly provided for the "vicissitudes" of output values which would typically be found when repeating performance tests of a particular heat exchanger, and when testing several different heat exchange rigs built to the same specifications. These initiatives have been influenced by developments (both on our own campus and elsewhere) in the use of "simulated experiment" packages in laboratory investigations, and by our experiences in the organization and evaluation of experimental engineering courses. These are outlined in the following two sections.

BACKGROUND

By the mid-70s the use of simulated packages was well established[4]. Laurillard's review of current activity in the U.K. at that time evidenced an increased potential for their use in experimentation, whilst also indicating that resistance to their increased use had developed:

Several packages have been developed to simulate a practical experiment (e.g. population dynamics), often one that is too difficult or expensive to do in the laboratory. ... The success of these packages has led to the possibility of some future packages being designed to replace existing laboratory experiments. This idea has met with strong resistance from teachers who feel that students should be able to practise experimental skills. But some experiments have, as one of their aims, the greater understanding of related theory, and it is here where, as the students point out, the computer does better. CAL is likely, therefore, to provide an alternative to at least some aspects of laboratory work[9, p. 245].

At this time several investigators were beginning to express concern about the use of such packages as an alternative to laboratory work. Tawney[4] identified two areas of concern, the first relating to the need for an examination of the objectives of laboratory work:

Several authors ... compare computer simulations with laboratory experiments, regarding them as extensions or replacements for these. This comparison can be unsatisfactory for it avoids the necessity of examining the objectives which laboratory experiments hope to further[4, p. 27].

The second concern arose from work at Imperial College in using computer simulations of experiments in heat exchange and fluid flow as part of a course in fluid mechanics. Here it was found that students were not developing a critical awareness of the limitations of simulations as representations of actual behaviour. According to Tawney, simulations which are used as substitutes for, or extensions of, laboratory experiments "do not reflect a clear understanding of the relationship between theory and observation in science, in particular, between models and 'real
life' " (p. 3). A further point is made that "the processes by which a computer produces tables of data from an equation resemble more closely the processes by which students working with pencil, paper and slide rule would produce them than they do natural phenomena" (p. 20). Barnard[7] had also expressed concerns about students' lack of concern for validating their results, both from laboratory investigation and from the use of simulation packages. Barnard, however, suggested that instead of abandoning the use of packages, they could be modified through "deliberately provoking system errors in order to make students aware of these sources of error, and to develop an appreciation of the limits of system resolution" (p. 91). By the late 80s the reporting of computer use in relation to developing experimentation skills in engineering had been directed towards uses other than the simulation of experiments, and mostly in connection with the employment of computer management of laboratory results to provide rapid and efficient data acquisition, analysis and control[6,10,11].

DEVELOPMENTS AT THE UNIVERSITY OF NEW SOUTH WALES
Computer simulations of engineering systems and devices have been used for undergraduate instructional purposes for almost two decades. At the University of New South Wales their use in mechanical engineering as supplements to laboratory investigation was pioneered in 1971 by De Vahl Davis and associates[1]. Software and courseware were developed at that time which incorporated simulations of a number of engineering systems and devices. These included aircraft vibration; thermodynamic performance of a refrigeration compressor; grillage design (ship construction); and pipe flow in large reticulation systems. The programs were written in APL and were designed to model systems which could not be studied in the laboratory, or whose scaled-down laboratory counterparts did not reproduce the performances and characteristics of the full scale system. It was intended that these simulated systems would be used as supplements to students' laboratory work[1]. In the following year a study of student use of these programs was conducted[2] and established that: students were using the programs in the manner anticipated by the program developers; the majority of students viewed them as effective learning tools; and the simulations were perceived by students as realistic models of engineering behaviour. In 1973 an evaluation study of the use of one of these packages, "pipe flow in a reticulation system", was carried out[3]. This exercise was chosen because it was considered to be one of the more successful and instructive simulations used at that time. The findings from this study provided evidence of substantial student learning accruing from use of the simulated experiment, and that gains in knowledge (as measured by criterion tests) substantially exceeded those achieved by equivalent student groups who had covered the same material through conventional teaching provisions.
The criterion tests had been designed to measure students' ability to determine reasonable estimates of the value of specified parameters under differing pipe flow conditions and reticulation geometries. However, our observations of students' use of the simulation package gave rise to a concern about the extent to which simulations of this kind could fulfil important laboratory objectives, and particularly those relating to validation of results: "Another question of quite pervasive implications arose from our study. This was the question of validation. To what extent do students 'trust' their computer simulation results, the model on which they are based, and what means of verification, implicit or explicit, do they employ?"[3, p. 142]. This concern arose because it was evident that students often appeared blind to order-of-magnitude errors which they would be unlikely to ignore in the laboratory or field. Also, they were often unquestioning of the limits of accuracy of the model underlying the simulation. This manifested itself in students attaching a spurious precision to their findings, although they should have been aware that for several of the simulations the mathematical model only crudely approximated the actual behaviour of the system. In 1975 the third year laboratory course in "experimental engineering" was reorganized so that the students themselves were required to decide: what is to be measured; what procedures can be used to provide reliable measurements; how available equipment can be used to provide these data;
and how results and procedures can be validated. As part of this reorganization, use was made of "desk-top" computing facilities within the laboratory. This allowed students to compare their experimental results with profiles of theoretical values determined by an appropriate model built into the computer program. In many instances, the use of the computer model as an adjunct to experimentation enabled students to identify errors in methodology and procedure[12]. However, one disquieting feature we observed at that time was that "students indicated an implicit faith in the authority and validity of the computer generated profiles. Further enquiry needs to be undertaken into how students actually set about determining the validity of their own experimental data"[12, p. 47]. Evaluations of students' work within the reorganized laboratory course in subsequent years further emphasized the importance of students developing a critical awareness of methodology and the procedures used, and how their results can be validated. For example, we had found that for an "asymmetric heating" experiment a substantial number of student groups had arrived at grossly anomalous results through the application of incorrect theory, and that "the most common reaction was to suspect errors in instrumentation, measurement or calculation. Few were prepared to initially consider the possibility of error through applying inappropriate theory" (Reizes and Magin, 1978[13, p. 87]). These experiences, together with those emerging from the literature at the time (and particularly from the studies sponsored by the National Development Programme in Computer Assisted Learning in the U.K.) suggested that our initial conception of the role of computer simulated experiments in the development of students' experimental engineering skills had been largely misplaced, or at least deficient.
We had come to realize that simulated experiment packages were not providing the kinds of experiences which were seen as central to the development of skills of experimentation as outlined in the Introduction. Perhaps somewhat paradoxically, given our understanding of these limitations, we were driven to consider the further development of computer simulated experiments within experimental engineering; and, in a sense, return to the aspirations held in the early 70s. This arose from attempts to address a perennial problem in connection with the teaching of "heat exchange".

Teaching heat exchange

Laboratory investigation of heat exchange devices has long been regarded as an essential aspect of undergraduate instruction in heat transfer. Traditionally, experimentation has been undertaken to meet three learning objectives. First, students need some familiarity and understanding of the operation and control of heat exchange devices found in refrigerators and air conditioners. The other two learning objectives are: (i) to develop a functional understanding of the effect on performance of varying different input parameters; and (ii) to provide experience in the interpretation of experimental data output and in matching these data to results derived from analysis. The achievement of these objectives is seen as integral to the development of skills required for designing, testing and modifying heat exchange devices. From our experience, few students satisfactorily accomplish these latter two instructional objectives. The functional understanding sought (sometimes referred to as "parametric feel") is that which enables students to manipulate key parameters to efficiently achieve optimization and control of output. This is severely constrained by resource and time limitations in the laboratory: students rarely have the opportunity to conduct sufficient "runs" of the laboratory rig to achieve other than a most rudimentary comprehension.
The second area of difficulty is that of interpreting experimental data and comparing these with analytically determined values. Students often find that their experimental results are grossly at variance with those predicted by analysis. We have found that few students appreciate that the major part of the observed discrepancies is most likely to arise through the limitations inherent in the modelling equations themselves, rather than through flaws in experimental methodology or errors of measurement. The developed model of the heat transfer process involves a number of steps which (from the students' perspective) employ quite sophisticated mathematical procedures. However, embedded within the model are empirical relations which are derived from experiment. The result is that the
final equations provide predicted values which deviate by up to ±25% from rigorous experimental measurements[14]. Although attempts are made during lectures to instil an understanding of the essentially empirical basis of the model and its limitations, it is apparent that this understanding is not transferred to the interpretation of laboratory data.

Employing simulation
Many of the engineering devices and phenomena studied in experimentation courses involve complex dynamic processes which are not completely understood, and unable to be modelled with precision. Indeed, the raison d'être for teaching "experimental engineering" rests on the need to develop students' skills in solving engineering problems of this kind, i.e. problems which cannot be resolved by means other than empirical investigation. As outlined earlier, there has been an increased awareness of the danger of employing computer simulated experiments where the simulation is based on a simplified or incompletely understood model. Yet it has also been demonstrated that simulated experiments have been used successfully in developing a "feel" for the effect of parameter change (and through this develop skills in the optimization and control of output from devices based on complex dynamic processes); and that these learning outcomes are often unable to be achieved through conventional laboratory work provisions. However, in achieving this aim, there is the danger of unwittingly promoting a false understanding of the empirical base of present knowledge about many complex physical processes (such as turbulent flow and heat transfer); and, through this, diminishing the perceived importance of empirically investigating phenomena of this kind. This false understanding arises from the artificial nature of the simulation: whenever the same values for input parameters are entered into the computer, the resultant output values will remain constant. Students soon come to realize that such packages quickly produce results which are repeatable and which have a similitude of precision; whilst similar hands-on experimentation is found to be time consuming, to require interpretation and checking for experimental errors, and often to produce puzzling and unexplained anomalies when subject to repeat experiments.
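The empirical base referred to here is visible in the correlations themselves. The widely used Dittus-Boelter form for turbulent tube flow, Nu = 0.023 Re^0.8 Pr^n, is a curve fit to experimental data and is commonly quoted as reliable only to roughly ±25%. A sketch (the correlation is standard textbook material, not code drawn from our package):

```python
def dittus_boelter_nu(re, pr, n=0.4):
    """Nusselt number from the Dittus-Boelter correlation for
    fully developed turbulent flow in a tube (n = 0.4 for heating,
    0.3 for cooling). An empirical fit, good to roughly +/-25%."""
    return 0.023 * re ** 0.8 * pr ** n
```

A deterministic simulation built on such a relation reports values like these to full machine precision, with no hint of the quarter-band of uncertainty behind them.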
It is understandable that students' experience in using simulation packages would lead them to conclude that the use of such packages is superior to laboratory testing in determining the performance characteristics of devices such as heat exchangers. As we had reported earlier, when students are confronted with gross discrepancies between their experimental results and those predicted by (incorrectly formulated) theoretical analysis, it is the laboratory results which are discounted, whilst their formulation of theory remains unchallenged[15]. These considerations have led us to initiate a new approach to overcome some of the problems associated with teaching heat transfer. The approach being used is that of developing computer simulation packages which produce output data similar to that which would obtain from students' hands-on laboratory investigations.

The development of simulated experiment courseware
The software and courseware being developed are part of the development of a curriculum segment outline for the teaching of heat transfer within the mechanical engineering course at the University of New South Wales. The packages for simulation of heat exchange are designed to link and integrate experimentation, design and theoretical analysis. This is approached through two stages of instruction. In the first stage the linkage is established between experiment and theory. Students (in small groups) carry out a conventional "heat exchange" laboratory experiment in which they are required to determine "heat transfer coefficients" (this involves operating the rig, varying parameters, measuring outputs and calculations). Their results determined by experiment are then compared with those predicted by computer modelling. Here, we introduce several different (but conventional) computer models for use by each group. Each of these packages employs different correlational equations which are commonly found in the various standard texts, and accordingly yield differing results. Also, each student group, through the application of these different models, is required to discuss and report (i) differences found between their experimental results and those determined by computer for the different correlational
equation models, and (ii) differences in results produced by the various models when using the same input data. The provision of several different computer-processed models (instead of a single model) has been made to enable students to develop an appreciation of not only the value of, but also the limitations inherent in using modelling to predict performance; and. as a consequence, increase their awareness of the importance of empirical testing and validation of results. The next stage consists of two parts. First, students carry out a standard CAD exercise in which they are given the task of using a computer simulated package to design a heat exchanger to meet specified performance criteria (unique to each student). The package essentially employs one of the process models used in stage one. After completion of this exercise, the students are advised that they can use a specially developed simulation package to “build prototypes” to their design specification, and “test” each prototype. Here, students are made aware that the simulation package has been designed to reproduce as faithfully as possible the performance of actual prototypes built to normal manufacturing tolerances; and that the simulation of the “testing” phase has been designed to reproduce as closely as possible the results which would obtain from rigorous testing procedures. The production of the software for this special package is currently under development. Here, the major task is one of simulating the behaviour and obtained measurements from an actual device, rather than simply simulating the heat exchange process. In brief, this entails the development of a program which: mimics the fluctuations in output values which arise through variations occurring randomly (over short intervals) in convective heat transfer rates. 
This is achieved through the use of a pseudo-random function; (ii) takes account of the fact that prototypes built to the same design specifications usually exhibit consistent individual differences in performance characteristics. Part of the explanation for this lies in the cumulative effects of small deviations (within manufacturing tolerances) from design specification, although other factors, as yet unknown, are considered to contribute to these idiosyncrasies. To mimic this phenomenon, the program builds in for each individual “prototype” a percentage variation from values determined by the correlational equations. This variation is both unique to each individual prototype, and consistently employed in ail subsequent “test runs” of the designated prototype; (iii) provides for the normal kinds of “experimental scatter” found in laboratory experimentation. This scatter is built into the program by applying probability density functions to all input and output parameters which are subject to measurement during experimentation. In practice, the effects of these variations are small compared with those produced through (i) and (ii) above. (i)
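The three provisions can be sketched as follows (in Python for illustration; the original software uses APL, and the class name, tolerance band and scatter levels below are assumptions of ours, not the package's actual values):

```python
import random

class HeatExchangerPrototype:
    """Sketch of a simulated 'prototype' embodying provisions (i)-(iii).
    The parameter values are illustrative placeholders."""

    def __init__(self, seed=None):
        rng = random.Random(seed)
        # (ii) a fixed, prototype-specific deviation (here within an
        # assumed +/-5% manufacturing-tolerance band), applied
        # consistently to every test run of this prototype
        self.bias = rng.uniform(-0.05, 0.05)

    def test_run(self, model_q, rel_fluct=0.02, rel_scatter=0.005):
        """Return one 'measured' output for a modelled value model_q."""
        q = model_q * (1.0 + self.bias)
        # (i) short-interval random fluctuation in convective transfer
        q *= 1.0 + random.gauss(0.0, rel_fluct)
        # (iii) smaller measurement scatter on the recorded value
        q *= 1.0 + random.gauss(0.0, rel_scatter)
        return q
```

Repeat runs on one prototype then vary from run to run through (i) and (iii), while two prototypes "built" to the same specification differ consistently through their individual bias terms.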
Interaction with this package will result in students being faced with interpreting a situation which incorporates many of the contingencies of actual experimentation. They will, for example, be confronted with the fact that repeat test runs on the same prototype will produce different results. They will also find that two heat exchangers, "built" to the same specifications, will exhibit distinguishably different performance characteristics. We see this exercise as providing a means of fulfilling learning objectives which hitherto have not been successfully met. In particular, we believe that the experiences incorporated within the second stage will enable students to: (i) appreciate the need in many engineering situations to test designs through building and testing prototypes; and provide experience in carrying out the steps involved in testing in a way consistent with those which would be undertaken in a laboratory; (ii) develop an awareness that prototypes, even when carefully built to specification, can vary in their performance characteristics, and can depart substantially from the output values predicted by modelling processes; and (iii) realize the inherent variability of the heat exchange process, and the consequent role of experimentation in understanding such processes. It is further anticipated that student interaction with computer simulated experiments in the manner described above will provide a means by which design, experimentation and analysis are integrated within a curriculum segment.
DISCUSSION

The description of developments in our own work relates to a specialist part of engineering. However, the implications which flow from successful implementation of this approach reach much wider. These can be illustrated from several different perspectives.

Experimentation
Within engineering and the applied sciences, experimentation has continued to be beset by difficulties which, hitherto, have not been adequately resolved. Here, we refer to a form of laboratory investigation regarded as essential to many of these fields, namely the operation and control of devices to produce specified performance functions. Usually, these incorporate iterative optimization procedures based on data feedback and a qualitative understanding of the effect of varying key parameters. This is rarely achieved satisfactorily because resource constraints and time limitations severely curtail the number of test runs which can be made. A second difficulty concerns the issue of validation of results. As we have previously outlined in reference to earlier studies[16], students rarely see validation as problematic. That is, experimental results and their interpretation are seen as verifiable through repeat measures and calculation checks; and, where available, through comparisons with theoretical results. Few students develop an understanding that the essential point of experimentation is that it provides information which is not obtainable by analysis. Early proponents of the use of computer simulated experiments saw a potential for such packages to provide some remedy to these two perennial problems. Although, as we have indicated, packages of this kind have been shown to be successful in developing an understanding of the effects of parametric change, and in providing a mechanism for determining correspondences between experimental and theoretical results, they do so at considerable cost. That is, they have an allure which distorts the students' understanding of the empirical basis and contingent nature of a large part of current engineering knowledge.
The approach we have embarked upon is seen as combining the learning opportunities of traditional simulation packages in a way which will enhance, rather than detract from, the students' understanding of the nature of experimentation.

Integration of theory, design and experimentation
Many of the engineering sciences have not found ways to adequately integrate design, analysis and experiment within their curricula. This arises because in many instances it is simply not feasible to actually build prototypes and test them in the laboratory. Both for this practical reason, and because of long traditions within the different disciplines, there has been a separation of design training from experimentation. This results in few students in undergraduate courses being set tasks which formally require them to integrate their developing knowledge and skills in these two areas[17]. The instructional design described in this paper provides a mechanism for achieving a nexus between theory, design and experiment in a segment of the curriculum, and through this provide for skill development and knowledge which will add to an understanding of the complementary nature of design and experimentation.

A new role for computer simulation
Computer simulation packages which model dynamic processes can be used in two quite different ways. First, the simulation may be seen as based on a simplified, tendential model of behaviour, in which the learning task is one of developing an understanding of basic concepts and a "feel" for the likely effects of parametric change (e.g. "population growth" simulations as described by Henderson et al.[18]). Here, students are unlikely to confuse the simulation for the "reality". A second approach, often used in engineering, is that of using the simulation as a tool for analysis, control or design. These simulations incorporate "state-of-the-art knowledge" to predict actual behaviour under different conditions. In our view, where such models have been found to consistently provide accurate representations of behaviour, the rationale for engaging in experimentation is lost.
What we have proposed is an approach which employs simulations where the underlying processes are imperfectly understood, are unable to be modelled accurately, and where they are used in conjunction with hands-on experimentation. Packages of the kind we have outlined earlier, with appropriate courseware that builds in the contingencies of actual experimentation, can provide learning experiences contributing both to the development of experimentation skills and to a full appreciation of the crucial role of actual empirical investigation.

CONCLUDING REMARKS
The packages currently being developed will be subject to pilot trials and modifications in 1989. It is anticipated that (with modifications) the packages and instructional design as outlined in this paper will be incorporated in reorganized curricula scheduled for 1990. Contingent on a successful outcome to our work, we believe that the approach described will provide a means by which computer simulated experiments can realise the potential that was envisaged almost two decades ago.

REFERENCES

1. De Vahl Davis G. and Holmes W. N., The use of APL in engineering education. Systems Development Institute Monograph (IBM), Canberra, Form No. SDI-005 (1971).
2. Forth I., Simulated experiments in engineering: a pilot evaluation. TERC R&D Papers, 72/20, Tertiary Education Research Centre (UNSW), Kensington (1972).
3. Reizes J. A., Holmes W. N. and Magin D. J., Use of interactive terminals in computer simulation exercises: some educational developments at the University of New South Wales. Proceedings of a National Conference on Engineering Education, pp. 138-142. Institution of Engineers, Australia, 73/3, Sydney (1973).
4. Tawney D. A., Simulation and modelling in science computer assisted learning. Technical Report No. 11, National Development Programme in Computer Assisted Learning, London (1976).
5. Faucher G., The role of laboratories in engineering education. Int. J. Mech. Engng Educ. 13, 195-198 (1985).
6. MacAlpine J. M., Computers in the teaching laboratory: experience with electrical engineering programs. Computers Educ. 12, 401-406 (1988).
7. Barnard R. H., Experience with low cost laboratory data. Int. J. Mech. Engng Educ. 13, 91-96 (1985).
8. Shacham M. and Cutlip M., Authoring systems for laboratory experiment simulators. Computers Educ. 12, 277-282 (1988).
9. Laurillard D. M., The design and development of CAL materials in undergraduate science. Computers Graph. 2, 241-247 (1977).
10. Lai J., An automated data acquisition system for experiments on fully-developed pipe flow. Int. J. Mech. Engng Educ. 14, 83-96 (1986).
11. Wepfer W. and Oehmke R., Computer based data acquisition in the undergraduate laboratory. Computers Educ. 11, 21-32 (1987).
12. Magin D. J., Reizes J. A. and Sivyer P. H., Fostering student initiative in the laboratory. Proceedings of a National Conference on Engineering Education, pp. 43-47. Institution of Engineers, Australia, 76/8, Melbourne (1976).
13. Reizes J. A. and Magin D. J., Improving experimentation skills: an evaluation of a reorganized course in experimental engineering. Proceedings of a National Conference on Engineering Education, pp. 86-89. Institution of Engineers, Australia, 78/6, Sydney (1978).
14. Holman J. P., Heat Transfer, 6th edition. McGraw-Hill, Singapore (1986).
15. Magin D. J. and Reizes J. A., Teaching experimental engineering in the laboratory: outline and evaluation of a course at the University of New South Wales. Int. J. Mech. Engng Educ. 7, 49-54 (1979).
16. Magin D. J., Confidence and critical awareness as factors in the development of experimental skills in laboratory courses. Higher Educ. 13, 275-288 (1984).
17. Magin D. J., Churches A. E. and Reizes J. A., Design and experimentation in undergraduate mechanical engineering. Proceedings of a Conference on Teaching Engineering Designers, pp. 96-100. Institution of Engineers, Australia, UNSW, Sydney (1986).
18. Henderson E., Kinzett S. and Lockwood F., Developing POPTRAN, a population modelling package. Br. J. educ. Technol. 19, 184-192 (1988).
19. Harrison D. and Pitre J., Computers in a teaching laboratory: just another piece of apparatus. Computers Educ. 12, 261-267 (1988).