Simulation software in a life sciences practical laboratory



Pergamon S0360-1315(96)00011-5

Computers Educ. Vol. 26, No. 1-3, pp. 101-112, 1996. Copyright © 1996 Elsevier Science Ltd. Printed in Great Britain. All rights reserved. 0360-1315/96 $15.00 + 0.00

SIMULATION SOFTWARE IN A LIFE SCIENCES PRACTICAL LABORATORY

ERICA MCATEER,¹ DOUGLAS NEIL,² NIALL BARR,² MARGARET BROWN,¹ STEVE DRAPER¹ and FIONA HENDERSON¹

¹Department of Psychology and ²Institute of Biomedical Life Sciences, University of Glasgow, Glasgow G12 8RT, Scotland

INTRODUCTION

Several general reasons can be put forward for using simulations in a teaching context:

• They provide a safe environment within which students can test hypotheses and study outcomes.
• Students can use the software out of class times for reinforcement or revision and self-testing.
• Teacher or demonstrator contact time can be diverted to other tasks.

In Life Science teaching, there are further compelling reasons for introducing simulations into practical classes, in substitution for traditional "wet" labs:

• They overcome the need for multiple sets of specialized or expensive equipment.
• They allow students to concentrate on biological principles, rather than techniques.
• They enable students to perform sophisticated experiments which otherwise require high levels of physical or technical skill.
• The use of live creatures is avoided.

Conversely, there are quite sound counter-claims in favour of traditional labs. These relate primarily to the value of "real life, hands-on" experience, leading to the development of appropriate professional skills, and familiarity with particular tools of the trade for life scientists, e.g. measuring and recording instruments. From our own experience at Glasgow over two and a half years, we can confirm that the last of the listed claims above is very important for students, and that all are important for teachers, departments and planning units under current funding arrangements.

THE GLASGOW STUDY

Two simulation packages were integrated into a third year practical course on Animal Physiology in the University of Glasgow's Institute of Biomedical Life Sciences. This course ran over 5 weeks, providing fourteen 3-hour practical labs with associated lectures to 66 students studying for degrees in Zoology or Aquatic Bioscience. The full course content provides one (compulsory) third of a degree exam paper. The laboratory exercises aim to complement the lectures by giving practical experience of scientific principles covered, to illustrate the techniques and procedures involved in practical aspects of physiology, to give hands-on experience of investigative experimental work and to provide real data for handling, analysis and interpretation.

In February 1994 and January 1995, with two cohorts of students, the implementation of the simulation software within the laboratory classes was evaluated in a collaborative exercise between teachers, students and the evaluation team. In 1994, not only was the courseware being used for the first time, but the course itself was new, being a combination of two previously separate modules: Comparative Physiology and Neuroscience. Furthermore, the evaluation tools were still being developed at that time, with the

Fig. 1. Screen from the Nernst and Goldman tutorial package. (The screen explains that if anions and cations migrate at exactly the same rate their diffusion has no electrical consequences, i.e. no potential difference is generated; if they migrate at different rates, a separation of charge results and a potential difference, called the diffusion potential, is generated between the recording electrodes on the two sides of the bath.)

aim of causing minimum intrusion into the class work, while providing appropriate feedback about the integration of the simulation material within the course [1, 2].

The simulation software

Two packages were used, both of which conform to standard Microsoft Windows user interface guidelines. One, Nernst and Goldman, was produced in-house under TLTP funding [3]. It consists of a series of interactive exercises, with graphical illustrations and simulations, introducing students to the basic principles of electrochemistry: ion mobilities, diffusion and membrane potentials, and the Nernst and Goldman equations. Simple experimental sections reinforce the main points, and provide the students with control over parameters such as the salt used, the concentration gradient across the chamber and the temperature (Fig. 1).

The second package, Neurosim for Windows, was written by Dr W. J. Heitler at the University of St Andrews [4] and is published by Biosoft Limited. It provides advanced simulations of neurobiological processes (membrane and action potentials, threshold and refractory period, voltage and patch clamping, the effect of neurotoxins) and takes the form of interactive experiments in which the students are free to change any starting condition and observe outcomes on a graphical display (Fig. 2). A tutorial mode allows the teacher to set specific parameters for different experiments which are invoked when the student progresses from one exercise to the next. The laboratory class required students to work through the Hodgkin-Huxley model of the nerve action potential, and to examine the effect of changing stimulus conditions and ionic relationships and of applying different neurotoxins.

The classroom situation

The students were organized into nine groups of six or eight, pairing up existing work groups. The laboratory classes were set up in "round robin" circuits of seven stations, six of which were traditional "wet" experiments and one of which was a simulation. Two such circuits were run successively over the period of the course, with Nernst and Goldman in the first and Neurosim in the second. Being a more basic teaching resource than Neurosim, Nernst and Goldman provided the students with an introduction to working in the simulation mode. At the simulation lab station, three PC computers were available and students were encouraged to work in pairs. In practice

Fig. 2. Experimental scope window from the Neurosim package.

many chose to stay in their normal work groups of three or four, while a small number preferred to work alone. All the lab stations were installed in one large undergraduate teaching laboratory room, each with a tutor or demonstrator in attendance. Lectures on related concepts and biological principles were provided during the hour following or preceding each day's 3-hour practical session. At the end of each round robin circuit, a meeting of the whole class was held in order to review and discuss the experimental results and to reiterate the underlying principles. Immediately following the complete course, the students took a class test requiring brief responses to sixty questions on lecture and lab content.
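The first of these packages is built around the Nernst and Goldman equations. As a concrete illustration of the quantities it lets students manipulate (ionic concentrations, permeabilities, temperature), the following is a minimal sketch; it is our own illustrative implementation using textbook squid-axon-style concentrations, not code or values taken from the package itself.

```python
import math

R = 8.314    # gas constant, J/(mol K)
F = 96485.0  # Faraday constant, C/mol

def nernst(z, c_out, c_in, T=293.15):
    """Nernst equilibrium potential (volts) for an ion of valence z."""
    return (R * T) / (z * F) * math.log(c_out / c_in)

def goldman(p_k, p_na, p_cl, k_out, k_in, na_out, na_in, cl_out, cl_in,
            T=293.15):
    """Goldman-Hodgkin-Katz membrane potential (volts).

    Chloride enters with its concentrations swapped because its valence
    is -1.
    """
    num = p_k * k_out + p_na * na_out + p_cl * cl_in
    den = p_k * k_in + p_na * na_in + p_cl * cl_out
    return (R * T) / F * math.log(num / den)

# Squid-axon-like concentrations in mM; permeabilities relative to P_K.
e_k = nernst(+1, c_out=20.0, c_in=400.0)
v_m = goldman(1.0, 0.04, 0.45,
              k_out=20.0, k_in=400.0,
              na_out=440.0, na_in=50.0,
              cl_out=560.0, cl_in=52.0)
print(f"E_K = {e_k * 1000:.1f} mV, Vm = {v_m * 1000:.1f} mV")
```

Shrinking the concentration gradient moves the Nernst potential toward zero, while changing the temperature rescales the slope factor RT/F; this is the kind of parameter exploration the package's experimental sections support.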

Evaluation procedures

Within the laboratory class itself we limited our intrusion upon the students' time and workspace to two short paper instruments and the observation of a working group through each monitored lab by a trained evaluator, with the students' prior agreement.

One paper instrument was the confidence log, a checklist of specific learning objectives for each lab, compiled by the lecturer in charge. These covered different types of learning goal (knowledge of terms, grasp of underlying principles, interpretation of outcomes) as well as the procedural and practical skills that are a major target of practical laboratory teaching [5]. Students ranked their current feeling of confidence about being able to meet each objective along five positions from "no confidence whatsoever" to "very confident". These were completed at the beginning of class and again before leaving the lab at the end of the session.

The other instrument was a one page post-lab questionnaire which was given to students with the second confidence log after the class. It asked the students about their task experiences within that day's session, both in quantitative style ("circle the appropriate answer ...") and with open ended questions and probes.

Away from the classroom, and with the cooperation of the students and the staff, more information was obtained in the following ways:

(1) Following the final class meeting students completed a post course questionnaire which

sought information on the students' attitudes and experiences at that point as well as raising some issues which had been of particular concern to staff.
(2) During the period of the course itself illustrative feedback was provided by informal interviews with students and with teaching staff.
(3) A delayed confidence log was administered to students three months later, shortly before the revision period for the degree examinations.
(4) Class test results and relevant degree exam grades were also available for study.

During the 1994 exercise, which had initially focused on student groups when they were working at the simulation lab station, the teachers for the traditional labs agreed that their exercises should also be monitored to provide contrastive information. They found the feedback useful and interesting so far as their own lab teaching was concerned and were enthusiastic about this being repeated more formally in 1995. Within each round, therefore, two "wet" lab stations were monitored in exactly the same way as the simulation lab. An attempt was made to select labs that differed between "closed" task exercises, with set procedures for achieving task goals, and "open" or problem-solving tasks. For example, one of the wet labs, Insect Interneurones, investigated the way in which a single interneurone in the central nervous system of the locust codes visual information. Although the procedures the students had to follow for setting up the experiment were necessarily prescribed, they then had to devise their own methods of producing controlled visual stimuli. Another wet lab, Ionic, examined changes in the concentrations of two major ions (sodium and potassium) in the blood and the muscle of a crab during adaptation to differing salinities. This lab is "recipe led", with students taken through precise steps toward obtaining results and drawing conclusions.

The rest of this paper summarizes the broad results of the evaluation exercise, from the perspective of the overall aims of the whole course and the list of claims made for simulation and for traditional labs. Space permits only a limited discussion of our findings; fuller information can be obtained by contacting the first author of this paper.

RESULTS AND DISCUSSION

Observations

Watching the students engaged in their tasks across the laboratory room, a subjective impression was that the simulation station was very much "one of the labs" rather than specifically "a computer assisted learning exercise". One of the other physical labs provided a computer running a data-logging system (MacLab) for students to record physiological responses (the rate of heartbeat and ventilation of a crab). In much the same way, the simulation lab students were using their machines as tools for study. Naturally the physical aspects of tasks varied strongly: at the wet lab stations the students would work in their original subgroups, taking different roles as the tasks demanded and, where possible, taking turns. At the simulation stations, though still working in groups, there was obviously much less walking about, and less division of subtasks. Nevertheless the students would share between setting a question, operating the mouse and noting the results, and they would switch roles from time to time. There was no discernible difference between the simulations and the wet labs in task-centred dialogue and social interaction between students, and between teachers and students.

The post-lab questionnaire

Responses to the questionnaires completed by students after all monitored labs generally support this impression of similarity whilst illuminating certain differences. One common outcome was from the question "did you need to seek much help during the lab?" This received an affirmative response from over 80% of students regardless of whether they had worked through simulated or physical exercises. The type of help needed did differ with the type of lab: e.g. 72% of calls made during the Insect Interneurones lab were for help with preparation and dissection, whilst with Neurosim 70% concerned scientific principles. All the labs were stated to be "manageable" so far as pace and workload were concerned, and most of the students got through all their scheduled exercises.

Fig. 3. Percentage of student time reported as spent on different lab activities. (Two panels, "Learning to use instruments/equipment/computer?" and "Learning and understanding subject material?", each plotting reported time percentages for the Ionic, Insect Interneurones and Neurosim labs.)

One question asked the students what they spent most of their time doing during the lab, providing them with a set of possible responses against which to put a "rough percentage" where appropriate. Figure 3 illustrates the response pattern for this question for three of the labs. Ionic was the "recipe led" lab, Insect Interneurones required students to work out their own way of providing suitable visual stimuli to trigger the response from the perceptual system of a locust, and Neurosim was the simulation lab using the Hodgkin-Huxley model. Emphasis on learning and understanding the subject material is clearly stronger for most of the students when they were working through the simulation lab. Very little time is reported as being spent on learning how to use the computer, and the pattern for the other simulation lab, Nernst and Goldman, is very similar. Predictably, for the wet labs, more time was spent learning to use instruments and equipment; this was greater for the Interneurone lab than for the recipe-led Ionic lab. The information provided by the students backs up overall impressions gained from our observations of group exercises at individual stations. Both wet and simulation labs, on this evidence of time spent on specific activities, target different learning aims of the course.

Taking the same labs, students providing "likes" and "dislikes" about their morning's activities liked the guidance and straightforward direction of the Ionic lab, but not the use of live animal tissue. This last was repeated more firmly and more often for the Insect Interneurones lab; in fact the course tutors have decided not to use live locusts for teaching undergraduate physiology in future. What students liked here was being able to design their own test methods. With the two simulation labs, students liked getting results clearly and quickly and found the computer easy to use for this; on the other hand there seems to have been rather too much information sometimes, which could cause confusion. As one student wrote, "you can't ask a computer to rephrase what it is saying". Sixty-seven per cent of the students said that they would use the simulation packages again, most stating that this would be for revision purposes. However, we were not able to carry out the necessary observations, or set up logs, to determine whether such intended use actually took place.

Another question in the post session questionnaire asked students what they had learnt from that practical lab. Responses varied predictably across labs with, for example, "How to" statements relating to the use of equipment and measures from the Ionic lab, dissection skills and statements of fact about locusts' visual systems for the Insect Interneurones lab, and a mixture of statements from Neurosim, the majority relating to "cause and effect". There was also some content association between these lists and responses on time spent in task activity, as well as the "help" calls:

Ionic:

• How to use photometry equipment.


• Practising solution make-up and operation of flame photometers, and ion regulation in crabs.
• How to use instruments, write graphs, something about crab ionic regulation.

Insect Interneurones:
• A lot about the visual response of neurones to different types of stimuli.
• How to make and use suction electrodes, dissection and manipulation skills.
• How to devise quick experiments to understand the visual neurones and their stimulation.

Neurosim:
• Voltages and conductance of ion channels.
• Effect of drugs on action potential and channels.
• How the action potential is generated.
• Not sure exactly ...

Confidence logs

Our interest was in patterns of increase and decrease in confidence over time, and whether this might be explained by specific objectives, or even the kinds of learning they may depend upon. Using data only from those students who had indicated their confidence position at each of the three logging times, we studied the outcomes for each lab. Statistical analyses were undertaken using the Friedman Analysis of Variance by Ranks, followed by Wilcoxon signed ranks tests. We found that there were systematic differences in response pattern which might relate to common factors across the labs, although presently our interpretation of evidence is speculative. The indications are that confidence over objectives that depend upon "rote" memory, e.g. knowledge of terms or abstract formulae, increases strongly between pre- and post-session logs but falls back sharply after 3 months, often to near the baseline level. Confidence over objectives relying on a grasp of underlying scientific principles and concepts tends to hold more closely to the level of post-session logging, with no significant drop. Confidence about procedures and practical skills shows different shift patterns for different objectives. Examples are given here in Figs 4 and 5, which show mean rankings given by students for the Insect Interneurones and the Neurosim labs.

A general difference between the two charts is that the baseline confidence seems to be consistently higher for the simulation lab, which might reflect differences in students' reaction when confronted with a computer screen, keyboard and mouse rather than an array of totally unfamiliar equipment and a jar containing live locusts. The baseline confidence levels for the other simulation lab, Nernst and Goldman, were also generally higher than those for the two monitored wet labs in that circuit. All the students had gone through a basic computer skills course (word processing, spreadsheets and data analysis) on entering third year, as well as having had some experience of computer supported learning material during their previous year's classes.

To illustrate some of the usefulness of confidence logs from the teacher's perspective, we will describe patterns of increase and decrease for some objectives, from the two labs providing the data for Figs 4 and 5. Relevant statistical tables are given in Appendix A. For objectives relating to information which should already have been held by the students before they began work on this course, e.g. Objective 1 for both Insect Interneurones and Neurosim, there was no significant difference in rankings between baseline, post-session and delayed logs in either case. For objectives relating to procedural knowledge, e.g. Objective 4 for both Insect Interneurones and Neurosim, there was a large increase from a low start point between baseline and post-session, but no significant drop at the delayed log. If confidence is a good indicator of learning in this study (and this still needs to be established; see below), the teachers might feel that in both cases their students' lab work has targeted the particular learning goal. Looking at the wet lab, Insect Interneurones, Objective 8 ranks very low, increases significantly, then drops. The decrease between the post-session and the delayed logs is greater than would have been predicted by chance, although the mean ranking is still significantly higher than at the start. This is one of the "underlying scientific principles" that the lab should have been reinforcing.
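The analysis pipeline described above, a Friedman test across the three logging times followed by pairwise Wilcoxon signed-ranks comparisons, can be sketched with scipy. The ratings below are invented five-point confidence scores for ten hypothetical students, not the study's data, and only the pre-vs-post comparison is shown.

```python
from scipy.stats import friedmanchisquare, wilcoxon

# Invented five-point confidence ratings from ten students at the three
# logging times (pre-lab, post-lab, three months delayed).
pre = [2, 1, 2, 3, 1, 2, 2, 1, 3, 2]
post = [4, 3, 5, 4, 3, 4, 5, 4, 4, 3]
delayed = [3, 3, 4, 4, 2, 3, 4, 3, 4, 3]

# Friedman two-way analysis of variance by ranks across the three times.
chi2, p = friedmanchisquare(pre, post, delayed)
print(f"Friedman chi-square = {chi2:.2f} (d.f. 2), p = {p:.5f}")

# Where the Friedman test is significant, follow up with pairwise Wilcoxon
# signed-ranks tests (pre vs post shown; the other pairs work the same way).
w, pw = wilcoxon(pre, post)
print(f"pre vs post: W = {w}, p = {pw:.5f}")
```

With consistent pre-to-post gains like these, the Friedman statistic is large and the follow-up Wilcoxon statistic is near zero, the same shape of result the appendix tables report.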

Fig. 4. Insect Interneurone lab: means of student confidence rankings (pre-lab, post-lab, +3 months) on a five-point scale from "no confidence whatsoever" to "very confident". The eleven learning objectives ("Please indicate, by ticking in the relevant box, how confident you feel that you can explain, or demonstrate:") were:
1. The difference between extracellular and intracellular recordings of action potentials
2. The principle of operation of a differential pre-amplifier
3. How to expose the pro-thoracic interganglionic connectives of the locust
4. How to attach a suction electrode to a nerve
5. How to identify the activity of a single axon in a multi-unit recording
6. The principle of operation of an insect compound eye
7. How to present the eye with controlled visual stimuli
8. The response characteristics of the DCMD visual interneurone
9. How to determine the visual field of a visual interneurone
10. The concept of a 'looming' stimulus
11. The concept of habituation of a neuronal response

Generally these, when grasped, might hopefully be retained longer than "rote learned" items. Did the procedures and design issues of this lab obscure the target of the exercise as time went by? Or did the objective somehow "read differently" when at some distance in time and space from the experimental setting? To the teachers involved, the patterns of difference made sense in terms of the learning objectives set, or made them think again about the objectives themselves and their own teaching strategies. This was felt to be a good thing, though presumably there could be too much of it!

It is, however, clearly important to try to establish whether there is any relationship between self-confidence ratings and performance on listed learning objectives. Realizing that no such direct comparison would be possible from the data available to us, we looked more broadly for associations between confidence and general performance [6]. The levels of confidence given by students on each of the day's learning objectives following each lab could not provide a suitable value against which to measure performance in the class test, as these had been gathered at different times for different student groups over a 4 week period. We took each student's mean self-rating at time three, 3 months after the course, when the confidence logs for all the labs were presented together preceding lectures, and found that this correlated with their percentage marks in the degree exam: r_s = 0.327, P = 0.01.
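The association just reported is a Spearman rank correlation; for readers who want to reproduce this kind of analysis, a sketch using scipy follows. The ratings and marks are invented for illustration only and have no connection to the study's data.

```python
from scipy.stats import spearmanr

# Invented data: mean time-three confidence self-ratings for ten students
# paired with their degree exam percentage marks.
mean_confidence = [3.2, 4.1, 2.8, 3.9, 3.5, 2.5, 4.4, 3.0, 3.7, 2.9]
exam_percentage = [58, 71, 52, 64, 66, 49, 75, 55, 69, 60]

# Spearman's r_s correlates the two rankings rather than the raw values,
# which suits ordinal confidence-log data.
rho, p = spearmanr(mean_confidence, exam_percentage)
print(f"r_s = {rho:.3f}, p = {p:.4f}")
```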

Post-course questionnaire

Information relating directly to the "traditional vs simulation lab" question came from responses to the final, post-course, questionnaire. These addressed most claims made for and against simulation software in practical labs, as well as illustrating how these students stood in that

Fig. 5. Neurosim lab: means of student confidence rankings (pre-lab, post-lab, +3 months) on a five-point scale from "no confidence whatsoever" to "very confident". The ten learning objectives ("Please indicate, by ticking in the relevant box, how confident you feel that you can explain, or demonstrate:") were:
1. The threshold of the action potential
2. The refractory period after an action potential
3. The changes in Na and K conductances during an action potential
4. Patch clamp recording methods
5. The concept of voltage-gated ion channels
6. The probabilistic behaviour of channel gates
7. The relationship between microscopic and macroscopic currents
8. The activation and inactivation of channels
9. The effect of TTX on the action potential
10. The purpose of voltage clamp experiments

controversy. A summary of responses to all questions is given on an example of the questionnaire form as Appendix B; here we will just cover this one issue. Ninety per cent of the students replied "yes" when asked whether the inclusion of simulations in this kind of course was useful. The following examples come from responses to our "Why?" probe:

• Quickly and easily allows you to set up experiments and get results.
• It explains things clearly, don't get bad results as in other practicals.
• Allows you to experiment with difficult techniques easily and obtain good results from which you can draw conclusions.
• Because it introduces us to how things will be in the future and more importantly it cuts down on animal experimentation.

When asked how they would feel if simulations replaced all labs, the students responded:

• Cheated.
• Disappointed. You should still get some hands on experience.
• The labs would be too boring.
• It wouldn't be good because there would be no real way to connect what you are learning to the living tissue of an animal.
• Hands on experience in setting up experiments and running them is essential.
• I think they should replace the labs which give distress to the animals involved.

Assessment outcomes

So far as evidence relating students' work on the practical course to their assessment grades is concerned, some information was obtained from their scores on the class test given immediately


Figure A shows the changes in membrane voltage that occur during the action potential in an axon, when stimulated as shown.

Pt. 1: Draw on Fig. A the time courses of the changes in membrane conductance for Na+ and K+ that underlie this action potential. Figures B and C show the voltage and conductance changes that occur in the presence of two different neurotoxins.

Pt. 2: What effect does each toxin have on the ionic conductances?

Pt. 3: Identify the two toxins.

Fig. 6. LIII Degree exam practical question from Module 2c (Physiology).
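The conductance time courses that Part 1 asks students to draw come from the Hodgkin-Huxley formalism that Neurosim implements. As an illustration only, here is a minimal forward-Euler integration with standard textbook parameters; this is not Neurosim's code, and the `hh_run` function with its crude toxin-scaling factors (`tox_na` for a TTX-like Na+ block, `tox_k` for a TEA-like K+ block) is our own invention.

```python
import math

def hh_run(i_stim=10.0, tox_na=1.0, tox_k=1.0, t_max=20.0, dt=0.01):
    """Integrate the Hodgkin-Huxley equations with forward Euler.

    tox_na / tox_k scale the Na+ / K+ conductances (0.0 = fully blocked).
    Returns the peak membrane voltage (mV) reached during the run.
    """
    g_na, g_k, g_l = 120.0 * tox_na, 36.0 * tox_k, 0.3  # mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.4                 # reversal potentials, mV
    c_m = 1.0                                           # capacitance, uF/cm^2
    v, m, h, n = -65.0, 0.05, 0.6, 0.32                 # resting state
    peak = v
    for _ in range(int(t_max / dt)):
        # Voltage-dependent rate constants (ms^-1) for the gating variables.
        a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
        b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
        b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
        # Ionic currents (uA/cm^2), then one Euler step for each variable.
        i_na = g_na * m ** 3 * h * (v - e_na)
        i_k = g_k * n ** 4 * (v - e_k)
        i_l = g_l * (v - e_l)
        v += dt * (i_stim - i_na - i_k - i_l) / c_m
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        peak = max(peak, v)
    return peak

print(f"control peak: {hh_run():.1f} mV")
print(f"Na+ block ('TTX'): {hh_run(tox_na=0.0):.1f} mV")
```

With the Na+ conductance blocked, the membrane merely depolarizes passively and no action potential is fired, which is the qualitative contrast the exam question probes.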

following the course. This test normally addresses teaching content from both the lectures and the labs, and tends to focus more on lecture material. However in 1995, for each of the monitored labs, teachers set one question which they felt depended on lab rather than lecture work. Generally the students did fairly well on these practical questions, with an overall tendency to score at a similar level as for questions related to lecture material, with one or two extreme exceptions.

No contrastive study is possible for outcomes from the year's Degree exam, as the relevant paper, sat by both Zoologists and Aquatic Bioscientists, provided just one question from the Physiology practical course. However, this question related directly to material covered in the Neurosim lab, and for that reason is relevant. The paper presented students with images of the "experimental results window" from the Neurosim simulation, at a point during an experimental run. There were three parts to the question, which sought different types of response (Fig. 6). Part 1 involves visual recall, but also requires deduction (in this case from the delay between, and relative height and shape of, the curves), demonstrating a grasp of underlying principles. Part 2 asks for a short written answer demonstrating interpretation of consequences. Part 3 asks for straight recall, remembering the names of the toxins that have the effects shown in Part 2.

A "facility value" (FV) was derived for each part question by dividing the average score by the potential score. FV for part one was 0.66, for part two 0.60 and for part three it was 0.30; in other words, the students as a whole did only half as well on that part of the exam question which targeted "rote learning" as on those which required a grasp of underlying principles and some interpretation of outcomes.
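The facility value calculation is simply the mean achieved score divided by the maximum attainable score; a one-function sketch follows. The per-student scores are invented, chosen only so the result matches the part-three figure of 0.30 quoted above.

```python
def facility_value(scores, max_score):
    """Facility value in [0, 1]: mean score divided by the potential score."""
    return sum(scores) / (len(scores) * max_score)

# Invented per-student scores, each out of a possible 3 marks.
part3_scores = [1, 0, 2, 1, 0, 1, 2, 0, 1, 1]
print(f"FV = {facility_value(part3_scores, 3):.2f}")
```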
This predictable (and acceptable) outcome echoes patterns found over time in the self-rating confidence logs on learning objectives with regard to types of learning targeted. No correlation was found, however, between students' self-ratings of confidence at time three on the relevant learning objective (No. 3, see Fig. 5) and performance on Part 1 of the exam question, which targets that objective.

CONCLUSIONS AND PERSPECTIVE

Broadly, we feel that putting the simulation exercises into programmed laboratory time, rather than simply making them accessible on an open access computer cluster, worked well. The packages used, Nernst and Goldman and Neurosim, lend themselves to this type of use since they present the students with experiments to perform, rather than being "look and learn" tutorials. Also, appropriate levels of teacher support were provided, and the three-way interaction between the students, teacher and computer package seemed effective [7].

It is not easy to assess the learning gain from a practical skills course without requiring the students to demonstrate those skills, but this is not how this course was assessed. Neither is it


generally possible to study learning outcomes which relate directly back to a particular piece of teaching, as most final (and therefore crucial) course assessments try to target learning from a full course with comparatively few assessed activities, very often involving writing rather than doing. The degree exam question described above was unusual in this respect, as it was based on material presented only in the Neurosim lab. In order to overcome these limitations (and also to provide a better estimate of confidence logs as evaluation tools), it is planned for 1996 to compile a class test in which the questions relate more precisely to the lab objectives used in the confidence logs.

As far as the whole course is concerned, the assessment outcomes were acceptable to the teachers, and compatible with assessments made on this cohort of students from other coursework. Year-on-year comparisons are not yet possible (since the course appeared in its present form only in 1995), and may never be so if proposed curriculum changes in the Institute are implemented. However, the information gathered will be useful in planning these changes, especially with regard to putting greater emphasis on CAL exercises. This is an accelerating trend in U.K. universities, since there are increasing pressures to reduce the practical content of courses in the Biological Sciences. This is due to the vast increase in student numbers (800 taking Level 1 Biology courses at the University of Glasgow in 1995, compared with 400 in 1992), insufficient space to accommodate them, and a lack of funds to purchase the necessary equipment and consumables. We can envisage an increase of computer mediated learning in undergraduate courses, involving a progression from tutorials through more flexible interactive packages to simulation and modelling software. These should not only be of a high standard, but should be adaptable to meet the needs of different teachers or courses.
Following our experience over the past two years with this course, we also feel that there has to be some significant element of practical, physical work where students use the instruments and procedures current in the subject field under study. Whilst there was evidence from the simulation labs that students felt able to concentrate on biological principles rather than techniques, the actual aims for this course required that they did both, hence the lecture-with-lab format. In their different ways both the simulations and the wet labs made the lectures "come alive" for the students. The academic staff feel that both modes are necessary, since although many life science students will take up careers which do not require them to use the physical skills gained from practical work, a proportion will do so, often as a direct result of being exposed to the practicalities of biological research during their lab work. Of course, familiarity with the process of computer simulation is itself relevant in this context, since it is a technique being used increasingly in the research field.

Acknowledgements: Work reported in this paper was carried out within the University of Glasgow's Teaching with Independent Learning Technologies (TILT) project, an institutional initiative funded by the Teaching and Learning Technology Programme (TLTP) set up by the U.K. Higher Education Funding Councils and directed by Gordon Doughty

of the University's Robert Clark Centre for Technological Education. We also acknowledge the willing support given by teaching staff, lab technicians and students within the University's Institute of Biomedical Life Sciences (IBLS), without which the study could not have taken place.

REFERENCES

1. Barr N. S. F., McAteer E. and Neil D. M. CBL in the laboratory. Life Sci. Educ. Comput. 5, 11-13 (1995).
2. Draper S., Brown M., Henderson F. and McAteer E. Integrative evaluation: an emerging role for classroom studies of CAL. Computers Educ. (Ms 166).
3. Barr N. S. F., McAteer E. and Neil D. M. Integrating computer-based learning with conventional laboratory teaching. TLTP Newslett. 2, 16-17 (1994).
4. Heitler W. J. Neurosim: programs for teaching cellular neurobiology. Paper presented to a meeting of the Society for Experimental Biology at St Andrews University (1995).
5. Bloom B. S. Taxonomy of Educational Objectives. McKay, New York (1956).
6. Stanton H. Independent study: a matter of confidence. In Developing Student Autonomy in Learning (Edited by Boud D.), pp. 81-93. Kogan Page, London (1988).
7. Shaw R. (Ed.), TILT Report 3: Using Learning Technologies: Interim Conclusions from the TILT Project. University of Glasgow (1995).

APPENDIX A

Table of Statistical Results relevant to Confidence Log comparisons described in this paper
NB: the critical level for significance taken is P < 0.01


Neurosim Lab

Objective 1: The threshold of the action potential
  Friedman test: χr² = 10.11 (d.f. 2), P = NS

Objective 4: Patch clamp recording methods
  Friedman test: χr² = 34.78 (d.f. 2), P < 0.001
  Comparisons:
    pre vs post:     W = 38, Z = 5.37, P < 0.001
    post vs delayed: W = 7,  Z = 2.04, P = NS
    pre vs delayed:  W = 24, Z = 3.70, P < 0.001

Insect Interneurone Lab

Objective 1: The difference between extracellular and intracellular recordings of action potentials
  Friedman test: χr² = 4.10 (d.f. 2), P = NS

Objective 4: How to attach a suction electrode to a nerve
  Friedman test: χr² = 35.27 (d.f. 2), P < 0.001
  Comparisons:
    pre vs post:     W = 29, Z = 4.70, P < 0.001
    post vs delayed: W = 9,  Z = 0.02, P = NS
    pre vs delayed:  W = 25, Z = 4.37, P < 0.001

Objective 8: The response characteristics of the DCMD visual interneurone
  Friedman test: χr² = 35.95 (d.f. 2), P < 0.001
  Comparisons:
    pre vs post:     W = 29, Z = 4.70, P < 0.001
    post vs delayed: W = 20, Z = 3.70, P < 0.001
    pre vs delayed:  W = 18, Z = 3.68, P < 0.001
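The omnibus statistic reported above can be computed directly from its rank-sum definition. The sketch below is illustrative only: the ratings are invented, not the study's data, and ties are handled with average ranks and no tie correction.

```python
# Hypothetical confidence-log ratings (1-5) for one lab objective,
# for twelve students, taken before the lab, just after it, and at a
# delayed follow-up. These values are invented for illustration.
pre     = [2, 1, 3, 2, 2, 1, 3, 2, 1, 2, 3, 1]
post    = [4, 3, 5, 4, 3, 3, 5, 4, 2, 4, 4, 3]
delayed = [4, 2, 4, 4, 3, 2, 5, 3, 2, 4, 4, 2]

def friedman_chi_r2(*conditions):
    """Friedman chi_r^2 over k related samples, ranking within each
    subject (average ranks for ties, no tie correction)."""
    k = len(conditions)
    n = len(conditions[0])
    rank_sums = [0.0] * k
    for i in range(n):
        row = [cond[i] for cond in conditions]
        for j, value in enumerate(row):
            # Average rank: 1 + (number smaller) + half the other ties.
            smaller = sum(v < value for v in row)
            ties = sum(v == value for v in row) - 1
            rank_sums[j] += 1 + smaller + ties / 2.0
    s = sum(r * r for r in rank_sums)
    return 12.0 * s / (n * k * (k + 1)) - 3.0 * n * (k + 1)

chi_r2 = friedman_chi_r2(pre, post, delayed)
print(f"chi_r^2 = {chi_r2:.2f} (d.f. 2)")
```

Comparing the resulting χr² against the χ² distribution with k − 1 degrees of freedom (critical value 9.21 at P = 0.01 for d.f. 2) gives the significance decision used in the table; significant omnibus results would then be followed up with the pairwise Wilcoxon signed-rank comparisons shown above.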

APPENDIX B

Animal Physiology Course 1995

Group:          Matric No.:

Thanks to all for helping with the evaluation of this course. We're asking you to take a last few minutes to think back over the past four weeks and answer some general, open questions about the course. Please read the questions below, then answer each of them from your own point of view. If you wish to comment further on anything, please use the other side of this form.

Did you enjoy any particular lab(s) more than others? If so, please name them, and give any reasons you can think of:

Bomb 13, Haemo, Resp, Hypox, Ion 4, Sim 3 . . . "understandable" . . . "straightforward" . . . "worked"

What (if anything) in the course as a whole increased your interest in the biological sciences--or decreased it, and why?
increased: equipment, techniques 6
decreased: live animal experiments 7; physiology 6; chemistry/physiol 3

After graduation, do you plan to work as a professional biologist? yes 25
If so, in what area? Conservation 11, Research 7, Marine 6, Teach 4, Ecology 3, Vet 2
If not, what sort of career will you aim for?

What aspects of the course interested you most, and why? What aspects interested you least, and why? see overleaf

At all the lab stations you worked in groups and often had to share the work between you. Can you make some comments about group size, task interaction, and task distribution, from your own experience?

Most of the lab tasks were of the "follow the recipe" type, with procedures listed as steps to go through. Did this suit you? Why? Would you have preferred more problem-based practicals, where you had to find your own ways of meeting task goals? Why?

Would a demonstration of each practical have been as useful as doing it yourselves? Why?

Were there too many, too few or just the right number of labs?

Is the inclusion of simulations in this kind of course useful? Why? How would you feel if simulations replaced all labs? Why?

see overleaf
Almost all responses positive; 3 seems best number
Share/collaborate/discuss/complete work

35 yes--know where you are, confusing enough already, couldn't get on otherwise, need to do it to understand it; 22 no--confusion

11 yes--need experience, testing etc.

Split between NO--need hands on etc., and YES but with conditions--live labs, need to do it too
too many 18, just right 16, too few 0
33 yes--lets you try different situations and variations, less chance of messing up
Definitely not all--"cheated", "bored", "disappointed", no way you can connect what you are doing to living tissue
almost all "notes"; almost all "useful"

Did you write up your labs in report format, or as notes? Will you find this material useful for revision?

This year attendance at labs and lectures was lower than usual--can you make any comments on this?
start time, frequency, live animals

Finally, how confident do you feel that you will do well in the course exam tomorrow? Please circle:
No confidence whatsoever / Little confidence / Some confidence / Confident / Very confident
(responses: 1, 15, 18, 1)


What aspects of the course interested you most:

I thought the synapse and some lectures were good. Conservation. Animal behaviour. Computer, Na+/K+ gates--saw instant response. No experimental errors, which frequently occurred with the other experiments. Those involving the animals, where direct results could be obtained from them (without excessive cruelty). Labs. The circulatory systems. Easy to relate to. Labs. Circulation. Neurobiology. Cardiac. Renal. Lab work--hands-on experience. Very little. If anything, circulatory and nervous systems were OK but have covered them in adequate detail in 1st and 2nd year. Reactions of animals to real situations--e.g. predator attack causing neurone response. The resp. pigments and circulation system I found interesting, because it is such an important part. Actually learning about systems, i.e. circulatory, respiratory, excretory, and how they worked. Respiratory physiology. Working to protect biodiversity. The respiratory and circulatory systems, particularly with regard to the respiratory pigments. Most of it, when explained and demonstrated well. I enjoyed the whole physiology module very much, but felt it was rushed. (Too many labs in one week.) Neurotoxins. I knew a lot of animals had such toxins for defence etc. but had no idea how they worked. It was interesting to find out. Membrane and action potentials. Good demonstration, my own interest in neurology. Some of the lectures were quite interesting. N.M. junctions. Good lectures. The behaviour side--don't detest physiology so much as most. Neuroscience. Interested in toxins and their effect. Learning about different muscle fibres and nerve interactions. All practicals using cells and tissues which came from animals. Usually you are given a bowl of substance x and told this is crab blood or an earthworm giant fibre--actually seeing where the samples came from was much better. World issues--biodiversity. Blood pigments and muscles were quite interesting. Nerves were interesting.
What aspects of the course interested you least:

I didn't find the animal experiments interesting. Live animal experiments--unnecessary. The whole course was not very interesting because it was fairly complicated and not explained well in parts. The computer simulations: my general dislike of computers and also the tendency to rush through without making notes. Lectures. Physiology--too complicated, far too much detail and not enough time. Neurones etc. Find it boring. Nerves. The depth of the lectures was often far too shallow and did not challenge the student enough. The whole course was very disjointed and not inter-related enough. Chemistry and chemicals. Neurophysiology. The lectures--were taught in such detail you forget the basics and what is actually important. Respiration. The live animal experiments were totally unnecessary. Most people agreed. The action potentials of nervous systems with Na channels etc. I found very confusing, although the simulation did help clarify. Neurophysiology: complicated interactions are difficult to understand. Neurobiology--find it uninteresting. Dissections--difficult. I enjoyed the whole course but some lectures were a bit intense and rushed. Membranes--was very boring, having been told about them many times before. Lose interest only when not explained well--especially crab labs, badly explained with very little help. I did not enjoy the waiting involved in a lot of the experiments. O2 transfer at gills, respiration etc. I've covered most of it before to varying degrees so I didn't find it interesting doing it again. Respiration rates and energy budgets. Simply not interesting to me. Live dissections. Thought they were at times unnecessary. I didn't particularly enjoy the labs much. Osmoregulation. Arthropods. Computer simulations, I didn't find them very interactive or very memorable. Computer simulations, I didn't feel I learnt anything at all from them. Very confusing. Different students use them at different rates which is awkward. Nerve muscles and impulses.
Can't seem to grasp the idea of electricity in things like that.