Computers Educ. Vol. 24, No. 2, pp. 83-88, 1995
Pergamon 0360-1315(95)00022-4
Copyright © 1995 Elsevier Science Ltd. Printed in Great Britain. All rights reserved. 0360-1315/95 $9.50 + 0.00

EVALUATION OF A PHYSICS MULTIMEDIA RESOURCE

JUSTIN WATKINS,¹ JOHN DAVIES,² GAYLE CALVERLEY³ and TONY CARTWRIGHT³
¹Department of Mechanical Engineering, University of Surrey, Guildford GU2 5XH, England [Fax: 01483 306039; e-mail: [email protected]], ²School of Physics, Queensland University of Technology, Brisbane 4001, Australia and ³Department of Physics, Keele University, Keele ST5 5BG, England

(Received 31 October 1994; accepted 24 February 1995)

Abstract--The objectives of the evaluation described in this paper were to assess the effectiveness of a new computer-based physics education project, and to provide guidelines for its improvement. The results of the evaluation highlighted a number of points which are intended to benefit developers of computer-based learning material in general, and those involved in the TLTP programme in particular. It was shown that many students were keen to use computers as part of their education, but that the personal interaction between lecturers and students should not be replaced.

INTRODUCTION

A formative evaluation, the first phase of a more extensive study, was carried out for the STOMP (Software Teaching of Modular Physics) project [1], part of the Teaching and Learning Technology Programme (TLTP). The project was funded for three years from 1992 by the Higher Education Funding Councils. The software and academic teaching material is being developed at nine U.K. universities. A further ten universities were involved in the Phase One evaluation. The evaluation was conducted by the Centre for Engineering Educational Technology at the University of Surrey.

Whilst the results, comments and observations made in this paper are generally applicable to those constructing and using computer-based learning materials, they are primarily for the benefit of the TLTP and are intended to offer valuable insight to other TLTP projects as they develop their own computer-based materials.

AIMS OF STOMP

The STOMP project provides a total learning environment, using Microcosm [2] running under MS-Windows, to benefit undergraduate physics courses. The aims of the STOMP project are as follows:

• To integrate interactive models of laboratory experiments with academic text scripts, data books, reference material and graphical information.
• To provide all the resources required to research a particular subject without having to leave the computer, including tools to enable the user to take notes and to handle and manipulate experimental data.
• To allow the replacement of some or all of a lecture course, enabling the majority of students to operate at their own pace and leaving the academic staff free to concentrate on those individuals or small groups of students requiring personal assistance. Areas of specialist interest previously omitted owing to time constraints could also be included in the course.
AIMS OF EVALUATION

The evaluation served two main purposes: to guide the development and improvement of the educational resource for its future full implementation, and to indicate the educational suitability of the material. The evaluation is divided into two phases:


Phase One: a formative evaluation in which the STOMP material was used by the students as an adjunct to the lecture-based course. Its primary aim was to provide the development consortium with feedback for incorporation into future material. In this context, the Phase One evaluation took place early in the programme so as to guide further development. Although the academic content of the material was limited, there was sufficient to allow both students and tutors to gain an insight into the ultimate content of the material and its intended use.

Phase Two: a summative evaluation to be conducted at the end of the TLTP programme. Its aim is to report on the educational merits of the STOMP material as a teaching tool. In this phase, the material will be used as a replacement for the traditional lecture-based course.

This paper deals with the outcomes of Phase One.

EVALUATION METHODOLOGY

The main vehicle of the evaluation was a series of three questionnaires presented to students before, during and after their exposure to the STOMP material. The evaluation team also spent considerable time observing the students while they worked, and conducted structured interviews with individual students during and after their exposure to the STOMP material.

Existing evaluation tools such as MEDA [3] provide a good framework on which to base the construction of questionnaires. It is important to note, however, that MEDA provides for comparison between different computer-based resources, and as such it was necessary to extend this to cover the comparison between new educational developments and traditional teaching practices. Alessi and Trollip [4] highlight the need for field-testing of educational software, which is a major component of the STOMP evaluation. They also include as part of evaluation such issues as presentation and user control. In the case of STOMP, however, much of this work was carried out by the STOMP development consortium.
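As an illustration of the before/during/after design described above, the hypothetical sketch below joins each student's responses from the three questionnaire waves on an anonymous code, so that attitudes can be compared over time without identifying individuals. All names, fields and data here are invented for illustration; the study's actual instruments were paper-based.

```python
# Hypothetical sketch: each student is known only by an anonymous code,
# so responses from the three waves can be matched without revealing
# identity. Field names and data are illustrative, not from the study.

before = {"S01": {"fears_computers": False}, "S02": {"fears_computers": True}}
during = {"S01": {"found_easy": True}}
after_ = {"S01": {"wants_in_course": True}, "S02": {"wants_in_course": False}}

def match_responses(*waves):
    """Join per-wave answers on the anonymous student code."""
    matched = {}
    for wave in waves:
        for code, answers in wave.items():
            # a student who missed a wave still appears, with partial data
            matched.setdefault(code, {}).update(answers)
    return matched

profiles = match_responses(before, during, after_)
print(profiles["S01"])
```

Note that students who missed a wave (as many did under examination pressure) still appear in the matched set with partial data, rather than being dropped.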
QUESTIONNAIRES

The questionnaires were paper-based in order that feedback might be gained in particular from students who were not familiar with computer technology. The first questionnaire concerned students' attitudes towards, and preconceptions of, computers and their educational merit, and was completed before the students had seen a presentation of the capabilities of STOMP. This questionnaire asked similar questions of both the traditional education system and the student's expectations of computer-based methods, to provide a comparison of the two. The second questionnaire was designed mainly to elicit responses on individual sections of the academic content, to provide early feedback to the developers of the material. Technical aspects of interest to the consortium that would affect the students' learning were also covered, including the speed of response of the hardware of the computer systems. The third questionnaire concerned students' attitudes after having used the STOMP material, including whether they felt they had benefited from using it, and whether they wanted to see it as part of their course. A code was given to each student so that their questionnaires could be matched up over the term, while preserving anonymity.

A separate questionnaire was prepared for academic teaching staff, which included much of the content of the student questionnaires. Additional questions concerned implementation and expected usage of the STOMP material.

OBSERVATION

Throughout the evaluation, the evaluation team observed students working with the STOMP material, and discussed various aspects of the resource as they arose, on an individual basis. The


common difficulties with using the material were noted, as were a number of the students' verbal comments.

STRUCTURED INTERVIEWS

Structured interviews were carried out at three sites during the Phase One evaluation. Group interviews and discussions were also arranged.

IMPLEMENTATION

Phase One took place in the summer of 1994 at ten institutions in England and Scotland, involving over 170 students. In Phase One, the STOMP material was used as an adjunct to the academic course, allowing only a subjective analysis to be made. It was not considered appropriate, either to the project or to the participating academics, for the material to be used for teaching purposes at this stage. However, a positive response from students and staff to the material was needed before STOMP could be considered as a viable alternative to traditional methods of course presentation. Phase One was intended to provide an initial indication of whether or not the new technology was effective for its intended purpose, as well as offering a valuable testing ground for implementation issues.

At half of the sites involved in Phase One, the evaluation was performed over a 2-day period. The other sites had longer exposure to the material, allowing students the opportunity to review the material in their own time if they so desired. The timing of the evaluation in the summer term meant that many students were unable to reply to the second and third questionnaires because of the pressures of impending examinations.

STUDENT BACKGROUND

Approximately 70% of the students possessed A-level qualifications. The remainder were evenly divided between BTEC, Highers or other qualifications. About 12% of the respondents admitted to being afraid of computers or finding them hard to understand. Eleven of the respondents (6%) claimed never to have used a word-processor, and two had never used a computer for any purpose before.
Nine out of every ten students had used a computer to play games, and 72% claimed they had done some programming in the past.

RESULTS OF THE PHASE ONE EVALUATION

The evaluation of the STOMP project is one of the first major evaluation studies to be carried out under the TLTP programme. The results presented represent a distillation of the raw data collated through questionnaires, structured interviews and observations. They represent the initial findings and are relevant not just to the STOMP programme, but also to other TLTP projects and to those involved in computer-based learning in general.

ATTITUDES TOWARDS COMPUTER-BASED EDUCATION

The questionnaires indicated that the ability to work at one's own pace was the most important aspect afforded by computer-based learning. Specific comments made by 3% of the sample highlighted interactivity as the main advantage of computer-based education. This was further reinforced during observation. A similar proportion of students commented that access to large amounts of information from the desktop is a bonus of using computers in education. This reflects on the efficiency and management of student learning.

From the students' comments regarding expectations of computer-based learning, 18% indicated that they felt educational software was difficult or frustrating to use, and 13% felt that using a computer was a sterile and unsocial way of teaching. This has coloured their attitudes to using a


computer-based resource for learning, and additionally indicates that the way in which students interact with software has a marked effect upon their general attitude to computers. Fewer than 5% of the student sample thought that computers could fruitfully be used to replace lectures completely, the remainder pointing to the social, human and interactive aspects as the main advantages of the traditional method. Half of the students recognized that, whilst some lecturers may have been poor at lecture presentation, the human contact of the traditional method offered an interactivity and a social warmth which the computer was unable to provide.

It was reported by both students and academic staff that it should be possible for computers to assist in the presentation of material to the majority of a class, allowing lecturers to devote more of their time to students requiring personal tuition. This corresponds with one of the intended uses of the STOMP material. It was also found that the words "replacing lectures" appear to give students the impression of a learning environment in which no academic staff are present at all.

Academic staff had some initial reservations about computer-based learning, but were keen to consider new educational technology if it were of good quality [5]. Unlike the students, however, the majority of staff involved in the evaluation were keen to use the STOMP software even before they had seen it.

ATTITUDES TO USING SOFTWARE IN A WINDOWS ENVIRONMENT

The graphical user interface used for presentation of the STOMP material was MS-Windows. This came under strong criticism for being difficult to use, and it was found that about half of the student sample had rarely or never used Windows before being introduced to STOMP. It has been shown, however, that pre-school children can approach expert levels of competence in controlling a windowing environment with little practice [6].

The usability of a GUI hinges primarily on the performance of the computer hardware. Experience at Queensland University of Technology, Brisbane [7] indicates that hardware faults and lack of access to computers hinder students in using computer-based learning resources more than the ease of use of the software does. At one site involved in the STOMP evaluation, heavy multi-user demand slowed access to the STOMP material drastically enough to cause a number of students to walk out of the evaluation.

STUDENT REQUIREMENTS OF COMPUTER-BASED LEARNING

Students were asked to record what they felt was the greatest strength of computer-based learning; this indicates what they require of good-quality computer-based learning material. The most commonly recorded strength of a computer-based educational resource was that it provides self-paced learning (30%). Ease of use (nine students), interactivity (six students) and a wealth of information (six students) were also considered to be strengths. When asked to comment verbally, students further indicated that accessibility of computers in general (not only for computer-based learning) was important, as was ease of use of the software.

One of the most commonly expected weaknesses of computer-based education was difficulty in using it (19% of the student sample). The current lack of availability of computers was seen as a major disadvantage. The unsocial nature of a sterile computer room, in which one has no human contact throughout a course, was mentioned by 12%. Most students recognized that using educational software would benefit their computer-literacy skills.

When asked what additional computer support they would like to have available, a number of students indicated that their current word-processing and data-manipulation software was either dated or insufficient, as were printing facilities. Experience at a number of sites has shown that students are willing to pay for good-quality laser-printed output. Computer-aided design facilities were also in popular demand.


[Figure: horizontal bar chart; x-axis, Percentage of Students, 0-100]

Fig. 1. Where students see computers being used most fruitfully in education.

Figure 1 shows the way in which students perceive computers being used in education. Students saw computers being used most fruitfully in administration (word-processing, data-handling, secretarial work, etc.) and in programming and design, uses which are already common in education.

STUDENT REACTIONS TO THE STOMP MATERIAL

The majority of students were observed to browse initially to see what material was available. Many looked for the visually exciting media first, such as video, then looked through all the pictures. However, when prompted in the second questionnaire to read through a unit script, all students gained a deeper insight into the content of the available material.

COMPUTER LITERACY

A number of comments written by students indicated a certain ignorance of the current scope of computer usage. Many may be attributed to the individual being uninformed of current developments, but some point towards worrying levels of misinformation in students' perceptions of computer technology. Of 170 students, only one was of the firm belief that computer technology had reached the stage where it could completely replace the lecturer. Upon further questioning, he felt that he needed at most one hour of tutor contact a week, and that all his lectures should be presented in full-motion video.

It was noted that a few students believed that anything presented by a computer can never be wrong, and that computers always produce both correct and precise results. While the same attitude is often taken with textbooks, it appears that some of the less computer-literate students assume the machine to be almost divinely infallible, in particular with reference to the results produced by an interactive simulation. It is essential that developers of educational software understand this, and make available reference to the inaccuracies inherent in the simulation.
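A minimal sketch of the recommendation above: a simulation routine that returns its modelling caveats alongside each numerical result, so that students are not left to assume the output is exact. The function, its names and the pendulum example are hypothetical illustrations, not part of the STOMP software.

```python
# Hypothetical example: a simulation that states its own inaccuracies.
import math

def pendulum_period(length_m, g=9.81):
    """Small-angle pendulum period T = 2*pi*sqrt(L/g), with limitations stated."""
    period = 2 * math.pi * math.sqrt(length_m / g)
    caveats = [
        "small-angle approximation: error grows for large swing amplitudes",
        "air resistance and pivot friction are neglected",
    ]
    return period, caveats

T, notes = pendulum_period(1.0)
print(f"T = {T:.3f} s")
for note in notes:
    print("caveat:", note)
```

Presenting the caveats as part of the result, rather than burying them in documentation, directly addresses the observed tendency to treat simulation output as infallible.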
It must be emphasized that the above examples were very much the exception. However, developers should be aware of these factors when creating quality educational software.

EVALUATION METHOD

The structure of the evaluation proved effective in collecting attitudes before and after the students had used the material. Questionnaires during exposure were found to be less fruitful than had originally been thought. It is recommended that, during exposure, effort be directed towards structured interviews and observation rather than questionnaires, as a more productive


means of gathering learner feedback. Proformas for the guidance of interviews are also recommended, in order to achieve consistency in interview style.

COMPUTER-BASED LEARNING MATERIAL--THE NEED FOR TRAINING

The need to train students to navigate through the content of computer-based learning material (such as STOMP) was highlighted before the evaluation began. Within STOMP, a training resource was implemented just prior to the release of the software. The majority of the training resource was screen-based, with two short paper documents written to assist students in learning Windows skills and in using the available navigation tools. An introductory timetabled training session was also offered.

It was found that most students ignored the paper-based resources and went directly to using the computer. Many encountered subsequent difficulties in performing certain actions which had been covered in the paper-based resource. Students were also shown how to access the on-line training material, but most preferred to ask questions.

OBSERVATIONS AND CONCLUSIONS

The findings of the evaluation indicated levels of Windows literacy amongst new physics undergraduates to be lower than initially expected by the evaluation team. Further work will be necessary to identify the true computer literacy of first-year undergraduates, taking into consideration that a student who has never used a windowing environment is not necessarily computer-illiterate.

On the basis of the Phase One evaluation, the following recommendations were made to the development team. While these recommendations are specific to the STOMP programme, they should apply generally to those developing computer-based learning materials.

• Ensure that methods of progressing through the material are unambiguously presented.
• Employ constant window positioning in order to assist the user in managing the windows interface.
• Ensure that the balance of media is biased to minimize text.
• Divide the text that does appear into smaller modules, with frequent links to activities and other media.
• Make the input and output parameters of simulated experiments clear, with adequate help provided.
• Ensure that navigational aids are both logical and intuitive.

Many of these recommendations have already been incorporated into later releases of the STOMP material. Their effects will be part of the analysis in Phase Two. Finally, the underlying message from the students was that, whilst computer-based learning can break the mould of the traditional lecture, the personal interaction between student and teacher remains a crucial factor to learning.

REFERENCES

1. Bacon R. A., Phys. Educ. 28, 97 (1993).
2. Hall W., IEEE Multimedia 1, No. 1, 60 (1994).
3. Machell J. and Saunders M., MEDA: an evaluation tool for training software. CSET, University of Lancaster (1988).
4. Alessi S. M. and Trollip S. R., Computer-based Instruction: Methods and Development, 2nd edition. Prentice-Hall, Englewood Cliffs, N.J. (1991).
5. Laurillard D., Swift B. and Darby J., The CTISS File 14, 54 (1992).
6. Crook C., Computers Educ. 19, 199 (1992).
7. Davies J. A., A pilot study into the effectiveness of a new teaching method to instruct computerised data-acquisition instrumentation. Proceedings of "OzCUPE" (Australian Conference on Computers in University Physics Education), 1, 79, University of Sydney (1993).