EFFECTIVENESS OF HANDS-ON INSTRUCTION OF ELECTRONIC RESOURCES
BARBARA BREN, University of Wisconsin-Whitewater
BETH HILLEMANN, Macalester College
VICTORIA TOPP, University of Wisconsin-Whitewater

ABSTRACT: This research focuses on the effectiveness of using hands-on instruction in a multiple-workstation instruction laboratory. Four sections of second-semester Freshman English classes, creating a sample size of 81 students, were divided according to instruction method and were tested on retention after the session. Students receiving guided hands-on instruction retained more information than those experiencing a lecture/demonstration method. There were no differences related to gender. The results of this study reinforce current views on the effectiveness of active learning and lend support for the continued development of multiple-workstation classrooms.
INTRODUCTION

Everyone involved in library instruction is aware of the rapid shift to electronic resources in the past ten years. We have seen an ever-increasing variety of databases, interfaces, and new technologies. Teaching students to use very different databases effectively and to choose the best resources for their needs is a continuing challenge. Library instructors have
developed several strategies for teaching students the skills they need, including online help screens, Web guides, printed guides, and both formal and informal instruction. A popular model for formal instruction in academic libraries is a lecture/demonstration session geared to the needs of a particular class. Typically, library instructors have one computer for demonstration purposes in these instruction sessions. However, in the past few years many academic libraries have invested in multiple-workstation classrooms, or library instructors have gained access to computer labs on campus for hands-on instruction. The assumption has been that hands-on instruction on computers is the most effective method for teaching students the concepts and skills they need to conduct research.

This study compared university students' retention of concepts and skills taught by two different methods of instruction: lecture/demonstration and guided hands-on instruction. A side area of interest for the study was whether gender influenced the retention of concepts and skills in a computerized environment.

The University of Wisconsin-Whitewater has a student population of approximately 10,000 and offers 43 undergraduate and 13 graduate degree programs. Library staff and administrators have made a commitment to electronic resources, offering a variety of CD-ROM and online databases. The University Library has had a multiple-workstation laboratory for library instruction since 1989. The instruction lab consists of 20 PCs, with an additional instructor's workstation at the front of the room and projection equipment for demonstration purposes. A typical instruction session at UW-Whitewater uses the method of guided hands-on exercises to teach students techniques and concepts needed for effective research using electronic databases. The instructor leads the class through a preselected set of searches which illustrate a variety of searching techniques, screen displays, and choices in database selection.

We decided to test the validity of the assumption that hands-on instruction is more effective than lecture/demonstration instruction for teaching research skills and concepts in an electronic environment. A comparison of student retention of the information presented in an instruction session would indicate a difference in effectiveness of instruction. Therefore, a post-test was administered to student groups receiving instruction through different methods to test for retention of the skills and concepts taught in the session. The test instrument was pretested during the fall semester of 1994, and the full experiment was conducted in the spring of 1995.
LITERATURE REVIEW

There is general acceptance in the education literature that active learning techniques are very effective means for teaching skills and concepts. This assumption has been carried into library instruction as well (Drueke, 1992).1 For example, Beth L. Mark and Trudi E. Jacobson (1995) emphasized the usefulness of active learning techniques for reducing library anxiety in an electronic environment. Paula N. Warnken and Victoria L. Young (1991) looked to the training literature for ways to make library instruction more active.2 In an electronic environment, active learning with multiple workstations available for students is considered ideal (Bell, 1990).3 Caroline Rowe's (1994) survey of academic libraries in Florida indicated that all were using equipment to project computer screens for demonstration purposes, while acknowledging that a hands-on experience would be more effective. At that point, only two of the eleven Florida libraries had access to a multiple-workstation classroom.4 More recently, several articles (Glogoff, 1995; Vasi and Laguardia, 1994; Wiggins and Howard, 1993) have appeared describing the process of creating such a classroom,5 and the listserv for library instruction, BI-L, has had many posts on gaining funding for and equipping multiple-workstation classrooms.6

The majority of instruction sessions, however, are still lecture/demonstration, and the literature focuses on evaluating this technique. Various methods of evaluation have been used, including pre- and post-tests and exercises to be completed by students after the session. After reviewing examples of questions from several studies, we decided to concentrate on questions which would address understanding of content rather than location of material or attitudes toward the library (Franklin and Toifel, 1994; Fry and Kaplowitz, 1988; Kaplowitz, 1986; Tiefel, 1989).7

While several studies have shown improvement in student performance after instruction sessions, fewer studies have compared results across methods of instruction, and very few have made such comparisons in a lab environment. Rebecca Bostian and Anne Robbins (1990) compared a range of instruction techniques for CD-ROM indexes, from point-of-use instruction with vendor-supplied materials to lecture/demonstration.8 Dorothy Davis (1993) also evaluated different methods of instruction, including lecture/demonstration without computer projection, computer-assisted instruction, and video instruction.9 In a study conducted by Linda G. Ackerson and Virginia E. Young (1994), the control group experienced a fifty-minute lecture with overheads emphasizing the research process. The experimental group had an additional three sessions in a computer laboratory with demonstrations broadcast from the instructor's workstation. Although hands-on instruction did not take place during the class period, students in the experimental group were encouraged to use the laboratory at a later time.10 Joan M. Cherry and Marshall Clinton (1991) compared lecture/demonstration with an interactive computer tutorial.11 One study using an instruction laboratory at Brigham Young University did compare results from instruction on the NOTIS catalog and SilverPlatter's interface for ERIC, using either a hands-on or a lecture/demonstration method. The results of the comparison showed little difference in learning between the groups for the catalog, but hands-on instruction did prove slightly more effective in the ERIC instruction (Wiggins, 1994).12 Our literature search pointed out the variety of methods employed to help users learn to use library resources; we were, however, surprised to find so little which included instruction in a multiple-workstation classroom using hands-on activities.

In addition to our question about the effectiveness of instruction methods, we wondered if gender would affect retention of instruction in an electronic environment. Concern has been expressed in the popular media, as well as in education and library literature, that women are less comfortable than men in computer environments. Studies conducted on this question, however, have yielded mixed results. For example, Katherine Canada and Frank Brusca (1991) found support for gender differences in computer use, as did Janice T. McDonald (1993).13 However, in studies on computer anxiety, other authors have found no correlation between gender and anxiety related to computer use.14
RESEARCH QUESTIONS

Is there a difference in the effectiveness of library instruction in electronic resources based on the method of instruction? Is there a relationship between gender and the retention of instruction on electronic resources?
METHODOLOGY
First year English courses with research paper requirements usually come to the University Library for instruction in the use of the online catalog and electronic periodical indexes. We chose to test two pairs of first year, second semester English class sections taught by two different professors. Approximately 25 students were enrolled in each class section; 86 students participated in this study, 55% of them female. However, five of the students were absent from the library instruction sessions and their scores were ignored in the analysis of the post-test.

As we expected, the students in each section were similar in demographic characteristics. Responses to questions on the test instrument revealed that 41 percent of the students were 19-20 years of age, with only four students (less than 5%) older than 20. None of the students was older than 24. Most of the students (87%) were in their first year of college, although two were sophomores and three were juniors. We also expected that the class sections would be similar in their prior exposure to electronic research tools. Since this was a second semester course, it was not surprising that 54 students (62.8%) had used the University's online catalog prior to the instruction session. Fewer students had used online catalogs in their high school libraries (41.9%) or in a public library (58.1%). Previous experience with electronic periodical indexes was less common, with 30.2% having used one at the University Library, 30.2% in a high school library, and 29.1% in a public library.

The effects of teaching method were isolated by controlling other factors as much as possible, including class time for instruction, course assignments, and library instructor. Library instruction sessions were monitored to ensure that the content of questions on the test instrument was covered in each session. One section in each pair received library instruction through lecture/demonstration, while students in the other section were guided by the instructor as they entered searches themselves. A total of 44 students (51%) received instruction through lecture/demonstration, while 37 students (43%) received guided hands-on instruction.

A 32-item multiple-choice test was administered to each section during the next class meeting following library instruction. The decision not to give a pre-test was based on concerns about sensitization: giving a pre-test so close to the actual instruction might focus the students' attention on particular aspects of the presentation, thus altering the results in the post-test. The questions were developed specifically to reflect the information delivered in the instruction session. Twenty of the test questions covered appropriate uses of the electronic resources used in the instruction. Questions in this section included when to use the online catalog vs. an online periodical index, specific commands for searching and displaying information, and interpretation of screen displays, e.g., periodical holdings information. Four possible answers were listed for each test question, with only one correct answer. The remaining twelve questions on the test requested demographic information and an indication of previous experience with online catalogs and periodical indexes. A copy of the test instrument is attached as an appendix.

The answer sheets were computer-scored and frequency counts were run for all content questions. Because of the relatively small sample, t-tests were used to compare the mean scores of hands-on vs.
lecture/demonstration groups to determine whether any difference in their scores was statistically significant.
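To illustrate this analysis step, the following is a minimal sketch (not the authors' code) of an independent-samples t-test on two groups of post-test scores, written in Python with SciPy; the score lists shown are hypothetical placeholders, not data from the study.

# Minimal sketch: compare mean post-test scores of two instruction groups
# with an independent-samples t-test. Score lists are hypothetical placeholders.
from scipy import stats

hands_on_scores = [14, 12, 11, 15, 13, 12]      # correct answers out of 20 (placeholder)
lecture_demo_scores = [10, 12, 9, 13, 11, 10]   # correct answers out of 20 (placeholder)

t_stat, p_value = stats.ttest_ind(hands_on_scores, lecture_demo_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")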
RESULTS
Students in the sections that received guided hands-on instruction performed better on the post-test than students who were taught using lecture/demonstration. The average score of the 37 hands-on students was 12.7 correct responses out of twenty (SD = 2.7), while the 44 lecture/demonstration students averaged 11.1 correct responses out of twenty (SD = 2.8). Although the difference in the raw scores seems small, a t-test showed that this difference was statistically significant (t = 2.62, F = 8.0, p = .01).

We looked at the scores with a view to gender differences and found no distinction in retention of information based on gender. Moreover, there was no correlation between previous use of library electronic resources and gender. For all students, previous experience with online catalogs or electronic periodical indexes was not a significant factor in their retention scores, except for those students who had used the University's online catalog or periodical indexes.
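As a rough arithmetic check (an illustration, not part of the original study), the reported t-statistic can be recomputed from the summary statistics above, assuming a pooled-variance two-sample t-test; the result is approximately t = 2.6 and p = .01, consistent with the reported values.

# Recompute the t-statistic from the reported summary statistics,
# assuming a pooled-variance (equal-variance) two-sample t-test.
from scipy import stats

t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=12.7, std1=2.7, nobs1=37,   # guided hands-on group
    mean2=11.1, std2=2.8, nobs2=44,   # lecture/demonstration group
    equal_var=True,
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")   # approximately t = 2.60, p = .011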
CONCLUSION

This study examined whether students who were given the opportunity to input pre-selected searches themselves were more likely to retain the information presented in a library instruction session than those who were taught through lecture/demonstration. Students were tested on their knowledge of appropriate database selection, use of system commands, and interpretation of screen displays. The results indicate that instruction using a guided hands-on method did increase students' retention of information. There was no significant difference in retention of information between men and women in the class sections.

These results, from a small sample of students, suggest the value of hands-on instruction. Repeating the experiment with a larger sample, over several consecutive years, would demonstrate the validity of our results. Future researchers may be able to expand on these results by evaluating students' knowledge both before and after hands-on instruction. If the pre-test and post-test are given sufficiently far apart, sensitization concerns are diminished. We evaluated the retention of very basic information from library instruction sessions; further research is needed to determine the effectiveness of guided hands-on instruction for more sophisticated concepts. Additionally, we examined only one form of hands-on instruction; different methods of hands-on instruction should be evaluated to determine which are most effective in a multiple-workstation library instruction laboratory.
APPENDIX

English 102 Library Instruction Assessment (1/95)

These questions reflect the content of the library instruction your class received. If you did not attend this session, add "absent" to the NAME box on the answer sheet. Answer the questions anyway. Do NOT put your name on the answer sheet; your responses will be anonymous and ungraded. Please select ONLY ONE answer per question. Use PENCIL to mark answers on the answer sheet; do NOT mark on the question pages.
1. To find books written by Sigmund Freud that are available in Andersen Library, which one of the following should you use?
   a. Encyclopedia Britannica
   b. UWW (online catalog)
   c. card catalog
   d. UMIB (Periodical Abstracts)

2. To find magazine articles that are about Freud's theories, which one of the following should you use?
   a. Encyclopedia Britannica
   b. UWW (online catalog)
   c. card catalog
   d. UMIB (Periodical Abstracts)

3. You are looking for a book that has the title The Old Man and the Sea. Which of the following searches should you enter?
   a. the old man and the sea
   b. t the old man and the sea
   c. t man and sea
   d. t old man and the sea

4. If you do not know an exact, official subject heading for your research topic, what type of search should you use?
   a. t (title)
   b. s (subject)
   c. k (keyword)
   d. a (author)

5. The only punctuation marks you must use in an online catalog search are two dashes between subject headings and their subdivisions, e.g., s culture--study and teaching
   a. TRUE
   b. FALSE

***FOR QUESTIONS 6-7***
You are looking at the following display for a title in the online catalog:

   UW-Whitewater Catalog          BOOK - 1 of 9 Entries Found
   ----------------------------------------------------------------
   TITLE:       The Amish and the state / edited by Donald B. Kraybill.
   PUBLISHED:   Baltimore : Johns Hopkins University Press, c1993.
   SUBJECTS:    Amish--History--20th Century.
   -------------------------- page 1 of 2 --------------------------
   STArt over     HELp        OTHer options     <F6> FORward page
   LONg view      INDex       NEXt record
   NEXT COMMAND:
6. How would you display the book's location and call number?
   a. Enter OTH
   b. Enter NEX (or N)
   c. just press the key

8. You want to switch from the online catalog to one of the online periodical indexes, like UMIB (Periodical Abstracts). What command should you give, especially if you want to re-run a previous keyword search?
   a. BAC (or B)
   b. IND (or I)
   c. REV (or R)
   d. CHO

9. What command should you enter to see a list of the last ten searches you have done on the online catalog?
   a. VIE (or V)
   b. IND (or I)
   c. REV (or R)
   d. CHO

10. If you are having trouble getting useful titles on the online catalog, where would you be most likely to seek assistance? PICK THE ONE PLACE YOU WOULD MOST LIKELY GO FOR HELP.
   a. Printed how-to guides
   b. Reference librarians
   c. Online "help" or "explain" screens
   d. Friends or classmates
***FOR QUESTIONS 11-15***
You are looking at this record for a periodical article at a computer workstation:

   Periodical Abstracts          Record -- 1 of 9 Entries Found
   ----------------------------------------------------------------
   AUTHOR(S):         Stearns, Scott
   TITLE:             An uneasy peace.
   SOURCE:            Africa Report Jan-Feb 1994, v39, n1, p32(4)
   SPECIAL FEATURES:  (photograph)
   ABSTRACT:          A ceasefire has ended three years of fighting
                      between government troops and predominantly...
   -------------------------- page 1 of 2 --------------------------
   Held by library--type HOL for holdings information.
   STArt over     HOLdings      MARk         FORward page
   HELp           LONg view     CHOose       NEXt page
   OTHer options  <F5> PREvious page         INDex
   NEXT COMMAND:
11. Does Andersen Library have the periodical that contains this article?
   a. YES
   b. NO
   c. This display does not provide you with that information.

12. Suppose Andersen Library DOES have the periodical. Which of the following actions would tell you where in the library this article may be found?
   a. Go to the online catalog and enter the search: t uneasy peace
   b. On the displayed screen, enter: HOL (or HO)
   c. On the displayed screen, enter: FOR (or F)
   d. On the displayed screen, enter: LON (or L)

13. The title of the periodical that contains the article is:
   a. An Uneasy Peace
   b. Africa Report
   c. Periodical Abstracts
   d. This display does not give you that information.

14. One of the following searches would cause the computer to find this record. Which one?
   a. a Scott Stearns
   b. k ceasefire and peace
   c. t an uneasy peace
   d. s africa report

15. This display gives detail for the first title from a list of nine articles retrieved for your search. Which of the following commands should you enter to see the list of all nine articles?
   a. IND (or I)
   b. CHO
   c. VIE (or V)
   d. FOR (or F)
***FOR QUESTION 16***
You are looking at the following computer display about the availability of a periodical in Andersen Library:

   Title: Newsweek
   ----------------------- Location 1 -----------------------
   LOCATION:        Current Periodicals
   CALL NUMBER:     Shelved by Title (A-Z)
   CURRENT ISSUES:  v.124:no.3 (1994:July 18)
                    v.124:no.2 (1994:July 11)
                    v.124:no.1 (1994:July 04)
   ----------------------- Location 2 -----------------------
   LOCATION:        Bound Periodicals
   CALL NUMBER:     Shelved by Title (A-Z)
   STATUS:          Check Shelf
   LIBRARY HAS:     v.7 (1936)-v.28 (1946)
   ----------------------- Location 3 -----------------------
   LOCATION:        Microform Room, Fiche
   CALL NUMBER:     Shelved by Title (A-Z)
   STATUS:          Check Shelf
   LIBRARY HAS:     v.29 (1947)-v.122 (1993)

   STArt over     VIEW record    MARk         HELp
   LONg record    CHOose         BACk page    OTHer options
   INDex          <F6> NEXt record            FORward page
   NEXT COMMAND:
16. You want to read an article in volume 121, November 1991, of Newsweek. Is this issue available in Andersen Library?
   a. NO
   b. YES, in Current Periodicals
   c. YES, in Bound Periodicals
   d. YES, in the Periodicals Microform Room (on fiche)

17. You want to read the following article for your paper. How can you learn if the periodical is available in Andersen Library?
   Swenson, A. (1992). The food crisis. Agriculture Research, 32(2), 675-684.
   a. Do an author search in the online catalog (UWW)
   b. Do a title search for "agriculture research" in the online catalog (UWW)
   c. Do a title search for "food crisis" in the online catalog (UWW)
   d. Do a title search for "agriculture research" in Periodical Abstracts (UMIB)

18. If the periodical that contains an article you need is NOT available in Andersen Library, how can you get a copy of it?
   a. Fill out a "SEARCH" request at the Circulation Desk.
   b. Fill out a "HOLD" request at the Circulation Desk.
   c. Fill out an InterLibrary Loan request form at either the Circulation Desk or the Reference Desk.
   d. Search for the article on Periodical Abstracts.
19. Which of the following searches should retrieve the highest number of titles?
   a. t animal
   b. k animals
   c. s animal
   d. k animal

20. Which of the following searches should retrieve the smallest number of titles?
   a. k animal or animals or wildlife
   b. k animal or wildlife
   c. k animal and endangered
   d. k (animal or animals or wildlife) and endangered

21. You have used the following search: k child and play? Which of the below titles would NOT be retrieved for you?
   a. How to Play with a Child.
   b. Playing Childish Games to WIN!
   c. Safe Playthings for Your Child.
   d. The Child in the Jungle, a Play in Three Acts.
And now for a little information about you:

22. Gender
   a. FEMALE
   b. MALE

23. Age
   a. 17-18
   b. 19-20
   c. 21-22
   d. 23-24
   e. 25+

24. Class rank
   a. FROSH
   b. SOPH
   c. JR
   d. SR
   e. UG SPECIAL

***FOR QUESTIONS 25-31***
   a. YES
   b. NO

25. Are you a transfer student?
26. Did you use the online catalog in this library BEFORE the library instruction session?
27. Did you use an online (not card) catalog in your high school library?
28. Have you used an online (not card) catalog in a public library?
29. Did you use online or CD-ROM periodical indexes in this library BEFORE the library instruction?
30. Did you use online or CD-ROM periodical indexes in your high school library?
31. Have you used online or CD-ROM periodical indexes in a public library?

32. How would you rate your level of "comfort" with computers? Use an a-e scale, where:
   a = I'M VERY UNCOMFORTABLE / I HAVE LOTS OF TROUBLE USING THEM
   b = I'M SOMEWHAT UNCOMFORTABLE
   d = I'M SOMEWHAT COMFORTABLE
   e = I'M VERY COMFORTABLE / I CAN USUALLY FIGURE OUT HOW THEY WORK
NOTES AND REFERENCES

1. Jeanette Drueke, "Active Learning in the University Library Instruction Classroom," Research Strategies 10 (Spring 1992): 77-83.

2. Beth L. Mark and Trudi E. Jacobson, "Teaching Anxious Students Skills for the Electronic Library," College Teaching 43 (Winter 1995): 28-31; Paula N. Warnken and Victoria L. Young, "Application of Training Principles and Techniques for Successful Library Instruction," RSR Reference Services Review 19 (Winter 1991): 91-96.

3. Stephen J. Bell, "Using the Live Demo in Online Instruction," Online 14 (May 1990): 38-42; Susan Carpenter, "Sidebar: Hands-on Instruction," Library Hi-Tech 12 (1994): 59.

4. Caroline Rowe, "Modern Library Instruction: Levels, Media, Trends, and Problems," Research Strategies 12 (Spring 1994): 4-17.

5. Stuart Glogoff, "Library Instruction in the Electronic Library: The University of Arizona's Electronic Library Education Centers," RSR Reference Services Review 23 (Summer 1995): 7-12; John Vasi and Cheryl Laguardia, "Creating a Library Electronic Classroom," Online 18 (September/October 1994): 75-84; Marvin E. Wiggins and Donald H. Howard, "Developing Support Facilities for BYU's Bibliographic Instruction Program," Journal of Academic Librarianship 19 (July 1993): 144-148.

6. [email protected]

7. Joan Kaplowitz, "A Pre- and Post-Test Evaluation of the English 3 Library Instruction Program at UCLA," Research Strategies 4 (Winter 1986): 11-17; Virginia Tiefel, "Evaluating a Library User Education Program: A Decade of Experience," College and Research Libraries 50 (March 1989): 249-259; Godfrey Franklin and Ronald C. Toifel, "The Effects of BI on Library Knowledge and Skills Among Education Students," Research Strategies 12 (Fall 1994): 224-237; Thomas K. Fry and Joan Kaplowitz, "The English 3 Library Instruction Program at UCLA: A Follow-up Study," Research Strategies 6 (1988): 100-108.

8. Rebecca Bostian and Anne Robbins, "Effective Instruction for Searching CD-ROM Indexes," Laserdisk Professional 3 (January 1990): 14-17.

9. Dorothy Davis, "A Comparison of Bibliographic Instruction Methods on CD-ROM Databases," Research Strategies 11 (Summer 1993): 156-163.

10. Linda G. Ackerson and Virginia E. Young, "Evaluating the Impact of Library Instruction Methods on the Quality of Student Research," Research Strategies 12 (Summer 1994): 132-144.

11. Joan M. Cherry and Marshall Clinton, "An Experimental Investigation of Two Types of Instruction for OPAC Users," The Canadian Journal of Information Science/Revue Canadienne des Sciences de l'Information 16 (December 1991): 2-22.

12. Marvin Wiggins, "Hands-on Instruction in an Electronic Classroom" (ERIC document ED 369391, 1994).

13. Katherine Canada and Frank Brusca, "The Technological Gender Gap: Evidence and Recommendations for Educators and Computer-Based Instruction Designers," Educational Technology Research and Development 39 (1991): 43-51; Janice T. McDonald, "Gender Differences in Computer Use," Ohio Media Spectrum 45 (Fall 1993): 19-20, 26-27.

14. Jennifer L. Dyck and Janan Al-Awar Smither, "Age Differences in Computer Anxiety: The Role of Computer Experience, Gender and Education," Journal of Educational Computing Research 10 (1994): 239-248; John Todman and Elizabeth Monaghan, "Qualitative Differences in Computer Anxiety and Students' Use of Computers: A Path Model," Computers in Human Behavior 10 (1994): 529-539.