The Journal of Academic Librarianship 42 (2016) 655–663
Assessing Graduate Level Information Literacy Instruction With Critical Incident Questionnaires

Laura Saunders a,⁎, Jenny Severyn a, Shanti Freundlich b, Vivienne Piroli c, Jeremy Shaw-Munderback c
a Simmons College School of Library and Information Science, 300 The Fenway, Boston, MA 02115, United States
b Massachusetts College of Pharmacy Libraries, 179 Longwood Ave, Boston, MA 02115, United States
c Simmons College Beatley Library, 300 The Fenway, Boston, MA 02115, United States
Article history:
Received 13 July 2016
Accepted 10 August 2016
Available online 17 August 2016
Information literacy, defined as “a set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning”,1 is widely recognized as crucial to college students' learning and success both in school and ultimately as they develop life skills.2 Traditionally promoted by librarians, information literacy competencies have been endorsed by higher education and research organizations, including regional and discipline/professional accreditation organizations.3 Regional accreditation organizations in particular assert an expectation that institutions should address information literacy learning outcomes at all levels of education.4 However, the majority of information literacy instruction and assessment appears to be taking place at the undergraduate level, often within a general education curriculum. Research on instruction and assessment for information literacy at the graduate level is less common. It is unclear why there is less attention to information literacy learning outcomes at the graduate level in the LIS literature. It is possible that some instructors believe that graduate students come to the program
⁎ Corresponding author.
E-mail addresses: [email protected] (L. Saunders), [email protected] (J. Severyn), [email protected] (S. Freundlich), [email protected] (V. Piroli), [email protected] (J. Shaw-Munderback).

1 ACRL Board, “Framework for Information Literacy for Higher Education,” Association of College and Research Libraries, February 2, 2015, http://www.ala.org/acrl/standards/ilframework.
2 Alison J. Head, “Staying Smart: How Today's Graduates Continue to Learn Once They Complete College” (white paper, Seattle, WA: Project Information Literacy, February 5, 2016), http://projectinfolit.org/images/pdfs/2016_lifelonglearning_fullreport.pdf.
3 Laura Saunders, Information Literacy as a Student Learning Outcome: The Perspective of Institutional Accreditation (Santa Barbara, CA: Libraries Unlimited, 2011).
4 Saunders, Information Literacy as a Student Learning Outcome, 2011.
http://dx.doi.org/10.1016/j.acalib.2016.08.008 0099-1333/© 2016 Elsevier Inc. All rights reserved.
having learned the necessary competencies at the undergraduate level. But even if that were the case, graduate students are generally dealing with more specialized information, which requires more sophisticated skills to access, evaluate, and use; this in turn implies that they will need instruction and support to develop those more sophisticated skills.

In its Degree Qualifications Profile, the Lumina Foundation defines the different levels of outcomes that might be expected of students at the Associate, Bachelor's, and Master's levels.5 Using the phrase “use of information sources” rather than information literacy, the Lumina Foundation contends that “there is no learning without information, and students must learn how to find, organize and evaluate information in order to work with it and perhaps contribute to it. At each degree level, these tasks become more complicated — by language, by media, by ambiguity and contradictions.”6 The implication is that the competencies associated with information literacy are not a discrete set of skills, but represent ways of knowing and interacting with information that need to be built on in developmental and sequential ways, so that undergraduates would be expected to demonstrate a certain set of abilities “less sophisticated than the skills required of a student in a highly specialized graduate program.”7

In order to ensure that graduate students are achieving this level of information literacy, academic librarians should assess learning in graduate library instruction sessions and use the results to reflect on and improve that instruction. This study describes efforts to engage in such assessment of graduate-level information literacy instruction for social work and physical therapy students using a Critical Incident Questionnaire (CIQ), a brief self-reflection in which students report on where learning occurred and where they still have questions.
The results of this study will be of interest to academic librarians, as well as higher education faculty and administrators who are interested in how well library instruction helps institutions to achieve learning outcomes related to information literacy and critical thinking.
5 Degree Qualifications Profile (Champaign, IL: Lumina Foundation, 2011), http://degreeprofile.org/read-the-dqp/the-degree-qualifications-profile/intellectual-skills/.
6 Degree Qualifications Profile (Champaign, IL: Lumina Foundation, 2011), http://degreeprofile.org/read-the-dqp/the-degree-qualifications-profile/intellectual-skills/, 16.
7 Middle States Commission on Higher Education, Developing Research and Communication Skills: Guidelines for Information Literacy in the Curriculum (Philadelphia, PA: Middle States Commission, 2003), 10.
LITERATURE REVIEW

Faculty members and even librarians often seem to assume that graduate students enter programs already having attained the information literacy skills necessary for the research and analysis required of their programs, and do not necessarily engage in formal instruction for graduate students at the same rate as they do for undergraduates. Although evidence of collaboration between faculty and librarians for graduate-level instruction exists, according to one study the majority of librarian interaction with graduate students seems to take place outside of the classroom through research consultations or group meetings and workshops, with 42% of librarians spending more than one hour per week with graduate students.8 While a majority of librarians in the study offered classes on various topics, including literature reviews and the use of specific resources such as citation software, less than 1% reported engaging in course-integrated instruction. Information literacy instruction targeted to doctoral students is even less common. Margaret Bausman and Sarah Laleman Ward specifically note a dearth of literature related to Master's level social work students, and suggest that “social work has fallen behind in the integration of information literacy into its formal curriculum.”9 The research that does exist suggests that graduate students do not always demonstrate information literacy competencies.
A comparison of information literacy skills of entering first year students with new graduate students showed that while the graduate students generally scored better than their undergraduate counterparts, many still lacked basic information literacy competencies, including understanding plagiarism, being able to access full-text documents, and effectively narrowing searches.10 According to one survey, over one-third of graduate students had difficulty finding articles related to their research topics and formatting citations, while 30% lacked confidence in their ability to use a database. Many students were also unfamiliar with search strategies including Boolean operators, truncation, and subject or thesaurus searching.11 Some of the most common problems experienced by graduate students include choosing appropriate keywords and refining searches, as well as sorting through the sometimes overwhelming amount of information retrieved.12 Even at the doctoral level, students often have trouble searching subject databases, but they recognize that advanced search strategies can be useful and save them time.13

Amy Catalano's meta-analysis of the literature on graduate students' search behavior confirmed a number of trends.14 Graduate students often begin their searches by consulting with their instructors, and then turn to the web. Catalano notes that although these students recognize that web resources can be unreliable, they tend to rely on them for background information. They also report preferring library databases, but depend heavily on remote, full-text access. She also identified some disciplinary differences in search behaviors, with humanities students tending to “seek” more than search and relying more heavily on print resources, while science students generally prefer online sources. On the whole, graduate students tend to be confident in their information literacy abilities, although that confidence is not always demonstrated in their practice. Interestingly, Mahmood Khosrowjerdi and Mohammad Iranshahi found a statistically significant relationship between past experience in source usage and graduate students' judgment of source relevance, as well as the amount of effort they were willing to employ to search for information.15 Their findings suggest that instruction and integrated use of resources in the curriculum are important for the development of information literacy skills.

INFORMATION LITERACY ASSESSMENT

Kate Conway notes “an absence, at the postgraduate level, of information literacy skills assessment.”16 Much of the literature on information literacy at the graduate level focuses on information behavior rather than instruction or assessment.17 Further, the majority of assessment that does exist relies largely on tests focusing on basic skills,18 including standardized tests such as the Standardized Assessment of Information Literacy Skills (SAILS), iSkills, and the Council of Australian University Librarians Information Skills Survey (CAUL ISS).19 Indeed, ACRL's National Information Literacy survey identified surveys and tests as the most widely used assessment tools.20 Sonia Špiranec and Mihaela Banek Zorica critique the inadequacy of the majority of assessment approaches and call for a “reconceptualization of the current tool-oriented models which insist on the existence of a linear sequence of steps.”21 Megan Oakleaf warns against over-reliance on tests, which often oversimplify concepts, and describes them as “indirect assessments that fail to measure higher order thinking skills.”22 She draws on the assessment-for-learning theory promoted by Grant Wiggins,23 which holds that authentic assessments are open-ended and contextual and can lead to deeper learning. She further advocates for performance assessments that require students to demonstrate knowledge through activities and projects, as well as assessments that involve critical reflection on learning.

A few studies have implemented assessments other than surveys. Instructors at Queensland University in Australia used pre- and post-instruction questionnaires with both open- and closed-ended questions for students to self-assess their learning. In addition to rating their skills in several areas, students also provided qualitative comments about where they believed they needed to improve and how they planned to strengthen their skills.24

8 Andrea Baruzzi and Therese Calcagno, “Academic Librarians and Graduate Students: An Exploratory Study,” Portal: Libraries and the Academy 15, no. 3 (2015): 393–407, doi: 10.1353/pla.2015.0034.
9 Margaret Bausman and Sarah Laleman Ward, “Library Awareness and Use among Graduate Social Work Students: An Assessment and Action Research Project,” Behavioral & Social Sciences Librarian 34, no. 1 (2015): 19, doi: 10.1080/01639269.2015.1003498.
10 Kate Conway, “How Prepared are Students for Postgraduate Study? A Comparison of the Information Literacy Skills of Commencing Undergraduates and Postgraduate Information Studies Students at Curtin University,” Australian Academic & Research Libraries 42, no. 2 (2011): 121–135, http://ezproxy.simmons.edu:2048/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=lls&AN=72101036&site=ehost-live&scope=site.
11 Amy Jo Catalano, “Using ACRL Standards to Assess the Information Literacy of Graduate Students in an Education Program,” Evidence Based Library & Information Practice 5, no. 4 (2010): 21–38, http://ezproxy.simmons.edu:2048/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=lls&AN=74294588&site=ehost-live&scope=site.
12 Kristin Hoffman, Fred Antwi-Nsiah, Vivian Feng, and Meagan Stanley, “Library Research Skills: A Needs Assessment for Graduate Student Workshops,” Issues in Science & Technology Librarianship 53 (2008), doi: 10.5062/F48P5XFC.
13 Monica Vezzosi, “Doctoral Students' Information Behavior: An Exploratory Study at the University of Parma (Italy),” New Library World 110, no. 1/2 (2008): 66–80, doi: 10.1108/03074800910928595.
14 Amy Catalano, “Patterns of Graduate Students' Information Seeking Behavior: A Meta-Synthesis of the Literature,” The Journal of Documentation 69, no. 2 (2013): 243–274, doi: 10.1108/00220411311300066.
15 Mahmood Khosrowjerdi and Mohammad Iranshahi, “Prior Knowledge and Information-Seeking Behavior of PhD and MA Students,” Library and Information Science Research 33, no. 4 (2011): 331–335, doi: 10.1016/j.lisr.2010.04.008.
16 Conway, “How Prepared are Students,” 124.
17 Kaijsa J. Calkins, “Best of the Literature: Graduate Student Instruction,” Public Services Quarterly 3, no. 3/4 (2007): 221–226, doi: 10.1080/15228950802110767.
18 Conway, “How Prepared are Students,” 124.
19 Catherine Hodgens, Marguerite C. Sendall, and Lynn Evans, “Post-Graduate Health Promotion Students Assess their Information Literacy,” Reference Services Review 40, no. 3 (2012): 408–422, http://dx.doi.org/10.1108/00907321211254670.
20 “National Information Literacy Survey,” Association of College & Research Libraries, 2001, accessed September 7, 2015, http://www.virginia.edu/surveys/Projects/ACRL/2001/home.htm.
21 Sonia Špiranec and Mihaela Banek Zorica, “Changing Anatomies of Information Literacy at the Postgraduate Level: Refinements of Models and Shifts in Assessment,” Nordic Journal of Information Literacy in Higher Education 4, no. 1 (2012): 12, http://ezproxy.simmons.edu:2048/login?url=https://search-ebscohost-com.ezproxy.simmons.edu/login.aspx?direct=true&db=lls&AN=90596175&site=ehost-live&scope=site.
22 Megan Oakleaf, “Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Approaches,” portal: Libraries and the Academy 8, no. 3 (2008): 237, doi: 10.1353/pla.0.0011.
23 Grant Wiggins, Educative Assessment: Designing Assessments to Inform and Improve Student Performance (San Francisco, CA: Jossey-Bass, 1998).
24 Hodgens, Sendall, and Evans, “Post-Graduate Health Promotion,” 2012.

Hannah Rempel at Oregon State University
carried out longitudinal interviews with ten graduate students over the course of two years to learn how they were impacted by instruction sessions for developing literature reviews.25 Based on her findings, she makes recommendations about tailoring sessions, reaching out to beginning graduate students, and being aware of disciplinary differences. Michelle Dunaway and Michael Orblych used formative assessment, through a pre-instruction exercise followed by in-class questions during instruction sessions, to gauge student needs and adjust the session as necessary.26

CRITICAL INCIDENT QUESTIONNAIRES

The Critical Incident Questionnaire (CIQ), a formative classroom assessment tool introduced by Stephen Brookfield, has students reflect and report on their learning by responding to several open-ended questions, usually focusing on a single instruction session.27 Brookfield builds his framework on David Tripp's exploration of critical incidents. Tripp notes that critical incidents do not have to be extraordinary; rather, they are moments that take on significance upon reflection.28 Although the questions can be tweaked or changed, the original CIQ consists of five questions asking students when they felt most engaged and when they felt most distanced during an instruction session, and what the instructor could have done differently. The forms are filled out anonymously to ensure students feel safe in answering honestly. After the forms are completed, the instructor analyzes them to look for themes that can suggest what is working well or where there are problems with the learning.
Looking at the learning process from the students' perspective can “reveal hidden assumptions practitioners may hold in relation to their teaching and their students' learning.”29 Donald Gilstrap and Jason Dupree used CIQs to assess a series of library instruction sessions with undergraduates in an English Composition course.30 Analysis of the responses showed that students perceived gains in knowledge, especially in relation to searching and evaluating resources. Students also tended to appreciate hands-on portions of the sessions, where they were able to apply the strategies they learned. The authors concluded that the CIQ is an effective tool for promoting critical reflection and assessing learning. Nevertheless, there is little evidence of use of the CIQ in library instruction in general, and none at the graduate level.

PROCEDURES

The purpose of the current study was to investigate instructional effectiveness and gauge student learning and engagement in graduate-level library instruction sessions. Specifically, the study examined the following questions:

● What skills or competencies are graduate students learning in library instruction sessions, as evidenced by the areas in which students report feeling most engaged and affirmed?
  o What teaching methods, topics, or examples most resonate with them? What aspects of the instruction seem most connected to learning?
● What aspects of the library instruction sessions could be improved, as evidenced by areas in which students report feeling least engaged or most confused?
  o What teaching methods, topics, or examples resonate least? What aspects of the instruction seem least connected to learning?

25 Hannah Gascho Rempel, “A Longitudinal Assessment of Graduate Student Research Behavior and the Impact of Attending a Library Literature Review Workshop,” College & Research Libraries 71, no. 6 (2010): 532–547, doi: 10.5860/crl-79.
26 Michelle Kathleen Dunaway and Michael Teague Orblych, “Formative Assessment: Transforming Information Literacy Instruction,” Reference Services Review 39, no. 1 (2011): 24–41, http://dx.doi.org/10.1108/00907321111108097.
27 Stephen Brookfield, Becoming a Critically Reflective Teacher (San Francisco, CA: Jossey-Bass, 1995).
28 David Tripp, Critical Incidents in Teaching: Developing Professional Judgment (London, UK: Routledge, 1993).
29 Liam Phelan, “Integrating Students' Perceptions of their Online Learning Experiences with Brookfield's Critical Incident Questionnaire,” Distance Education 33, no. 1 (2012): 32, http://dx.doi.org/10.1080/01587919.2012.667958.
30 Donald L. Gilstrap and Jason Dupree, “Assessing Learning, Critical Reflection, and Quality Educational Outcomes: The Critical Incident Questionnaire,” College & Research Libraries 69, no. 5 (2008): 407–426, doi: 10.5860/0690407.
At the time of the study, all of the researchers were employed at the same institution, which is classified as a Master's L institution by the Carnegie Classification of Institutions of Higher Education. Three of the researchers are practitioners working in academic libraries, one is a faculty member in the School of Library and Information Science, and one is a library science student. One of the researchers is the library liaison to both the School of Nursing and Health Sciences (SNHS) and the School of Social Work (SSW), both of which offer graduate programs up to the doctoral level. Over time, this researcher has built strong relationships with faculty members in both schools and has been integrated into their curricula, offering library instruction sessions on a regular basis across their programs. In the summer of 2014, the Liaison Librarian reached out to her liaison departments and connected with faculty who were interested in collaborating on this project for the fall semester. Three faculty members agreed to participate, two from the School of Social Work (SSW) and one from the Doctor of Physical Therapy (DPT) program. The courses included in the study were three sections of SW 441 Social Work Research and three sections of PT 631 Frameworks for PT Practice: Cardiovascular & Pulmonary Diseases, with each section receiving a single one-shot library instruction session. While by necessity this study relied on a convenience sample of courses whose faculty members were open to participation, the courses ultimately included in the study were chosen for several reasons. To begin with, the librarian had a prior history with the courses and instructors, meaning she was familiar with the curriculum, assignments, and outcomes and could better tailor the session to meet course expectations.
Both courses are generally taken in the first semester of the students' program, thus offering a good opportunity to examine what information literacy skills and questions entering graduate students have. Both courses require students to find and use literature for assignments, and both programs prepare students for evidence-based, licensed practice. The emphasis in both cases on the need to find and use research literature, both as a student and later as a professional, lends itself well to library instruction. In addition, it is worth noting that both fields have language related to information literacy in their accreditation standards. Specifically, the Commission on Accreditation in Physical Therapy Education indicates that graduates should be able to access and critically analyze scientific literature, as well as identify, evaluate, and apply evidence in determining patient care.31 The Council on Social Work Education includes similar language around evidence-based practice, including that graduates analyze research methods and findings and that they use research evidence in practice, which necessarily involves being able to locate, access, and evaluate that information.32
31 “Standards and Required Elements for Accreditation of Physical Therapist Education Programs,” Commission on Accreditation in Physical Therapy Education (November 11, 2015), http://www.capteonline.org/uploadedFiles/CAPTEorg/About_CAPTE/Resources/Accreditation_Handbook/CAPTE_PTStandardsEvidence.pdf.
32 “2015 Educational Policy and Accreditation Standards,” Council on Social Work Education (2015), http://www.cswe.org/File.aspx?id=81660.
RESEARCH METHODS

The researchers were interested in moving away from the pre- and post-test and survey methods described in most of the literature, toward a more authentic and reflective assessment as described by Oakleaf,33 Wiggins,34 and others. Ultimately, the Critical Incident Questionnaire (CIQ) was chosen as a diagnostic tool that is relatively easy to implement, does not require too much time, and was designed to be anonymous. Further, because it requires students to reflect on their learning as they answer open-ended questions about the session, the CIQ integrates authentic measurement and self-reflective practice, which Brookfield and others note can both alert the instructor to what is going well and what can be improved, and deepen the students' learning by having them engage with and think about the session in a new way.35 Although the CIQ can be adapted and tweaked, the researchers chose to use the original version presented by Brookfield. Thus, the questions provided to students were:

1. At what moment in this class did you feel most engaged with what was happening?
2. At what moment in this class did you feel most distanced from what was happening?
3. What action that anyone (teacher or student) took did you find most affirming or helpful?
4. What action that anyone (teacher or student) took did you find most puzzling or confusing?
5. What about this class surprised you the most? (This could be about your own reactions to what went on, something that someone did, or anything else that occurs.)

Because the CIQ consists solely of open-ended questions, the data collected were all qualitative and subject to content analysis. As described by Krippendorff, content analysis is a systematic approach to analyzing and inferring from text documents.36 The analysis is iterative and involves multiple close readings of the text in order to identify themes and patterns.
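As a rough sketch of the mechanical core of this kind of coding, the matching of responses against codebook categories and the tallying of themes might look like the following. The codebook entries and theme names here are entirely hypothetical illustrations, not the codebook developed in this study.

```python
from collections import Counter

# Hypothetical codebook mapping themes to indicator phrases. The study's
# actual codebook emerged iteratively from close readings of the CIQ
# responses; these entries are illustrative only.
CODEBOOK = {
    "hands_on_practice": ["hands-on", "practice"],
    "demonstration": ["demonstration", "following along"],
    "citation_management": ["refworks", "citation"],
    "search_strategies": ["keyword", "boolean", "truncation"],
}

def code_response(text, codebook):
    """Return the set of themes whose indicator phrases appear in one response."""
    lowered = text.lower()
    return {theme for theme, phrases in codebook.items()
            if any(phrase in lowered for phrase in phrases)}

def tally_themes(responses, codebook):
    """Count how many responses show evidence of each theme.

    A single response may be coded with several themes, so theme totals
    can exceed the number of responses.
    """
    counts = Counter()
    for response in responses:
        counts.update(code_response(response, codebook))
    return counts

if __name__ == "__main__":
    sample = [
        "I felt most engaged during the hands-on practice with the databases.",
        "The RefWorks citation part was the most useful to me.",
        "Learning Boolean operators and keyword searching helped a lot.",
    ]
    print(tally_themes(sample, CODEBOOK))
```

Such keyword matching only approximates the bookkeeping step; the interpretive work of identifying, merging, and discarding themes described here remains a matter of human judgment across multiple readings.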
As themes are identified, they are noted down, and subsequent documents are compared against the identified themes. Through multiple readings, new themes are identified, and in some cases original themes might be discarded or separate themes merged into one category. These categories comprise a codebook against which all the documents are compared, and evidence in the text is coded. Analysis is complete when no new themes emerge.

In collaboration with the researchers, the library science student began analysis of the CIQs and developed an initial codebook. Throughout the process, she met regularly with team members to apprise them of progress and to get help resolving any questions or concerns. In order to increase validity, each member of the full team then read through and coded a sample set of documents using the same codebook. The researchers compared their codes, identified discrepancies, and worked together to reach consensus.

HUMAN SUBJECTS

Before the start of the fall semester, the researchers submitted a proposal and received approval for the study from the Institutional Review Board. The study participants were all students, a population that could be vulnerable to coercion if they are unaware that participation is their choice, or if they believe that their participation would impact their course grades. As such, it was important to ensure that students understood the voluntary nature of their participation and to obtain informed consent from those students who agreed to participate. It was made clear to the students that the faculty members had only agreed to open their courses to the project and did not speak for individual students' involvement in the study. The librarian providing the instruction explained the purpose of the study, emphasizing that participation was voluntary and that the choice to participate or not would have no effect on the students' grades. All students completed the CIQ, as the researchers could use them for insight and improvement of the session even if they were not included in the study. Those students who agreed to participate in the study also filled out an informed consent form. The responses of students who agreed to participate were separated, and only those responses are reflected in this study.

33 Oakleaf, “Dangers and Opportunities,” 2008.
34 Wiggins, Educative Assessment, 1998.
35 Brookfield, Becoming a Critically Reflective Teacher, 1995.
36 Klaus Krippendorff, Content Analysis: An Introduction to Its Methodology (Beverly Hills, CA: Sage, 1980).
FINDINGS

There were a total of 116 students in the six class sections involved in this study: 42 students across three sections of PT 631 Frameworks for PT Practice: Cardiovascular & Pulmonary Diseases, and 74 students in three sections of SW 441 Social Work Research. As noted above, participation in the study was voluntary, so not all students who attended the library instruction sessions were included in the study. In the end, a total of 80 students (69%) submitted signed informed consent forms along with their CIQs.

The first question of the CIQ asked students to identify the moment at which they felt most engaged with the instruction. Most students pointed to general aspects of the instruction, with 8 students (10%) reporting that the entire session was engaging. Hands-on practice was by far the most popular aspect of the session, with 40 students (50%) indicating that this was the point at which they were most engaged. Next, 23 students (29%) found the instructor's demonstration, with students following along, to be the most engaging moment. Several students appreciated that the instructor asked for their input on search topics. For instance, one student enjoyed “searching for something as a class and trying to figure out the best way to phrase it,” while another appreciated “When we were able to give examples of what we wanted to look up.” Four students also appreciated receiving personal feedback from the instructor during hands-on practice.

Some students also highlighted specific skill areas as being engaging. Thirteen students (16%) enjoyed learning searching tips, with 7 mentioning keyword and phrase searching, 4 pointing to Boolean operators, and 2 mentioning tips for evaluating resources. Learning about RefWorks and citation management was also popular, with 12 students (15%) indicating this was the moment they felt most engaged. Fig. 1 shows the number of students indicating engagement in each area.
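The percentages reported throughout these findings are simple proportions of the 80 consenting students, rounded to whole numbers. As a minimal sketch (the theme labels are shorthand for illustration, not the study's codebook categories), the question-one tallies convert as:

```python
def response_percentages(counts, n_participants):
    """Convert per-theme response counts to whole-number percentages."""
    return {theme: round(100 * count / n_participants)
            for theme, count in counts.items()}

# Counts reported for question one (n = 80 consenting students).
q1_counts = {
    "hands_on_practice": 40,
    "demonstration": 23,
    "search_tips": 13,
    "refworks_citation": 12,
    "entire_session": 8,
}
print(response_percentages(q1_counts, 80))
# hands_on_practice -> 50, demonstration -> 29, search_tips -> 16,
# refworks_citation -> 15, entire_session -> 10
```

Because a single CIQ response could be coded with more than one theme, these per-theme percentages need not sum to 100.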
Question two asked students when they felt most distanced from the instruction. The largest proportion (21 students, 26%) reported feeling distanced during the instructor's demonstration. One student elaborated by saying “at the beginning I was a little lost,” while another was confused by the “explanation of lit rev[iew] vs. systematic review, [and] what makes a source credible.” By contrast, the next largest group of 15 students (19%) indicated that they did not feel distanced at any point, or left that question blank suggesting that they did not feel distanced from the instruction. Eleven students (14%) reported feeling distanced when learning about RefWorks. Similarly, eleven students felt distanced during the hands-on activity. One of these students noted that her group was not “finding an article that was what we were looking for,” which apparently led to frustration and a feeling of disengagement. Nine students (11%) lacked engagement when going over topics or resources they were already familiar with, while 8 (10%) felt that the session moved too quickly. Four students (5%) felt distanced when trying to choose a research topic to search, and 3 students (4%) felt distanced when learning how to evaluate sources. Several comments did not fit into any category. For instance, one student reported feeling distanced “when other people started talking,” while another had technical issues with her computer. Fig. 2 shows the number of responses for question two.
Fig. 1. Moment at which students felt most engaged.
Students were also asked what action that anyone took during the session they found most affirming. The majority of affirming actions were taken by the instructor. The largest proportion of students by far (37 students, 46%) indicated that the instructor's teaching and/or demonstrations throughout the entire class were the most affirming action. Within this answer, students pointed to specific parts of the instruction, such as “how to identify peer-reviewed articles,” “where we could confirm peer-reviewed sources,” and “how to limit searches.” The next most affirming action, according to 16 students (20%), was the one-on-one or small group help offered by the instructor, followed by the instructor answering questions (12 students, 15%). Six students (8%) also noted the instructor's attitude or teaching style as being affirming. For instance, one student appreciated the instructor's “expressing/acknowledging that this is a complicated process.” Seven students (9%) indicated that they found the hands-on practice most affirming. Interestingly, three students (4%) enjoyed working in groups and
building camaraderie with their peers. One student noted that the students shared contact information so that they could continue to work together, while another was bolstered by "having students share frustrations to show I'm not alone." Finally, ten students (13%) did not answer this question. Fig. 3 illustrates the number of students per response for this question.

Fig. 2. Moment at which students felt most distanced.

Fig. 3. Most affirming action identified by students.

The fourth question asked students what they found most puzzling or confusing. The responses to this question varied more than those to any other, yet the largest group of students (38 students, 48%) reported that they did not find anything confusing. By contrast, 7 students (9%) indicated that the session moved too fast, making it hard to follow the demonstration. A number of students were confused by particular tools or resources. For instance, 13 students (16%) were confused by RefWorks, especially the difference between the functions of "cited by" and "citing." Four students (5%) each found Google Scholar and systematic reviews difficult, while 3 (4%) found it hard to navigate sources, with PubMed and Medline mentioned specifically. Other confusing areas included choosing a database, transferring documents, finding full-text, finding relevant articles, and translating specialized search terms into "plain language," with 2 students (3%) in each of these categories. Once again, a few comments did not seem to fit into any category: one student wrote "when people were shouting out topics to search," and another noted that she didn't realize she could email the reference desk for assistance. Fig. 4 shows the breakdown of responses to this question.

The final question asked students what surprised them most about the session. Fourteen students (18%) chose not to answer this question. A number of responses centered on the amount of information available. For instance, 14 students were surprised by the number of resources and amount of information in general, while 3 said they were surprised by the amount of information and resources specific to their field. One student noted, "I started getting a little overwhelmed with how many articles started to come back." Another commented that
she was surprised to learn that she is "not the only one that struggles with research." Students were also surprised by some of the specific tools and by searching tips. Indeed, the largest proportion of students (21, 26%) reported being surprised by RefWorks, with one respondent commenting that she was surprised by "how easy RefWorks is," while 2 (3%) said they were surprised when learning about Google Scholar. Ten students (12.5%) were impressed by the tips and tricks they learned for searching, and 4 specifically noted learning how much search terms can affect results. Finally, 9 students (11%) said they were surprised by how easy the process is, and 4 (5%) indicated feeling surprised by how much their skills improved. Fig. 5 shows the number of responses to this question.

Fig. 4. Most confusing or puzzling moment identified by students.

Fig. 5. Most surprising aspect identified by students.

One unanticipated result of this study was the extent to which student responses conflicted with one another. While in some cases responses overlapped and students seemed to agree about which aspects of the instruction sessions were engaging, affirming, or puzzling, in other cases the answers contradicted each other. In certain instances, similar proportions of students reported feeling most engaged by aspects of the session that other students indicated were least engaging or most puzzling. Fig. 6 shows the breakdown of overlapping and conflicting answers. As the figure illustrates, 23 students said that the demonstration and lecture were the most engaging part of the session, while 21 students found it the least engaging. Similarly, 12 students reported that learning about citation tools was
most engaging, while 11 found it to be the least engaging portion of the session.

Fig. 6. Overlapping and conflicting feedback.

DISCUSSION

The students' responses to the CIQs revealed some interesting and suggestive patterns. A number of students expressed engagement
with and appreciation of instruction in search tips, including keyword and phrase searching, Boolean searching, and database navigation, as well as citation software such as RefWorks. One could assume that if students already knew how to conduct such searches or use such tools, they would feel disengaged or even bored during that part of the instruction; their reported engagement therefore suggests the material was new to them. Taken together, these responses support earlier studies' findings that graduate students do not always arrive equipped with these information literacy competencies, and that they need explicit instruction in searching, citation, and specific issues such as identifying peer-reviewed articles, narrowing topics, and finding full-text documents.37 Further, some students indicated still feeling puzzled or confused by some of the resources at the end of the session, suggesting not only that they did not already have these skills, but also that a single one-shot session might not be enough to teach all the requisite skills to all students.

The importance of information literacy instruction for graduate students is further borne out by some of the student comments. In particular, some students found it affirming to learn that others were confused or challenged by the research process, and appreciated the instructor's acknowledgment that the process can be difficult and time-consuming. These students commented that they were happy to learn that they were not alone and that others were experiencing the same frustrations. These responses affirm that even graduate students experience difficulty conducting research and appreciate the support of instruction. The fact that some students seemed surprised that others experience difficulty further suggests that students might be reluctant to admit that they are confused or need help, believing that they are the only ones experiencing such trouble.

Some of the findings were less clear, or even somewhat contradictory.
Although not necessarily written this way purposely, questions 1 (when did you feel most engaged) and 3 (what action was most affirming), and questions 2 (when did you feel most distanced) and 4 (what action was most puzzling or confusing), could reinforce one another. For instance, one might expect that a student would feel distanced from the same topics that they found puzzling or confusing, or that they might feel affirmed by the parts of the session in which they were most engaged, but that did not seem to be the case for most students. Although some of the same themes emerged in questions 1 and 3, as well as in questions 2 and 4, these themes often applied to different individuals. In other words, the students who reported being engaged during the entire session were not the same students who found the whole session affirming. Students who indicated that they were distanced when learning about RefWorks, or when the session moved too quickly, did not necessarily report that they were also confused or puzzled by those areas. Hands-on experience is one area that did show some overlap; many of the same students who found that part of the session engaging also indicated that it was affirming, which coincides with Gilstrap and Dupree's findings.38

The contradictory answers about which portions of the instruction were engaging or distancing are also notable, and raise some interesting questions. As noted above, while half of the students found the hands-on practice engaging, 14% said they felt most distanced during that portion. The proportions of students reporting that they were engaged or distanced by the instructor's demonstration were nearly equal, with 29% feeling engaged and 26% feeling distanced. It is difficult to say why some students would feel distanced by the same part of the session that others found engaging. Indeed, a few students noted that the session moved too quickly.
If those students were having trouble following along, it might explain why they felt distanced during that portion. The number of students indicating that the session moved too quickly is small, however, and probably does not wholly explain the difference. Another possible explanation for the variance is differences in student learning styles. It may be that active or kinesthetic learners preferred the hands-on portion, where they were able to search and explore on their own, while visual and auditory learners preferred the instructor's demonstration, where they could see and hear explanations of how best to search. Whatever the reason for these contradictory responses, they are important because they demonstrate that there is no "one size fits all" approach to library instruction. Rather, librarians need to be aware that individual students will respond differently to different teaching methods and activities. To accommodate multiple learning styles and maximize student engagement, librarians can use a combination of instructor demonstration and hands-on practice. Participants noted that it is also helpful for the instructor to actively offer feedback, answer questions, and tailor the instruction session to fit students' needs. Indeed, Alfred Rovai notes the importance of both active, or constructivist, approaches to instruction and the instructor's communication style for effective learning.39 Similarly, he emphasizes the importance of community-building, which is reflected in some of the student comments about being reassured that others in the class were experiencing similar issues in the research process.

Disparities in students' previous experience are another area of wide individual variation. In discussing what moment in the session was most distancing, one student remarked, "I also know how to do most of what we went over from undergrad," while another student stated, "I don't think I've ever received a lesson on how to properly search for research articles on the Internet." A third student answered the question of what was most surprising about the session by stating "how few people had interacted with research before." Balancing this range of experience is an ongoing challenge.

Some students who have had library instruction in the past might appreciate reinforcing certain concepts, and indeed such repetition can be beneficial given Khosrowjerdi and Iranshahi's finding that past experience correlates with information literacy.40 Other students, however, will doubtless be bored if they perceive the session as redundant. Meanwhile, students who have not had prior library instruction might need more time or some repetition to absorb new information. It is possible that some students would benefit from multiple instruction sessions or from additional workshops tied to a specific research need. Participants who left the session still confused about some topics, such as how to use particular databases, might also find supplementary instruction useful. One-on-one research consultations are another opportunity for students who have questions after a library instruction session. Librarians can introduce concepts to the group in the session and then remind students that they are welcome to visit the reference desk or make an appointment with a librarian for further clarification and assistance. These alternatives mean that the class does not have to stop once most people have grasped a concept, while still allowing a student with outstanding questions to receive the librarian's undivided attention on any areas of confusion. Indeed, the librarians involved in this project report that requests for research consultations have increased markedly in the last couple of years.

37 See, e.g., Conway, "How Prepared are Students"; Catalano, "Patterns of Graduate Students'."
38 Gilstrap and Dupree, "Assessing Learning, Critical Reflection, and Quality Educational Outcomes: The Critical Incident Questionnaire."
39 Alfred P. Rovai, "The Relationships of Communicator Style, Personality-Based Learning Style, and Classroom Community among Online Graduate Students," The Internet and Higher Education 6, no. 4 (2003): 347–363, doi: 10.1016/j.iheduc.2003.07.004.
40 Khosrowjerdi and Iranshahi, "Prior Knowledge and Information-Seeking Behavior of PhD and MA Students."

CONCLUSION

Critical Incident Questionnaires proved to be a rich source of data for this assessment study. The open-ended questions required the students to reflect on their learning, bringing an element of meta-cognition to the session, and also provided the researchers with more depth and context than quantitative surveys or tests might have. That said, this study used the original version of the Critical Incident Questionnaire, which asks very general questions. A future study might revise the questions to address the session's specific content, or probe students' initial answers by asking why they felt engaged or distanced.

Participants in this study reported many separate parts of their instruction session to be engaging, with 10% indicating engagement throughout the entire session. If students already possessed the knowledge being taught, they likely would not have experienced such engagement. The findings support those of previous studies showing that graduate students do indeed benefit from, and often appreciate, information literacy instruction. Nevertheless, the study also suggests that no single approach to instruction will accommodate all students. Rather, librarians need to be aware of different learning styles and the range of students' previous experience when planning instruction, and should integrate various methods and activities to reach students where they are. While not always feasible, a brief pre-test might help the librarian gauge student knowledge and tailor the session appropriately. Librarians might also take advantage of the disparities in students' previous exposure to library instruction by engaging in peer tutoring, pairing more experienced students with students who have less exposure to
library instruction. Such an approach might feel less redundant for more experienced students, while still allowing inexperienced students a chance to practice skills and ask questions. Moreover, research suggests that teaching reinforces learning, meaning that experienced students would further benefit from taking on the role of teacher.41 Perhaps most importantly, the findings of this study indicate that students appreciated not only the knowledge they gained through the instruction but also the sense of community and confidence it engendered. Specifically, students were relieved to find that they were not alone in experiencing confusion or frustration with some aspects of their research, and one student even noted that all of the members of her group shared contact information with one another after the session. Further, students appreciated the support and affirmation of the library instructor, with one student characterizing the session as "empowering." The results of this study belie the assumption that graduate students have honed their information literacy skills through their prior education, and suggest that these students benefit from information literacy instruction.
41 Safiye Aslan, “Is Learning by Teaching Effective in Gaining 21st Century Skills? The View of Pre-Service Science Teachers,” Educational Sciences: Theory & Practice 15, no. 6 (2015): 1441–1457.