The Journal of Academic Librarianship 43 (2017) 178–183
Development and Validation of the Metacognitive Strategies for Library Research Skills Scale (MS-LRSS)


Amy Catalano Hofstra University, 026 Hagedorn Hall, Hempstead TPK, Hempstead, NY 11549, United States

Abstract

The possession of metacognitive strategies can contribute to successful library research experiences. Accordingly, these skills should be explicitly taught during library instruction. To facilitate further examination of student metacognitive strategies, the Metacognitive Strategies for Library Research Skills Scale (MS-LRSS) was developed and deployed to 224 graduate and undergraduate students at two post-secondary institutions. The development and validation of this instrument are described in this article. Overall, the instrument was shown to be a valid and reliable measure of metacognitive strategies in the context of information literacy and library research.

Introduction

A student's ability to think metacognitively, that is, to think about his or her own thinking and to develop strategies to plan and monitor his or her own learning, contributes to improved academic performance. To be metacognitively aware, students must have knowledge about the conditions and strategies that work best for them. Students fresh out of high school often do not possess metacognitive awareness, in that they cannot "realistically assess what they do and do not know" (Santamaría & Petrik, 2012, p. 265). Santamaría and Petrik clearly identify the need for metacognitive skills in information searching by noting that new students, when coming to the library for research help, do not know what questions to ask to get started. Even if they have had basic library instruction, they do not know enough about the potential sources that may be of use to them to ask for access to those sources. Evaluating and discerning quality in resources beyond what is accessed through the open internet is secondary to articulating one's needs. Because of this inability to define their information needs, these students are also unlikely to ask for help (Gross & Latham, 2007). Accordingly, students need to be explicitly taught metacognitive strategies and how to apply them to the research process.

Because of the importance of metacognitive strategies to successful information searches and information literacy skills, the author developed an instrument intended to assess these skills in the context of metacognitive awareness, as none presently exists. Although several reliable and valid instruments related to non-discipline-specific metacognitive skills have been used successfully over several decades, information literacy and library skills are not necessarily well-developed even in students who are metacognitively aware and can demonstrate good strategies for studying. An instrument that can directly assess metacognitive strategies as they relate to library skills can assist instructors in identifying gaps in these abilities and strategies.

Very little literature exists on metacognition and information literacy or library skills. The metacognition literature of most relevance to library instructors focuses on the processes engaged during searches. Therefore, the literature review begins with research on metacognition and searching, followed by how metacognitive awareness impacts student success, and ends with a review of the relevant measures upon which this study was based.

Metacognition and searching

Metacognitive knowledge can be delineated by three processes: declarative knowledge (knowing the skills one possesses; the factual knowledge a person needs prior to thinking about a topic critically), procedural knowledge (knowing how to use processes and strategies), and conditional knowledge ("knowledge about when and why to use strategies"; Schraw & Dennison, 1994, p. 460). Regulation, that is, how students plan their learning, employ strategies, and monitor, correct, and evaluate their learning, is another aspect of metacognitive awareness.

A body of research examines how users employ metacognitive strategies during information searches. For example, Tabatabai and Shore (2005) explored the differences between web searchers of varying experience levels, from novices (preservice undergraduate teachers) to intermediates (Master's students in library and information science) to experts (professional librarians). By coding the verbal protocols that resulted from search sessions in which participants were asked to find answers to questions, the authors found that the most significant differences between novices and experts involved metacognitive strategies, cognitive strategies, and prior knowledge. More specifically, reflecting on strategies, planning and monitoring (including adapting unsuccessful search strategies), and identifying criteria for evaluating information contributed to a successful search. Tsai and Tsai (2003) also confirm that metacognition is crucial to a successful outcome when searching online.

Information behavior research illustrates the cognitive demands on information searchers, particularly when searching databases (Ellis, 1989; Kuhlthau, 1991). Ahmed, McKnight, and Oppenheim (2009) posited that database designers could mitigate this load by integrating "human computer interface techniques" into database interfaces (Blummer & Kenton, 2014, p. 14). In the interim, teaching users metacognitive skills to employ while searching for information has the potential to improve users' experiences. Blummer and Kenton present a framework for instruction through a tutorial they created, called the Metacognitive Idea Tactics tutorial. The tutorial provides students with information on how to activate metacognitive skills, including planning, monitoring, and self-regulation, while searching. It also provides tips on improving database searching, such as using Boolean operators, using limiters, and understanding the difference between keyword and subject searching. In one example of how the tutorial works, users are invited to select a tactic based on their needs: Are they retrieving too many articles? Are they not retrieving relevant articles? How does one evaluate one's results? After selecting a tactic, a user is presented with strategies for addressing the search issue. The tutorial then demonstrates how an expert might employ a search strategy to address the issue by using screenshots of the expert's search in a "think aloud" format; that is, the screenshot of the search strategy includes text explaining the expert's decision making using metacognitive strategies.
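The tactic-selection step can be thought of as a mapping from a self-diagnosed search problem to candidate strategies. The Python sketch below is purely illustrative; the issue labels and tips are paraphrases of the kinds of problems described above, not the actual content of Blummer and Kenton's tutorial.

```python
# Illustrative sketch of a "select a tactic" lookup, in the spirit of the
# tutorial described above. Issue labels and tips are paraphrased examples,
# not the tutorial's actual content.
TACTICS = {
    "too many results": [
        "Add limiters (date range, peer reviewed, document type).",
        "Combine concepts with AND; switch from keyword to subject search.",
    ],
    "irrelevant results": [
        "Revise search terms; check subject headings on one relevant record.",
        "Use quotation marks for exact phrases.",
    ],
    "unsure how to evaluate results": [
        "Check author credentials and the publication venue.",
        "Corroborate the source against a second source.",
    ],
}

def suggest_strategies(issue: str) -> list[str]:
    """Return candidate strategies for a self-diagnosed search issue."""
    return TACTICS.get(issue, ["Ask a librarian for help."])

if __name__ == "__main__":
    for tip in suggest_strategies("too many results"):
        print("-", tip)
```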

Metacognition and student success

Some researchers have found that the ability to demonstrate metacognitive awareness and to employ metacognitive strategies contributes significantly to academic success (e.g., Dunning, Johnson, Ehrlinger, & Kruger, 2003). Metacognitively aware students are more likely to devise and follow a plan in their academic pursuits, allowing them to perform better. However, other researchers, having long studied the relationship between metacognition and academic achievement, have found mixed results. For example, some aspects of metacognition, such as regulatory skills, are not necessarily related to proficiency in content knowledge (Glenberg & Epstein, 1987). Coutinho (2007) examined the relationship between goals, metacognition, and achievement by surveying 179 undergraduate students of varying ages using a goal orientation scale, the Metacognitive Awareness Inventory (MAI; Schraw & Dennison, 1994), and a demographic questionnaire in which students were asked to report their GPA. Although self-reported GPA has limited reliability, the data were normally distributed. Strong positive correlations were found between metacognition and mastery goals, and mastery goals had a moderate correlation with GPA. Lastly, metacognition had a weak but positive correlation with GPA. In a larger study, King and McInerney (2016) examined the relationship between goals, metacognition, and academic achievement in 8773 students in Hong Kong. The researchers found a strong relationship between goals and metacognition, but neither had a direct impact on achievement. Students with high achievement, however, were more likely to employ metacognitive strategies, highlighting that these three variables, while not reciprocally causal, are related.

Students who are deficient in metacognition can be taught specific strategies to improve their learning. For example, teaching students how to plan their research projects will help them to identify their information needs ahead of time. Therefore, examining and identifying the metacognitive skills of students will allow professors to target instruction to student needs in this area.

A review of metacognition instruments

Many instruments exist that measure metacognitive awareness; however, the following measures were identified as the most widely used, most stable, and most reliable. The Metacognitive Awareness Inventory (MAI; Schraw & Dennison, 1994) is a 52-item inventory developed to measure the metacognitive awareness of adults. Items were organized into components under two factors: knowledge of cognition and regulation of cognition. The scale was deemed reliable, with a Cronbach's alpha (CA) of 0.90. The authors administered the inventory in two separate experiments, in order to establish validity, to a total of 307 undergraduate students. Data analysis revealed that knowledge of cognition was related to monitoring ability, although not to monitoring accuracy.

The State Metacognitive Inventory (SMI; O'Neil & Abedi, 1996) was developed to be domain independent; that is, the questions are not related to a specific discipline, although they are answered in the context of a specific learning task. The authors administered their scale to high school seniors to establish the reliability and validity of the measure. Each subscale, consisting of 5 items, loaded onto one factor. Reliability was above an alpha of 0.70, demonstrating reasonable reliability. To establish construct validity, the authors investigated the relationship between achievement, as measured by National Assessment of Educational Progress test scores, and scores on the SMI. They found low but significant correlations between the two constructs. Metacognition and achievement tended to demonstrate a stronger relationship in older students.

Other scales measuring metacognition but not used as a framework for the current study included the widely used Motivated Strategies for Learning Questionnaire (MSLQ), developed by Pintrich, Smith, Garcia, and McKeachie (1993), which assesses motivational and metacognitive behaviors; the Metacognitive Thinking Skills Scale, developed by Tuncer and Kaysi (2013); and the Metacognition Assessment Interview protocol (Semerari et al., 2012), which is an adaptation of the Metacognition Assessment Scale (MAS) and is used to assess metacognition in relation to a real-life episode experienced by the participant. Although these measures have predominantly demonstrated good validity and reliability, they were developed to measure students' metacognitive awareness as it relates to academic work in general. Because no measure presently exists that incorporates library research skills, the following section describes the development of an instrument created to assess metacognitive strategies with respect to these specific abilities.

Methods

Instrument development

A review of the literature allowed the researcher to identify several reliable and widely used metacognition scales, as noted above, that were used as a framework for the development of the Metacognitive Strategies for Library Research Skills Scale (MS-LRSS). Items for the MS-LRSS were written using the SMI (O'Neil & Abedi, 1996) and the MAI (Schraw & Dennison, 1994) as a framework. The SMI includes the subscales awareness, cognitive strategy, planning, and self-checking. The MAI includes the subscales procedural knowledge, information management strategies, debugging strategies, and planning. These instruments were used as a template for organizing the subscales and structuring the wording of the items for the present study because their structure most closely reflects the procedures involved in the library research process. The concepts covered by the MS-LRSS were based on Marcia Bates' (1979) Idea Tactics and Catts's (2005, 2010) Information Skills Scale. Idea Tactics includes nine tactics and three strategies intended to facilitate the development of a student's metacognitive skills. These tactics and strategies include Think, Catch, Notice, Meditate, Change, Create, Wander, Jolt, Identify, Break, Regulate, and Skip. The Information Skills Scale is a self-assessment of one's information literacy.

The initial version of the scale included 30 items on five subscales: awareness, planning, cognitive strategies, self-checking/monitoring, and debugging. After three experts reviewed the scale (see below), additional items were added and some item wording was changed for clarity and consistency. Items written for each subscale represented major information literacy skills as defined by the ACRL standards, such as the ability to plan a search, the ability to evaluate a source, the ability to recognize when a search strategy should be revised, knowledge of plagiarism policies, and knowledge of authoritative sources and people. See Table 1 for examples of items written for each subscale. Students indicate their agreement with statements on a 5-point Likert scale from Not at all (1) to Extremely (5).

After the initial scale was developed, the author asked three experts (two librarians with over ten years of information literacy instruction experience, and one faculty member who is an expert on metacognition) to review and provide feedback on the instrument. After this review, revisions were made to the scale, resulting in some modified wording and a restructuring of the Likert scale. This review also served to establish face validity. Further, aligning items of the MS-LRSS with items on already established and valid instruments from the literature supported content validity.

Unlike the State Metacognitive Inventory, the MS-LRSS asks the respondent to rate their agreement with statements based on a domain-dependent task, that is, to consider their responses in the context of a specific assignment. In this case, skills related to information searching and information literacy were assessed in the context of conducting a research project requiring library resources. In order to authentically evaluate student metacognitive strategies in the context of library skills, only students who were in a credit-bearing library course (Introduction to Information Literacy) or who were in a one-shot library instruction class preparing to write a research paper using library sources were asked to fill out the questionnaire. Prior to filling out the scale, students were instructed to read a statement noting that they should base their responses on a research project they had been assigned.

Table 1
Subscale names, items, and factor loading coefficients. Question numbers refer to the initial scale.

Awareness
A2   I am aware of the steps needed to find sources for my project   0.814
A1   I am aware of how to create an effective search strategy   0.801
A3   I know how to determine whether a source is reliable   0.784
A7   I am aware of when my searches are unproductive   0.744
A5   I am aware of the need to evaluate each source before using it   0.743
A6   I know how to present my research in a medium that is appropriate to the audience   0.718
A4   I am aware of the need to understand the assignment before beginning my research   0.676

Self-checking and debugging
C4   Section 4: I keep track of my search strategies   0.791
C2   Section 4: I ask myself if I have consulted all possible resources   0.771
C3   Section 4: I analyze the usefulness of my strategies   0.745
D1   Section 5: I ask for help when I can't find a source that I need   0.644
D3   Section 5: When I find a source and am unsure of its quality, I look for another source to corroborate the first   0.622
D6   Section 5: If some aspect of my research isn't working out, I look at it from another perspective   0.579

Planning
P2   Section 3: I try to determine what my professor wants before beginning my research   0.814
P5   Section 3: I try to understand the assignment before I start my research   0.809
P3   Section 3: I think about what I need to accomplish before beginning my search for sources   0.789
P4   Section 3: I make sure I understand what has to be done and how to do it   0.751

Cognitive strategy
C9   Section 2: I evaluate the materials I retrieve   0.765
C8   Section 2: I scan information in a source after I retrieve it   0.738
C7   Section 2: If I retrieve too many irrelevant results from a search, I revise my strategy   0.657
C11  Section 2: I examine sources for clues to point me toward other sources   0.633
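Since each item is a 1 to 5 rating and subscale scores are item sums, the scoring scheme for the final 21-item version can be sketched in a few lines of Python. This is my own illustration using the item codes from Table 1; the function and variable names are not part of the published instrument.

```python
# Minimal scoring sketch for the final 21-item MS-LRSS, assuming each
# response is an integer from 1 ("Not at all") to 5 ("Extremely").
# Item codes follow Table 1; all names here are illustrative only.
SUBSCALES = {
    "awareness":               ["A1", "A2", "A3", "A4", "A5", "A6", "A7"],
    "self_checking_debugging": ["C2", "C3", "C4", "D1", "D3", "D6"],
    "planning":                ["P2", "P3", "P4", "P5"],
    "cognitive_strategy":      ["C7", "C8", "C9", "C11"],
}

def score(responses: dict[str, int]) -> dict[str, int]:
    """Sum item ratings into subscale scores plus a total (range 21-105)."""
    scores = {name: sum(responses[item] for item in items)
              for name, items in SUBSCALES.items()}
    scores["total"] = sum(scores.values())
    return scores
```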

Construct validation procedures

The MS-LRSS was created using Qualtrics survey software and was deployed at the end of each instruction section via a link posted on a LibGuide created for the class. Students in the credit-bearing class, which was a distance learning course, were instructed to take the survey during the second-to-last week of class, when all learning modules were completed. The scale was deployed to a total of 224 students at two private post-secondary institutions. For the initial version of the scale, only 197 instruments were completed fully and were therefore valid for the initial analysis.

Prior to conducting the factor analysis, two tests were employed to determine whether factor analysis was appropriate for this sample of data. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy demonstrated a value of 0.91, and Bartlett's test of sphericity was significant (chi-square = 2925.157, df = 210). Both of these values were at appropriate thresholds and therefore indicated that the analyses could proceed. After all completed instruments were collected, an exploratory factor analysis (EFA) using principal components analysis (PCA) with varimax rotation was performed on the data. Eigenvalues were set at greater than 1.0, so every retained factor had an eigenvalue above this threshold. Items with loading coefficients below 0.50 were removed from the scale; Hair et al. (1998) note that for a sample of 200 participants the loading cutoff should be above 0.40.

The first analysis resulted in 27 items loading on 6 factors accounting for 65.17% of the variance. Reliability analysis revealed a Cronbach's alpha (CA) of 0.93. Any CA above 0.70 is considered reasonable; a CA above 0.90 indicates excellent reliability. Because the scale was developed on five theoretical components, an additional analysis was run using PCA with varimax rotation, forcing the results into five factors. This analysis revealed 23 items on a five-factor model explaining 63.5% of the variance, with a CA of 0.96.
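For readers who want to run the same adequacy checks and rotated factor extraction on comparable data, the sketch below uses the Python `factor_analyzer` package. This is an assumed, equivalent workflow, not the study's actual analysis code; the file name and DataFrame layout are hypothetical.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

# Assumed layout: one row per respondent, one column per item (1-5 ratings).
items = pd.read_csv("ms_lrss_responses.csv")  # hypothetical file name

# Sampling adequacy: the study reports KMO = 0.91 and a significant
# Bartlett's test (chi-square = 2925.157, df = 210).
chi_square, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_overall = calculate_kmo(items)
print(f"Bartlett chi-square = {chi_square:.3f}, p = {p_value:.4f}")
print(f"KMO overall = {kmo_overall:.2f}")

# PCA-style extraction with varimax rotation, forced to five factors as in
# the study's second analysis.
fa = FactorAnalyzer(n_factors=5, rotation="varimax", method="principal")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns)
retained = loadings[loadings.abs().max(axis=1) >= 0.50]  # drop low loaders
print(retained.round(3))
print("Cumulative variance explained:", fa.get_factor_variance()[2][-1])
```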

Because the scale demonstrated excellent reliability but included items that did not theoretically fit the subscale on which they loaded, two of those items were removed. At this point, factors loaded at or above 0.70. A final analysis was conducted to verify the structure after the two items were removed, which resulted in a four-factor model that collapsed self-monitoring and debugging into one subscale. Self-monitoring and debugging are similar constructs in which the strategies are aimed at adjusting approaches when a research plan is not effective; putting these two factors together therefore makes theoretical sense. This last analysis revealed a model that explained 68.58% of the variance (for scales, the higher the explained variance, the better the items on the scale represent the construct being measured). The remaining 21 items collectively demonstrated a CA of 0.94. This final analysis included 209 cases; on some surveys certain questions were answered while others were left blank, which explains why there are different numbers of fully completed surveys for the different versions of the scale. The first subscale, awareness (7 items), explained 44.77% of the variance. The remaining factors were self-checking/monitoring and debugging (6 items), planning (4 items), and cognitive strategy (4 items). See Table 1 for the items and loadings on each subscale. In conclusion, the MS-LRSS was shown to be a reasonably valid and reliable instrument that may be used to assess the metacognitive skills of students actively engaged in library research projects.
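Cronbach's alpha, the reliability statistic reported throughout, compares the sum of the item variances to the variance of the total score. A self-contained sketch of the computation follows (standard formula; the demonstration data are random, not the study's).

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Example with random 1-5 ratings shaped like the final dataset
# (209 respondents x 21 items); the study reports alpha = 0.94.
rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(209, 21)).astype(float)
print(round(cronbach_alpha(demo), 3))
```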

Analysis of responses

On the basis of the responses to the final model of the scale, further analyses were conducted to determine whether there were significant differences in scores between students who had participated in a one-credit library research course, those who had a one-shot instruction session, and those who had a reference interview only. The researcher also investigated whether there were significant differences between class standings since, in theory, students of higher class standing would likely have more research experience. As noted in the literature review, some researchers have indicated that metacognitive awareness and academic achievement tend to be related in older students.

A total of 209 completed instruments from the final version were analyzed. The 21 items on a five-point scale resulted in a possible range of scores from 21 to 105. From the 209 responses, the actual range of scores was 42–105. The mean score was 84.06, the median score was 85, and the standard deviation was 12.476. A histogram constructed from the total scores demonstrated that the data were normally distributed. Descriptive statistics for each subscale are reported in Table 3. Most of the participants were freshmen (n = 74). Table 2 reports the mean total scores by class standing. Sophomores scored the highest on the MS-LRSS with a mean score of 91.3, although this sample was very small (n = 15). Juniors (n = 19) and doctoral students (n = 30) scored the lowest, at 80.58 and 80.73 respectively. An ANOVA did not reveal the differences between class standings on total scores to be statistically significant. Additionally, the subgroups were too unevenly distributed and too small to support generalizable conclusions.

With respect to instruction, a total of 207 students answered the question about whether, and what type of, interactions they had had with librarians in an instruction setting. Eighty-four reported that the one-shot library session they presently were in (the session in which they were responding to the survey) was their first library instruction experience. Seventy-three students indicated that they had participated in other library instruction previously, and 26 took the credit-bearing information literacy course. Twenty-four students reported that they had had a reference interaction of at least 15 min even if they had not had formal instruction previously. An ANOVA did not reveal any significant differences in total scores between students in these different library instruction settings.

Because each subscale had a different number of items, the potential range of scores varied by subscale. The greatest difference in scores was on the planning subscale, on which doctoral students scored lower than sophomores. Sophomores scored the highest on all subscales. Table 3 reports the breakdown of scores on all subscales by class standing. An ANOVA was run on each subscale total score in order to compare differences between class standings. Post hoc tests, which allow the researcher to analyze the sources of significant differences in more detail, were run as well. The post hoc tests revealed small differences between class standings on the subscale scores. For example, there were differences between all class standings and doctoral students on the planning subscale. These results can be explained by the fact that doctoral students demonstrated a lower mean score than all other class standings except juniors. As mentioned previously, the group sizes were too uneven to draw any meaningful conclusions from these comparisons.

With respect to instruction (participation in a credit-bearing class, a one-shot library session, or a reference consultation), the scores on the cognitive strategy and awareness subscales revealed no significant differences between instructional settings. For self-checking, there were significant differences between groups who had no prior instruction and those who had had one-shot library instruction previously (likely groups that had one session versus groups that had two or more; p < 0.05). These results indicate that self-checking might be strengthened by explicitly taught strategies, such as evaluating and re-evaluating sources, looking at sources from another perspective, and visiting alternate sources. Asking for help is a component of self-checking and, as noted in the literature review, many students do not know what they don't know or when to ask for help.

Table 2
Total score by class standing.

What is your status?    N     Mean      Std. deviation   Maximum
Freshman                74    84.2432   13.61386         105.00
Sophomore               15    91.3333   13.37731         105.00
Junior                  19    80.5789   13.36794         105.00
Senior                  37    84.5135    9.95440         105.00
Graduate/masters        34    84.8235   12.96258         105.00
Graduate/doctoral       30    80.7333    9.51563          96.00
Total                  209    84.0574   12.47600         105.00

Table 3
Subscale means by class standing.

Subscale / status       Mean      Std. deviation   N
Planning
  Freshman              15.7838   3.04429           74
  Sophomore             17.0667   3.36933           15
  Junior                15.8947   3.10724           19
  Senior                16.7568   2.65000           37
  Graduate/masters      17.0000   3.07482           34
  Graduate/doctoral     16.1000   2.39756           30
  Total                 16.3014   2.94012          209
Self-checking
  Freshman              23.1351   4.22428           74
  Sophomore             24.7333   4.71270           15
  Junior                21.0526   5.04946           19
  Senior                21.1892   4.81224           37
  Graduate/masters      23.3235   5.09141           34
  Graduate/doctoral     19.6667   4.36549           30
  Total                 22.2488   4.79035          209
Awareness
  Freshman              29.3784   5.68994           74
  Sophomore             32.2000   4.45934           15
  Junior                28.3158   4.54670           19
  Senior                30.1081   3.08926           37
  Graduate/masters      28.5000   4.54106           34
  Graduate/doctoral     28.2000   3.93394           30
  Total                 29.3014   4.76062          209
Cognitive strategy
  Freshman              15.9459   3.11379           74
  Sophomore             17.3333   2.69037           15
  Junior                15.3158   2.86846           19
  Senior                16.4595   2.29243           37
  Graduate/masters      16.0000   2.61696           34
  Graduate/doctoral     16.7667   2.60878           30
  Total                 16.2057   2.79264          209
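The group comparisons reported above are one-way ANOVAs followed by post hoc tests. The article does not name its statistics software or the specific post hoc procedure, so the sketch below is an assumed equivalent using SciPy and statsmodels, with Tukey's HSD as the post hoc test and hypothetical file and column names.

```python
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Assumed layout: one row per respondent, with the MS-LRSS total score and
# class standing. The file name and column names are illustrative.
df = pd.read_csv("ms_lrss_scored.csv")

# One-way ANOVA of total score across class standings.
groups = [g["total"].values for _, g in df.groupby("standing")]
f_stat, p_value = f_oneway(*groups)
print(f"One-way ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

# Post hoc pairwise comparisons to see which class standings differ,
# mirroring the study's follow-up tests on the subscale scores.
print(pairwise_tukeyhsd(df["total"], df["standing"]))
```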


Discussion

This article reports the development and validation of a scale intended to measure the metacognitive strategies of post-secondary students at all levels while conducting library research. The results of factor analyses and reliability tests indicate that the scale demonstrates excellent reliability and reasonable construct and content validity.

Analysis of the student scores on the MS-LRSS revealed small but significant differences between sophomores and other class standings on the total scores and on the awareness and self-checking subscales. It is possible that the sophomores were from one cohort of students in a particular program who had more experience with library searches than others. To some degree, instruction was a contributing factor to student scores. Although there were no significant differences in total scores between students who had different types of library instruction, students scored higher on the planning subscale if they had had a reference interaction than if they had had one instruction session. This may be explained by the tailored instruction that comes with a one-on-one interaction. When librarians demonstrate metacognitive strategies by thinking aloud while they search, that is, employing expert modeling of how to solve research problems, students tend to transfer that information more effectively.

In order to improve students' success in completing a research project that requires library resources, information literacy instruction should include teaching metacognitive strategies. Expert think-alouds while searching are one method of imparting these strategies to students. Students should also be taught the questions to ask themselves while searching. Search success is context dependent; no one method is right for every inquiry. However, giving users a toolkit of strategies, along with an indication of where each is most beneficial, is one way of facilitating successful searches. Additionally, failure to ask for help may be related to a student's class standing. Freshmen may be less likely to ask for help or even to know whom to ask; asking for help, however, is a "debugging" strategy. Instruction should focus, in part, on teaching students how to ask the questions that will facilitate a successful search for information and, more importantly, where to go for help.

Use of metacognitive skills can make up for a lack of problem-solving ability (Blummer & Kenton, 2014). Additionally, some aspects of the MS-LRSS assess student knowledge of information literacy (IL) skills. These skills are often explicitly taught and therefore tend to accumulate as a student progresses through his or her education by way of writing research papers. The results of this study do not confirm this expectation, however, since younger students (sophomores) scored higher on the scale than all other groups, including doctoral students. It is important to note that the sophomores were a small group and could have represented a group within a group, in that they may have been from one course that had more training than others on information searching. Lastly, there is a conflicting body of research suggesting that some aspects of metacognitive awareness develop over time. For example, Justice and Dornan (2001) found that while traditional and non-traditional students were similarly motivated to do well in school, metacognitive strategies did not differ significantly between groups, although older students did tend to adopt high-level strategies more frequently. Those results, as well, were not borne out in the present study.

Limitations

The results of these analyses are limited in that the sample included only 209 completed surveys. Given that there were six class-standing groups, the sample sizes of these groups were very small and unevenly distributed. Although sophomores scored the highest on the MS-LRSS, there were only 15 students in this group. It is possible that these 15 students were in the same course and had had library instruction previously more often than those in the other groups.

Implications for future research

In order to confirm the validity of the MS-LRSS, an additional administration of the scale should be conducted on a larger sample. Further, predictive and concurrent validity should be established by comparing scores on the MS-LRSS to scores on other instruments measuring similar constructs. For example, students might take a valid IL test such as the Information Literacy Test developed at James Madison University (Cameron, Wise, & Lottridge, 2007) and Pintrich et al.'s (1993) MSLQ, and the scores achieved on those measures could be compared with scores attained on the MS-LRSS. Additional research should also include using the MS-LRSS to assess interventions aimed at improving students' metacognitive strategies. Tools such as the Idea Tactics tutorial (Blummer & Kenton, 2014) are one such intervention. The MS-LRSS may be used as a pre-/post-test to evaluate differences in scores before and after the intervention.

Most research on metacognitive strategies and information problem solving focuses primarily on elementary students. Further, Blummer and Kenton (2014) note that while there is an abundance of literature on the information-seeking behavior (ISB) of various groups, the research on graduate students is very thin. ISB research on graduate students tends to focus on source selection and competency with IL standards, and less on the actual research process, with a few notable exceptions (e.g., Bruce, 1999). By understanding the research processes of different classes of students and users, instructional activities, user interfaces, and reference interactions can be tailored to each of these groups' needs.
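For the pre-/post-test design suggested above, a paired comparison of MS-LRSS total scores would be the natural analysis. A minimal sketch follows; the score values are invented for illustration and are not data from the article.

```python
import numpy as np
from scipy.stats import ttest_rel

# Assumed arrays of MS-LRSS total scores for the same students before and
# after an intervention such as a metacognitive tutorial (invented values).
pre = np.array([78, 84, 69, 90, 72, 88, 81])
post = np.array([85, 86, 75, 92, 80, 90, 84])

# Paired t-test: did scores change significantly after the intervention?
t_stat, p_value = ttest_rel(pre, post)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```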
References

Ahmed, Z., McKnight, C., & Oppenheim, C. (2009). A review of research on human-computer interfaces for online information retrieval systems. The Electronic Library, 27(1), 96–116. http://doi.org/10.1108/02640470910934623
Bates, M. (1979). Idea tactics. Journal of the American Society for Information Science, 30(5).
Blummer, B., & Kenton, J. M. (2014). Improving student information search: A metacognitive approach. NY: Chandos.
Bruce, C. S. (1999). Workplace experiences of information literacy. International Journal of Information Management, 19(1), 33–47. http://doi.org/10.1016/S0268-4012(98)00045-0
Cameron, L., Wise, S. L., & Lottridge, S. M. (2007). The development and validation of the Information Literacy Test. College & Research Libraries, 68(3), 229–236.
Catts, R. (2005). Information skills survey for assessment of information literacy in higher education. Canberra: Council of Australian University Librarians.
Catts, R. (2010). UNESCO information literacy indicators: Validation report. Retrieved from http://www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/CI/CI/pdf/information_literacy_indicators_validation_report_ralph_catts_en.pdf
Coutinho, S. A. (2007). The relationship between goals, metacognition, and academic success. Educate~, 7(1), 39–47.
Dunning, D., Johnson, K., Ehrlinger, J., & Kruger, J. (2003). Why people fail to recognize their own incompetence. Current Directions in Psychological Science, 12(3), 83–87. http://doi.org/10.1111/1467-8721.01235
Ellis, D. (1989). A behavioural approach to information retrieval system design. Journal of Documentation, 45(3), 171–212.
Glenberg, A. M., & Epstein, W. (1987). Inexpert calibration of comprehension. Memory & Cognition, 15(1), 84–93.
Gross, M., & Latham, D. (2007). Attaining information literacy: An investigation of the relationship between skill level, self-estimates of skill, and library anxiety. Library and Information Science Research, 29(3), 332–353. http://doi.org/10.1016/j.lisr.2007.04.012
Hair, J. F., Tatham, R. L., Anderson, R. E., & Black, W. (1998). Multivariate data analysis. London: Prentice-Hall.
Justice, E. M., & Dornan, T. M. (2001). Metacognitive differences between traditional-age and nontraditional-age college students. Adult Education Quarterly, 51(3), 236–249. http://doi.org/10.1177/074171360105100305
King, R. B., & McInerney, D. M. (2016). Do goals lead to outcomes or can it be the other way around? Causal ordering of mastery goals, metacognitive strategies, and achievement. British Journal of Educational Psychology, 86(2), 296–312. http://doi.org/10.1111/bjep.12107
Kuhlthau, C. C. (1991). Inside the search process: Information seeking from the user's perspective. Journal of the American Society for Information Science, 42(5), 361.
O'Neil, H. F., & Abedi, J. (1996). Reliability and validity of a state metacognitive inventory: Potential for alternative assessment. The Journal of Educational Research, 89(4), 234–245.
Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53(3), 801–813. http://doi.org/10.1177/0013164493053003024
Santamaría, M., & Petrik, D. (2012). Cornering the information market: Metacognition and the library. College & Research Libraries News, 73(5), 265–272.
Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19(4), 460–475. http://doi.org/10.1006/ceps.1994.1033
Semerari, A., Cucchi, M., Dimaggio, G., Cavadini, D., Carcione, A., Battelli, V., ... Smeraldi, E. (2012). The development of the Metacognition Assessment Interview: Instrument description, factor structure and reliability in a non-clinical sample. Psychiatry Research, 200(2–3), 890–895. http://doi.org/10.1016/j.psychres.2012.07.015
Tabatabai, D., & Shore, B. M. (2005). How experts and novices search the Web. Library and Information Science Research, 27(2), 222–248. http://doi.org/10.1016/j.lisr.2005.01.005
Tsai, M., & Tsai, C. (2003). Information searching strategies in web-based science learning: The role of internet self-efficacy. Innovations in Education & Teaching International, 40(1), 43.
Tuncer, M., & Kaysi, F. (2013). The development of the Metacognitive Thinking Skills Scale. International Journal of Learning and Development, 3(2), 70. http://doi.org/10.5296/ijld.v3i2.3449
