ORIGINAL REPORTS
Assessing Competency in Practice-Based Learning: A Foundation for Milestones in Learning Portfolio Entries

Travis P. Webb, MD, MHPE,* Taylor R. Merkley, BS,* Thomas J. Wade, MD,* Deborah Simpson, PhD,* Rachel Yudkowsky, MD, MHPE,† and Ilene Harris, PhD†

*Medical College of Wisconsin, Milwaukee, Wisconsin; and †University of Illinois-Chicago, Chicago, Illinois
BACKGROUND: Graduate medical education is undergoing a dramatic shift toward competency-based assessment of learners. Competency assessment requires clear definitions of competency and validated assessment methods. The purpose of this study is to identify criteria used by surgical educators to judge competence in Practice-Based Learning and Improvement (PBL&I) as demonstrated in learning portfolios.

METHODS: A total of 6 surgical learning and instructional portfolio entries served as documents to be assessed by 3 senior surgical educators. These faculty members were asked to rate and then identify criteria used to assess PBL&I competency. Individual interviews and group discussions were conducted, recorded, and transcribed to serve as the study dataset. Analysis was performed using qualitative methodology to identify themes for the purpose of defining competence in PBL&I. The assessment themes derived are presented with narrative examples to describe the progression of competency.

RESULTS: The collaborative coding process resulted in identification of 7 themes associated with competency in PBL&I related to surgical learning and instructional portfolio entries: (1) self-awareness regarding effect of actions; (2) identification and thorough description of learning goals; (3) cases used as catalyst for reflection; (4) reconceptualization with appropriate use and critique of cited literature; (5) communication skills/completeness of entry template; (6) description of future behavioral change; and (7) engagement in process—identifies as personally relevant.

CONCLUSIONS: The identified themes are consistent with and complement other criteria emerging from reflective practice literature and experiential learning theory. This study provides a foundation for further development of a tool for assessing learner portfolios consistent with the Accreditation Council for Graduate Medical Education's Next Accreditation System requirements. (J Surg ]:]]]-]]]. © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

KEY WORDS: practice-based learning, residency, milestones, competency, learning portfolios

COMPETENCIES: Practice-Based Learning and Improvement, Medical Knowledge

Correspondence: Inquiries to Travis P. Webb, MD, MHPE, Medical College of Wisconsin, 9200 W Wisconsin Ave, Milwaukee, WI 53226; fax: (414) 805-8641; e-mail: [email protected]
BACKGROUND

The culture of medical training has changed dramatically over the past several years as the focus on education outcomes has intensified. The Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Medical Specialties formulated guidelines for instruction and assessment of 6 domains of resident competencies, outlined in the ACGME Outcomes Project1 and reinforced in the Next Accreditation System requirements.2 One of the more difficult competencies to define, develop, and assess is practice-based learning and improvement (PBL&I). PBL&I has been described in the ACGME Common Program Requirements3 as the ability of the resident to carry out the following: "identify strengths, deficiencies, and limits in one's knowledge and expertise; set learning and improvement goals; identify and perform appropriate learning activities; systematically analyze practice using quality improvement methods; implement changes with the goal of practice improvement; incorporate formative evaluation feedback into daily practice;
locate, appraise, and assimilate evidence from scientific studies related to their patients’ health problems; use information technology to optimize learning; and participate in the education of patients, families, students, residents and other health professionals.”
Although the ACGME has formulated some recommendations for tools to assess PBL&I, residency programs have been charged with the ultimate responsibility of developing assessment plans. Moreover, these plans must use dependable measures to assess residents' competence. No single tool is adequate to evaluate all aspects of trainee achievement of competency, but the learning portfolio has been advocated as a potential means to assess residents' competencies in several areas, and especially PBL&I.5 Several authors have described using learning portfolios to provide evidence of ongoing professional growth and development through self-directed learning.6-10 Learning portfolios provide a vehicle for reflective practice, which has been identified as an essential component of professional development.11,12 However, a recent Best Evidence in Medical Education review of the use of portfolios for learning and assessment concludes that more research is necessary to demonstrate the value of portfolios as a learning and assessment tool in graduate medical education.13 Since the introduction of the defined competencies, the ACGME has recognized the slow implementation progress of the Outcomes Project. In an effort to encourage national development of outcomes assessment tools, the ACGME has shifted its attention from competency-based instruction to competency-based assessment through its milestones project and Next Accreditation System.2 The purpose of the milestones project is to identify specific achievement markers (i.e., milestones) for each of the 6 competencies. Several national specialty groups have been actively exploring the specific meaning of the competencies within their respective specialties and creating foundational work to better define and delineate achievement of the competencies.14,15 These groups sought to identify appropriate assessments to benchmark progress in the competencies, with the expectation that milestone reporting would begin in July 2013.2 In line with the underlying concept of the milestones project and its corresponding focus on assessment tools, an
area in need of further study is whether competence in PBL&I can be demonstrated in learning portfolios. Learning portfolios likely contain evidence of achievement of multiple competencies, but PBL&I is the most consistently addressed competency within the portfolios.13 However, there are few published studies describing criteria that could be used to frame an evidence-based scoring rubric or tool for PBL&I within resident portfolios. Further work is necessary to define and delineate the competency-based expectations for portfolio-based assessment of resident trainees. For portfolios to achieve their fullest potential as a learning and assessment tool, objectives linked with clearly defined outcomes/milestones must be developed to provide the learner with direction for achieving those objectives and thus being deemed competent.16 Several studies targeting PBL&I competency assessment, as demonstrated by portfolios using rating instruments, have been reported.17-20 However, PBL&I is a competency that is difficult to assess using traditional assessment instruments, and the results of these portfolio-rating instruments indicate validity and reliability difficulties. For example, Pitts et al. attempted to improve the reliability of their assessment through rater training, using a Likert-type scale scoring system that did not include behavioral anchors. Despite extensive training, results revealed only modest interrater reliability improvement (κ increased from 0.26 to 0.5).18 O'Sullivan et al.19 were unable to demonstrate validity evidence for their rating instrument; there was a poor correlation between portfolio scores and clinical rotation performance evaluations. The greatest limitation to the widespread dissemination of these rating instruments is the lack of a strong evidence-based foundation from which to set milestones marking the development of PBL&I competency through portfolio entries.
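For readers who want a concrete sense of the chance-corrected agreement statistic cited above, a minimal sketch of Cohen's κ for two raters follows. The ratings are invented placeholders on the kind of 5-point global scale used later in this study; they are not data from this study or the cited ones.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same items."""
    assert len(rater_a) == len(rater_b), "raters must score the same items"
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match exactly.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement: product of each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Hypothetical 5-point global ratings of six portfolio entries by two raters.
a = [3, 4, 2, 5, 3, 4]
b = [3, 3, 2, 5, 4, 4]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

Values near 0 indicate agreement no better than chance, which is why the modest gains reported by Pitts et al. were considered insufficient for formal assessment.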
In 2001, we implemented a learning portfolio program within our general surgery residency program. The surgical learning and instructional portfolio (SLIP) was designed to provide a structure for each resident to develop their own individual portfolio of cases, demonstrating evidence of self-directed learning. The program has been previously described, including an initial evaluation providing key lessons learned.21 The basic SLIP component is a monthly case topic chosen by the resident and then reported using a template that includes case history, supporting diagnostic studies, differential diagnosis, final diagnosis with International Classification of Diseases, Ninth Revision coding, management options, treatment used, 3 lessons learned, further elaboration and discussion of one of the lessons, and 2 articles related to the lessons.21 This template supports and guides the learner through a self-directed learning plan for each case topic. Though the SLIP portfolio has been mandated and used within our surgical program since 2001, formal or summative assessment of residents' competence in PBL&I as demonstrated by SLIP entries has not been implemented.

According to ACGME guidelines, the resident engaging in PBL&I takes the following steps:4
- Monitoring one's practice
- Reflecting on one's practice to identify areas in need of improvement
- Engaging in learning activities
- Applying new knowledge into practice
- Monitoring the effect of new knowledge and practice on outcomes

The SLIP template provides a structure allowing the resident to demonstrate these ACGME-defined PBL&I skills, including identifying strengths and deficiencies in knowledge and skill; setting learning goals; and locating, appraising, and assimilating evidence from the literature related to patient problems. However, a previous attempt to assess SLIP entries using a rating tool developed before identifying the foundational elements associated with competent PBL&I performance proved too unreliable for formal assessment.22 The purpose of this study is to provide the field of surgical education with the foundational evidence needed to support development of a PBL&I assessment instrument, using expert faculty judgment of competency-based criteria, in the context of SLIP-formatted portfolio entries.
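Because the template just described structures every entry analyzed in this study, a schematic rendering may help. This is a minimal sketch with illustrative field names paraphrased from the published template; the SLIP program itself is a written template, not software.

```python
from dataclasses import dataclass, field

@dataclass
class SlipEntry:
    """Schematic of the monthly SLIP template (field names are illustrative)."""
    case_history: str
    diagnostic_studies: list[str]    # supporting diagnostic studies
    differential_diagnosis: list[str]
    final_diagnosis: str
    icd9_code: str                   # ICD-9 coding of the final diagnosis
    management_options: list[str]
    treatment_used: str
    lessons_learned: list[str]       # the template asks for 3 lessons
    lesson_discussion: str           # elaboration on one chosen lesson
    references: list[str] = field(default_factory=list)  # 2 related articles

    def follows_template(self) -> bool:
        """Rough check mirroring the 'follows and completes SLIP template' idea."""
        required_text = [self.case_history, self.final_diagnosis,
                         self.icd9_code, self.treatment_used,
                         self.lesson_discussion]
        return (all(required_text)
                and len(self.lessons_learned) >= 3
                and len(self.references) >= 2)
```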
METHOD OF INQUIRY

A qualitative approach, using expert informants, was used to identify foundational elements associated with the assessment of competency in PBL&I in the context of portfolio entries using the SLIP format. A total of 6 SLIP entries, written by residents within the general surgery residency at the Medical College of Wisconsin during the 2009 academic year, served as documents to be analyzed by 3 senior surgical educator faculty members in the department. The number of SLIP entries (6) chosen to be analyzed was based on the principal investigator's (T.W.) 5 years of experience reviewing SLIP entries,22,23 considering the amount of time taken to read and reflect on SLIP entries with the goal of achieving theme saturation. The 6 SLIP entries were chosen by the principal investigator and vetted by an independent study assistant (T.M.) for their perceived variability in content, length, and overall quality. The entries were chosen from a pool of 420 entries from the 2009 academic year that were deidentified except for the authoring resident's training year. Overall, 2 SLIP study documents were chosen from each of the postgraduate year 1 (PGY1), PGY3, and PGY5 years to provide potential variability based on resident experience over the continuum of the 5-year residency program. Following institutional review board study review/approval, 3 senior faculty members in the department of surgery, with extensive experience in resident education, local recognition as members of the Society of Teaching Scholars at the Medical College of Wisconsin, and national recognition as "Teacher of the Year" by the Association for Surgical Education, served as the expert faculty raters. Each of the faculty members has rich experience in assessment and has served on national committees related to education and competency assessment. The 3 reviewers had variable experience with the SLIP program and with reading individual SLIP entries. One faculty member is currently serving as the SLIP facilitator by reading each entry and providing feedback to the residents. The other 2 faculty members had previously viewed SLIP entries, but neither of them read them on a routine basis. This intentional
variability is consistent with qualitative methods practice, as it contributes to the richness of the rating sample. Each rater was asked to individually read the documents, rate the residents' skills in PBL&I based on the documents, and provide a "think-out-loud" rationale for the ratings. More specifically, each rater independently read the entire group of 6 documents and then went back through each one to provide their ratings and rationales. The raters' process of assessment began by giving each portfolio a global rating on a 5-point scale ranging from poor to outstanding, with no further behavioral anchors, to provide an initial framing of their opinions. Next, they provided rationales for their global ratings of each document, which were recorded and transcribed. The principal investigator then conducted an interview with each rater, using the rater's rating rationale as a basis for discussion, to further probe their thinking and rationales for their ratings. The interviews were conducted following a guideline for questioning, but were not tightly scripted, to allow for a more natural flow of conversation and thought. These interviews, like the rating rationales, were recorded and transcribed for qualitative analysis. Following the discussion of their individual rationales, the raters were convened to discuss their rationales and provide insight into these analyses within a group setting. Raters were asked to discuss the reasons for their SLIP ratings and to identify the critical aspects within the entries that demonstrated PBL&I competence. The group discussion was used to garner further data, as raters were likely to support and challenge each other's perspectives.24,25 The group discussion was recorded, transcribed, and then analyzed to identify themes through a process of qualitative analysis by the principal investigator and 2 other independent reviewers (T.M. and T.W.) with experience and training in qualitative methods. These blinded reviewers identified themes and subthemes using the constant comparative method associated with grounded theory.26,27 Each theme was coded, and after all discussions were analyzed, the reviewers met and conferred to compare themes. Consensus was developed through an iterative process of reanalysis of all comments and coding. Interpretations and conclusions were drawn from the group and individual qualitative analyses. Further verification of the trustworthiness of the themes was based on final intercoder consensus on themes, expert faculty rater member checking of the identified themes, and triangulation of the findings with the PBL&I literature. The data derived from the qualitative analysis are presented, with themes and examples, to depict the progression of competency in demonstrating PBL&I, using behavioral anchors to ground the descriptions.
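As a rough illustration of the bookkeeping behind this kind of coding, the sketch below tallies coded comments into themes and orders them by frequency, which is how the Table in the Results is organized. The coded fragments are invented placeholders, not study data.

```python
from collections import Counter

# Invented placeholder codes: (source transcript, theme applied by a coder).
coded_comments = [
    ("rater1_interview", "self-awareness regarding effect of knowledge"),
    ("rater2_interview", "self-awareness regarding effect of knowledge"),
    ("group_discussion", "identification and description of learning goals"),
    ("group_discussion", "case used as catalyst for reflection"),
    ("rater3_interview", "self-awareness regarding effect of knowledge"),
]

# Themes listed in order of decreasing frequency, as in the Table.
theme_counts = Counter(theme for _, theme in coded_comments)
for theme, count in theme_counts.most_common():
    print(f"{count:3d}  {theme}")
```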
RESULTS

The collaborative coding process resulted in theme saturation after the identification of 7 themes, with comments
categorized in each theme and mentioned by each of the raters: (1) case used as catalyst for reflection; (2) engagement in process—identifies as personally relevant; (3) identification and thorough description of learning goals; (4) self-awareness regarding effect of knowledge and decision making on outcomes; (5) reconceptualization with appropriate use and critique of cited literature; (6) description of future behavioral change; and (7) communication skills/completeness of entry template. The Table shows the themes and subthemes developed and agreed on by the qualitative analysis reviewers, listed in order of decreasing frequency. An explanatory model was created from the identified themes and subthemes to illustrate the chronology of
residents' self-reflection when completing the SLIP entry. Following is a description of each of the themes using this process model construct (Fig.).
TABLE. Theme Coding: Themes and Subthemes Listed in Order of Decreasing Frequency

Self-awareness regarding effect of knowledge and decision making on outcomes
- Demonstrates critical thinking by asking questions and questioning previous assumptions and beliefs, considering alternative options
- Demonstrates self-reflection—critiques one's own actions and decision making
- Shows insight by acknowledging deficits in knowledge or acknowledging veracity or falsehoods of previously held beliefs
- Demonstrates willingness to learn from case by comparing previous experience or assumptions to newly gained knowledge
- Reflects on management of case by comparing what was done and what could or should have been done
- Reflects on problem or learning point in context of experience and literature by describing what others have written or said about the topic

Identification and thorough description of learning goals
- Identifies learning topic and key points in a clear, straightforward manner
- Describes learning objectives that are of the appropriate complexity and context, given the format of the SLIP entry and case described
- Describes lessons within discussion at appropriate depth—discussion includes evidence of knowledge and recites evidence or experience commensurate with complexity of decision making
- Describes management options based on new knowledge gained through research and reflection on learning topic
- Formulates learning objectives—learning objectives are present
- Describes new knowledge—explains what knowledge was gained through research on the learning topic

Case used as catalyst for reflection
- Describes learning, changes in attitude, or approach by clearly stating what was garnered from the case and subsequent reexamination
- Describes competencies addressed by reflecting on case
- Refers to and places discussion in context of current experience
- Links learning objectives to case
- Links discussion of learning points to case by referring to case experience in relation to newly gained knowledge

Reconceptualization with appropriate use and critique of cited literature
- References critiqued and analyzed in relation to case presentation
- References cited appropriately within learning topic discussion
- Reference quality and depth as demonstrated by up-to-date publication, study design, quality of journal, or complexity of review
- Reference content relevant to case and learning points identified
- At least 2 references identified—more references given the complexity or controversial nature of the learning topic

Communication skills/completeness of entry template
- Correct spelling and grammar
- Easy to read—narrative flow is easy to follow
- Complete and concise case description—pertinent findings were included and nonpertinent facts were excluded
- Follows and completes SLIP template

Description of future behavioral change
- Describes the effect of learning experience on future practice
- Incorporates data from references into future practice and management decisions

Engagement in process—identifies as personally relevant
- Evidence of effort—shortcuts taken: simple explanation without elaboration, sloppy grammar, inappropriate use of abbreviations, or simple spelling errors
- Appears to have been cut and pasted or copied word for word from textbook or journal (too detailed or expansive discussion)
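One way such themes and subthemes could eventually seed a behaviorally anchored rating tool is sketched below. The anchors are hypothetical adaptations of the subthemes for one theme; they are not an instrument the authors have built or validated.

```python
from dataclasses import dataclass

@dataclass
class RubricItem:
    """A single theme with behavioral anchors keyed to a 5-point scale."""
    theme: str
    anchors: dict[int, str]  # score -> behavior expected at that level

# Hypothetical anchors adapted from the 'reconceptualization' subthemes.
reconceptualization = RubricItem(
    theme="Reconceptualization with use and critique of cited literature",
    anchors={
        1: "No references, or references unrelated to the case",
        3: "At least 2 relevant, current references cited in the discussion",
        5: "References critiqued and analyzed in relation to the case",
    },
)

def describe(item: RubricItem, score: int) -> str:
    """Report the behavioral anchor nearest the awarded score."""
    nearest = min(item.anchors, key=lambda k: abs(k - score))
    return f"{item.theme} [{score}/5]: {item.anchors[nearest]}"

print(describe(reconceptualization, 4))
```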
FIGURE. Model of resident reflective process using the SLIP.
Case Used as Catalyst for Reflection

The first step in self-reflection requires an event or interaction on which to reflect. The SLIP assignment asks the learner to use a clinical case as the nidus for reflection. The raters' comments reveal a desire to see a clear linkage between the case described and the subsequent topic described by the learner. They felt that the foundation of the SLIP entry should be the case as experienced by the learner. The learner should then indicate, through their SLIP entry, what was learned from the case and, on reflection, what still needs to be learned. The learning objectives formulated should clearly relate back to the case described. Therefore, the experience of the learner, as demonstrated by the case description, becomes the starting point and catalyst for further investigation, contemplation, and learning.

Engagement in Process—Identifies as Personally Relevant

As the process of self-reflection begins and the creation of a SLIP entry ensues, reviewers felt that buy-in and engagement in the task are necessary for a high-quality product. This evidence of engagement was difficult for the raters to define, but it was deemed important nevertheless. The raters described lower-quality entries as those that "smack of cut and paste." The presence of "recanted data from a textbook" seemed to define those entries in which the authors had not truly engaged in the learning process but simply recited or copied a textbook or article. Overlapping with the communication skills theme, evidence of lack of engagement was also thought to be identifiable when shortcuts were taken,
including the presence of a difficult-to-follow narrative or poor use of spelling and grammar.

Identification and Thorough Description of Learning Goals

The next step in the creation of a SLIP entry is the synthesis of ideas regarding the learning goals. The raters felt that the learner should clearly identify and reflect on the learning goals. Learning goals should be "clear, thoroughly described, and appropriate for the case." The depth of discussion related to the learning goal was described as the most important criterion for this theme. Raters felt that the learning goals should be discussed in a sophisticated and complete manner with clear conclusions drawn in the discussion of the topic.

Self-awareness Regarding the Effect of Knowledge and Decision Making on Outcomes

Resident self-awareness of the effect of their clinical case management and how their knowledge may have positively
altered plans and patient outcomes was the most frequently cited theme identified in our analysis. The comments categorized in this theme focus on the concept that the resident has identified a clinical problem and an associated knowledge deficit. Moreover, the resident has attempted to analyze the problem and knowledge deficit through reflection and identification of new information and data in the literature using the SLIP writing process. Clear evidence of reflection, insight into the case, and, most importantly, the knowledge gained from the experience were felt to demonstrate a high-quality entry. The raters frequently described "critical thinking" or "shows insight," related to the case, as important indicators of mastery of the PBL&I process. Reflection was described as "critiquing actions and decision making."
Reconceptualization with Appropriate Use and Critique of Cited References

Many comments by raters reflected the importance of reexamining the clinical experience by reviewing, analyzing, and critiquing pertinent literature. The identification of relevant and high-quality studies related to the learning goals was viewed as an important indicator of PBL&I. The raters believed that the learner should ideally demonstrate their ability to recognize whether data in the literature are applicable to their own cases and situations. Reconceptualizing the learner's experience by using the literature as a guide to develop a new understanding was a frequent comment by the raters. Furthermore, the raters commented that a "thorough critique" of the scientific study methods, results, and conclusions was also necessary to demonstrate evidence of PBL&I. The number, quality, and content relevance of the literature cited were described as important indicators of PBL&I.

Description of Future Behavioral Change

Another frequent SLIP characteristic felt to be important by the raters was that the learner described the specific effect of learning from the case on future practice and decisions. The raters felt that for a SLIP to demonstrate PBL&I, the learner should put the learning experience within the context of their practice and ultimately draw conclusions about how the learning experience would "impact their future activity" and decisions when confronted with similar situations.

Communication Skills/Completeness of Entry Template

Appropriate communication skills, including proper grammar, spelling, and narrative, were identified as an important criterion for assessment of the SLIP entry. The raters felt that a concise, well-written SLIP demonstrates not only a high level of mastery of "proper English" but also a higher level of understanding and comfort with the subject matter. Simple grammar and spelling errors were cited most frequently as a subtheme, followed closely by narrative flow and ease of reading. The ability of the resident to write a pertinent case description in a manner that tells a story or "paints a picture" was also identified as an important communication skill. Finally, within this theme, all the raters identified the ability to follow and complete the template as a significant factor in the identification of an excellent SLIP.

Member checking was performed to evaluate the trustworthiness of the qualitative analysis. The raters reviewed the themes and all felt that the themes accurately reflected their interviews and attitudes regarding their own analyses of the SLIP entries.

DISCUSSION

In this study, we identified, based on raters' reviews of SLIP entries, 7 distinct themes for criteria that our expert raters felt to be important indicators of the quality of a SLIP entry and the associated ability of a learner to demonstrate PBL&I within the context of a learning portfolio. The results of this study can be analyzed within the context of the education literature to provide further evidence of the findings' trustworthiness and this study's contributions to the literature. Triangulation with previous studies and learning theory confirms the trustworthiness of our study results. The themes identified in this study are consistent with theories of reflective practice and experiential learning. Experiential learning theories, represented by Dewey,28 Lewin, and Piaget, have described learning as being most effective when it is closely associated with specific experiences and takes place within a context relevant to the learner.29,30 Kolb's31 experiential learning theory is cyclical, and the following 4 stages are congruent with the chronological sequence of learner activity formulated in this study: the learner must describe a relevant case (concrete experience), use the case as a catalyst (reflective observation) for reconceptualization using available literature (abstract conceptualization), and then describe the effect on future behavior and actions (active experimentation). Furthermore, Dewey28 described the first step of the learning process as identifying a problematic or troublesome experience, corresponding to our theme of identification of a problem. Clearly, our experts felt that identification of a problem, and deliberately stating the purpose of the entry, was a key component of competency, as they often commented on the importance of self-awareness regarding the effect of actions and identification and thorough description of the learning topic. One of the struggles educators face when trying to analyze learners' ability to perform PBL&I is identification of
observable actions and behaviors as a basis for assessment. Reflective learning has been described by Eva and Regehr32 as indicating a "conscious and deliberate reinvestment of mental energy aimed at exploring and elaborating one's understanding of the problem one has faced (or is facing) rather than aimed simply at trying to solve it." Understanding is not an observable action, making it impossible to directly assess one's ability to perform reflective learning within this definition. However, our raters identified observable behaviors consistent with reflective practice, such as asking questions about what was done and then reexamining the situation, using data to support or refute what was done. This type of analysis of one's practice and outcomes is also in line with the "5 A's" of practicing evidence-based medicine: Assess, Ask, Acquire, Appraise, and Apply.33 Finally, these same principles are becoming important as certifying bodies institute maintenance of certification requirements for participation in lifelong learning and self-assessment.34

There are a number of limitations to the general applicability of the study. First, the study elicited and analyzed the views of a small number of well-respected surgical educators regarding a specific type of learning portfolio. The learning portfolio entries were derived from a single academic institution and from a single group of surgical residents. There is potential for a lack of diversity in the SLIP entries based on the homogeneity of their authors. Finally, there may be unintended biases introduced by the principal investigator when he chose the SLIP entries to be reviewed by the raters. Despite these limitations, this study provides a unique foundation of expert opinion on which an assessment tool can be developed. Rigorously identifying the characteristics of varying levels of competence in PBL&I demonstrated in the SLIP entries makes it possible to create an appropriately grounded assessment tool. Future work will include creating an assessment tool and evaluating its validity and reliability.
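As a compact restatement of the alignment drawn above between Kolb's cycle and the identified themes, the mapping below pairs each stage with its parallel theme. This is an illustrative summary, not part of the original analysis.

```python
# Kolb's four stages paired with the SLIP theme that parallels each one.
KOLB_TO_SLIP_THEME = {
    "concrete experience":        "case used as catalyst for reflection",
    "reflective observation":     "self-awareness regarding effect of knowledge and decisions",
    "abstract conceptualization": "reconceptualization with critique of cited literature",
    "active experimentation":     "description of future behavioral change",
}

for stage, theme in KOLB_TO_SLIP_THEME.items():
    print(f"{stage:>27} -> {theme}")
```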
REFERENCES

1. Accreditation Council for Graduate Medical Education. ACGME outcome project. Available at: 〈http://www.acgme.org/outcome/instrmod/instrMod_port_4.asp〉; Accessed December 1, 2008.
2. Nasca TJ, Philibert I, Brigham T, Flynn T. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051-1056.
3. Accreditation Council for Graduate Medical Education. ACGME program director guide to the common program requirements. Available at: 〈http://www.acgme.org/acWebsite/navPages/commonpr_documents/IVA5c_EducationalProgram_ACGMECompetencies_PBLI_Explanation.pdf〉; Accessed March 25, 2012.
4. Lynch DC, Swing SR, Horowitz SD, Holt K, Messer JV. Assessing practice-based learning and improvement. Teach Learn Med. 2004;16(1):85-92.
5. Fragneto RY, Dilorenzo AN, Schell RM, Bowe EA. Evaluating practice-based learning and improvement: efforts to improve acceptance of portfolios. J Grad Med Educ. 2010;2(4):638-643.
6. Anderson C. ACGME outcome project RSVP: implementing a reflective portfolio to improve self evaluation and to address the competencies. Available at: 〈http://www.acgme.org/outcome/implement/rsvpTemplate.asp?rsvpID=59〉; Accessed December 1, 2008.
7. Rees CE, Sheard CE. The reliability of assessment criteria for undergraduate medical students' communication skills portfolios: the Nottingham experience. Med Educ. 2004;38(2):138-144.
8. Carraccio C, Englander R. Evaluating competence using a portfolio: a literature review and web-based application to the ACGME competencies. Teach Learn Med. 2004;16(4):381-387.
9. Fung MFKF, Walker M, Fung KFK, Temple L, Lahoie F, Bellemare G, et al. An Internet-based learning portfolio in resident education: the KOALA™ multicentre programme. Med Educ. 2000;34(6):474-479.
10. Rosenberg ME, Watson K, Paul J, Miller W, Harris I, Valdivia TD. Development and implementation of a web-based evaluation system for an internal medicine residency program. Acad Med. 2001;76(1):92-95.
11. Snadden D, Thomas M. The use of portfolio learning in medical education. Med Teach. 1998;20(3):192-200.
12. Parboosingh J. Learning portfolios: potential to assist health professionals with self-directed learning. J Contin Educ Health Prof. 1996;16(2):75-81.
13. Tochel C, Haig A, Hesketh A, et al. The effectiveness of portfolios for post-graduate assessment and education: BEME guide no 12. Med Teach. 2009;31(4):299-318.
14. Green ML, Aagaard EM, et al. Charting the road to competence: developmental milestones for internal medicine residency training. J Grad Med Educ. 2009;1:5-20.
15. Hicks PJ, Schumacher DJ, Benson BJ, Burke AE, Englander R, Guralnick S, et al. The pediatric milestones: conceptual framework, guiding principles, and approach to development. J Grad Med Educ. 2010;2(3):410-418.
16. Shumway JM, Harden RM. AMEE guide no. 25: the assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003;25(6):569-584.
17. Pitts J, Coles C, Thomas P. Educational portfolios in the assessment of general practice trainers: reliability of assessors. Med Teach. 1999;33(7):515-520.
18. Pitts J, Coles C, Thomas P. Enhancing reliability in the portfolio assessment: "shaping" the portfolio. Med Teach. 2001;23(4):351-355.
19. O'Sullivan PS, Reckase MD, McClain T, Savidge MA, Clardy JA. Demonstration of portfolios to assess competency of residents. Adv Health Sci Educ Theory Pract. 2004;9(4):309-323.
20. Melville C, Rees M, Brookfield D, Anderson J. Portfolios for assessment of pediatric specialist registrars. Med Educ. 2004;38(10):1117-1125.
21. Webb TP, Aprahamian C, Weigelt JA, Brasel KJ. The surgical learning and instructional portfolio (SLIP) as a self-assessment educational tool demonstrating practice-based learning. Curr Surg. 2006;63(6):444-447.
22. Webb TP, Merkley TR. An evaluation of the success of a surgical resident learning portfolio. J Surg Educ. 2012;69(1):1-7.
23. Webb TP, Merkley TR. The learning portfolio: what residents are learning. J Grad Med Educ. 2011;3(1):104-108.
24. Stringer E. Action Research in Education. Upper Saddle River, NJ: Pearson Education; 2004.
25. Farquhar C. Are focus groups suitable for 'sensitive' topics? In: Barbour R, Kitzinger J, eds. Developing Focus Group Research: Politics, Theory, and Practice. London: Sage Publications; 1999. p. 47-63.
26. Glaser BG, Strauss AL. The Discovery of Grounded Theory. New York, NY: Aldine; 1999.
27. Harris I. What does "The discovery of grounded theory" have to say to medical education? Adv Health Sci Educ Theory Pract. 2003;8(1):49-61.
28. Dewey J. How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process. Boston, MA: D.C. Heath; 1933.
29. Osterman KF, Kottkamp RB. Reflective Practice for Educators: Improving Schooling Through Professional Development. Newbury Park, CA: Corwin Press; 1993.
30. Schon DA. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions. San Francisco, CA: Jossey-Bass; 1987.
31. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall; 1984.
32. Eva KW, Regehr G. "I'll never play professional football" and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28(1):14-19.
33. Sackett DL, Richardson WS, Rosenberg WMC, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. London, UK: Churchill Livingstone; 1997.
34. Levinson W, Holmboe E. Maintenance of certification: 20 years later. Am J Med. 2011;124(2):180-185.