
2004 APDS SPRING MEETING: PART 3

Assessing the Competencies in General Surgery Residency Training

Cheryl I. Anderson, RN, Amy B. Jentz, MD, L. Rao Kareti, MD, James M. Harkema, MD, Keith N. Apelgren, MD, and Carol A. Slomski, MD

Department of Surgery, Michigan State University, East Lansing, Michigan

KEY WORDS: Surgery, residents, competencies, observations

Correspondence: Inquiries to Cheryl I. Anderson, RN, Department of Surgery, Michigan State University, 1200 East Michigan Avenue, Lansing, MI 48912; fax: (517) 267-2488; e-mail: [email protected]

BACKGROUND

The ACGME endorsed 6 competencies for residency training programs in 1999: Patient Care, Medical Knowledge, Professionalism, Interpersonal and Communication Skills, Practice-Based Learning and Improvement, and Systems-Based Practice.1 The Michigan State University Integrated Residency Program in General Surgery implemented assessment of the competencies in September 2002. A department competency committee was established that included several senior faculty, residents, and quality improvement staff. Members were asked to (1) review the ACGME requirements, (2) assess current department evaluation methods, and (3) develop a plan to assess the competencies and develop a curriculum. Familiarity with the competencies grew with time. Resident evaluations were primarily conducted through an Internet-based program that assessed some components of the competencies; however, emphasis was placed on end-of-rotation summative reviews.

Initial committee efforts focused on the practice-based learning (PBL) and systems-based practice (SBP) competencies.2-5 As one component of PBL, a resident must "apply knowledge of study designs and statistical methods to the appraisal of clinical studies and other information of diagnostic and therapeutic effectiveness."1 To address this objective, an evaluation form was developed to assist in critically evaluating Journal Club articles. Participants score each article in 9 fields (eg, statement of hypothesis, design, statistical analysis), and differences among reviewer scores are discussed. A second PBL objective requires the resident to "locate, appraise, and assimilate evidence from scientific studies related to their patients' health problems."1 This objective is addressed by (1) requiring presenters at the Morbidity and Mortality conference to provide evidence-based data pertaining to the cases being discussed and (2) having faculty assign personal learning projects to residents when knowledge deficits are identified.


Residents are required to seek evidence from the literature or textbooks to support their answers, and the results are placed in individual portfolios.

The SBP competency requires residents to "demonstrate an awareness of and responsiveness to the larger context and system of health care and the ability to effectively call on system resources to provide care that is of optimal value."1 Initial efforts focused on identifying health systems and health-related professions that interface with resident patient care activities. For example, attorneys presented information on the importance of accurate and timely medical record documentation and the legal and economic implications when this does not occur. Surgical billing representatives discussed Current Procedural Terminology (CPT) and Evaluation and Management codes and how improper coding can affect the financial health of an organization. Pretests and posttests were administered to assess learning.

The SBP competency also requires that residents assist patients in dealing with system complexities and partner with health care managers. Residents on trauma rotations learn about system problems by leading a multidisciplinary trauma conference three times per week. Emphasis is placed on coordinating the physical, financial, emotional, and spiritual care of patients and their families, and plans of care are developed that emphasize quality and value. Research projects are being initiated to investigate the cost-effectiveness and clinical benefits of current practice patterns; for example, the efficacy of CT in the emergency department for patients exhibiting signs of appendicitis is being studied. Residents have assisted in the development of a comprehensive discharge planning process that should eliminate patient telephone calls to the office. Rotations with hospice care are being planned.

Next, committee members discussed assessment of the patient care (PC) and medical knowledge (MK) competencies. The PC competency requires that residents "must be able to provide patient care that is compassionate, appropriate, and effective for the treatment of health problems and the promotion of health."1

The MK competency states that residents "must demonstrate knowledge about established and evolving biomedical, clinical, and cognate (eg, epidemiological and social-behavioral) sciences and the application of this knowledge to patient care."1 Both competencies address a resident's knowledge and technical skills related to their specialty.

A literature review was conducted, and the efforts of other surgery training programs were reviewed.7-10 The ACGME "Suggested Best Methods for Evaluation" were studied.1 Test scores on the USMLE Step III, the ABSITE examination, and basic science and textbook reviews served as a baseline. Learning needs identified during mock orals and end-of-rotation summative reviews were also considered.

Following several months of discussion, a plan was formulated. The committee's objectives were 2-fold. First, members wanted to develop an assessment method that would provide more frequent and precise evaluation at the time a resident task or skill is observed. Second, members wanted to provide appropriate faculty development opportunities to transition competency assessment into the curriculum. The process described below details these efforts; the data reflect the first year of implementation.

TABLE 2. Comparison of Appendicitis Pretest Results as PGY I Residents With Retest (of Same Residents) as PGY IIs, One Year Later. For each of the 4 residents (A through D), reviewers scored 13 learning areas (pathophysiology, epidemiology, early presentation, late presentation, differential diagnosis, preoperative evaluation, early nonoperative management, late nonoperative management, perforated operative management, nonperforated operative management, postoperative management, perforated appendix complications, and nonperforated appendix complications) at the initial test (December 2002) and the retest (February 2004); individual totals, the overall percent of written knowledge, and areas scored "-1" are reported.

METHODS

Initial efforts were focused on junior residents. The committee reviewed the operative logs of first- and second-year residents and found that appendectomy, breast care procedures, cholecystectomy, and nonoperative trauma management were performed most frequently. A baseline test was developed and administered: PGY I residents (n = 4; approximately 5 months into their year; December 2002) were given a written medical knowledge test for each of the 4 areas listed above. The narrative responses were blinded, and 3 or 4 faculty/chief residents were asked to rate the interns in each section. In one example, the 13 categories assessed for appendectomy patient care included:

• Pathophysiology
• Epidemiology
• Relate Early Presentation to Pathophysiology
• Relate Late Presentation to Pathophysiology
• Differential Diagnosis
• Preoperative Evaluation
• Early Nonoperative Management
• Late Nonoperative Management
• Nonperforated Operative Management
• Perforated Operative Management
• Postoperative Management
• Complications of Nonperforated Appendix/Treatment Options
• Complications of Perforated Appendix/Treatment Options

Reviewers recorded a "0" if the intern wrote at the expected entrance level, a "+1" if above the expected level, and a "-1" if below expectations. The same test was taken by the same residents the next year (February 2004).
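As a minimal sketch of this rating scheme (the function and variable names below are illustrative and not taken from any departmental tooling), the overall written-knowledge percentage for a resident can be computed as the share of the 13 learning areas rated 0 or +1:

# Illustrative sketch (not the department's actual software): tally -1/0/+1
# reviewer ratings for the 13 appendectomy learning areas into the overall
# percentage of areas scored acceptable (0) or above expected (+1).

LEARNING_AREAS = [
    "Pathophysiology", "Epidemiology", "Early presentation", "Late presentation",
    "Differential diagnosis", "Preoperative evaluation",
    "Early nonoperative management", "Late nonoperative management",
    "Nonperforated operative management", "Perforated operative management",
    "Postoperative management", "Nonperforated appendix complications",
    "Perforated appendix complications",
]

def overall_knowledge_percent(ratings):
    """ratings: dict mapping each learning area to -1, 0, or +1."""
    acceptable = sum(1 for area in LEARNING_AREAS if ratings[area] >= 0)
    return 100.0 * acceptable / len(LEARNING_AREAS)

# Example: an intern rated below expectations in one area scores 12/13 = 92.3%.
ratings = {area: 0 for area in LEARNING_AREAS}
ratings["Differential diagnosis"] = -1
print(round(overall_knowledge_percent(ratings), 1))  # 92.3

Group-level figures, such as the baseline percentages reported in the Results, are averages of these per-resident percentages across the group tested.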

As outlined in the objectives, faculty agreed that feedback should be given at the "Point of Observation." An evaluation form was developed for each diagnosis and arranged into 6 common categories: (1) knowledge of the disease process (anatomy, pathophysiology, etc), (2) history-taking skills, (3) physical examination, (4) preoperative preparation/decision making, (5) operative skills/intraoperative decision making (dexterity, knot tying, suturing, wound closure, etc), and (6) postoperative or posttrauma care. (See the example Acute Appendicitis "Point of Observation" assessment form in Table 1.)

TABLE 1. "Point of Observation" Assessment for Acute Appendicitis

A Likert scale ranging from 0% to 100%, in increments of 10, was selected as the method of measurement. First-year residents were expected to perform at a novice or advanced-beginner level (0% to 30%) at the start of their training, with progress demonstrated throughout the year.6 Senior residents would be expected to achieve scores of 90% to 100% in each category, although they were not evaluated in this study.

Blank assessment forms were distributed to the operating room, the emergency department, and surgeon offices. The process was enhanced by an aggressive campaign to distribute forms to individual residents and faculty as cases were scheduled. Faculty were asked to complete any or all of the criteria on the forms. Evaluators circled the percentage that best reflected the resident's knowledge or skill level being observed, and responses were entered into a central database.

RESULTS

The 2002 PGY I resident scores compared with retest scores (of the same residents) as PGY IIs are shown in Fig. 1. The percentages shown are averages for the group tested and reflect overall scores at the acceptable ("0") or above expected ("+1") knowledge levels. On initial testing, residents demonstrated knowledge in 65.4% of the fields tested for appendicitis, 67.1% for breast care, 70.6% for cholecystitis, and 78.6% for nonoperative trauma. On repeat testing in February 2004, residents demonstrated improvement in knowledge in each area except breast care.

FIGURE 1. Comparison of 2002/2003 PGY I residents' written test results with retest (of the same residents) as PGY IIs. Percentages reflect acceptable (0) or above expected (+1) knowledge levels as determined by reviewers.

Table 2 further demonstrates the analysis of these results, using the appendicitis criteria as an example. Initial testing (December 2002) showed that most deficiencies in resident knowledge occurred when considering a differential diagnosis, epidemiology, and postoperative management. Individually, 3 of the 4 residents showed improvement when retested (February 2004); one resident's scores remained the same. Overall scores improved by 26.9% when retested the next year, with improvements demonstrated in each of the initially deficient areas. However, different learning needs were identified in the area of preoperative evaluation.

The 2003 PGY I resident scores on the same test, taken in June 2004, are found in Fig. 2. Average scores in this group of residents were significantly lower than those of the previously tested group. The breast care knowledge score was 12.8%, with deficiencies found in every category on the test. Appendicitis knowledge scores were also low at 29.7%, followed by nonoperative trauma (48.1%) and cholecystitis (62.7%).

FIGURE 2. 2003/2004 PGY I residents' written pretest results. Percentages reflect acceptable or above expected knowledge levels as determined by reviewers.

"Point of Observation" assessments gradually increased over the first 6 months (July to December 2003). Appendicitis assessments received during the first and second quarters (n = 12, n = 8) are compared in Fig. 3. Overall group scores showed improvement in 4 areas: knowledge, history taking, preoperative preparation, and intraoperative skills/decision making. Physical examination scores declined, primarily when residents considered a differential diagnosis. Insufficient numbers were received for comparison of postoperative scores.

FIGURE 3. Appendicitis "Point of Observation" assessment results, comparing the first and second quarters.

Four breast observation assessments were received during the first quarter (July to September 2003). Improvements between the second quarter and the first 6 weeks of the third quarter (n = 21, n = 14) are shown in Fig. 4. Each section showed improvement, ranging from 7% to 30%. Preoperative preparation improved by 29.2%, and operative skills/intraoperative decision making improved by 7.7%. Physical examination scores were low at 40% and at 59.2% when retested. Data were insufficient for cholecystectomy and nonoperative trauma to make comparisons.

FIGURE 4. Breast care "Point of Observation" assessment results, comparing the second quarter to the first 6 weeks of the third quarter.

Although PGY I and II residents were evaluated most frequently, assessments were received from all PGY levels. The written assessment forms proved to be cumbersome. An aggressive campaign to provide the appropriate form at the time of the case resulted in 100% returns in November 2003. However, December assessments were returned at a rate of 57.5%, January 2004 at 27.1%, and February at 25.5%. Despite constant reminders and the wide distribution of forms, the return rate declined.
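The quarter-over-quarter comparisons behind Figs. 3 and 4 amount to averaging the 0% to 100% Likert ratings by category within each quarter and taking the differences. A minimal sketch follows; the record layout and names are hypothetical and are not drawn from the department's central database:

# Hypothetical sketch: average "Point of Observation" Likert ratings
# (0-100, in increments of 10) by category within a quarter, then compare
# two quarters. Categories left blank on a form are simply omitted, and a
# category missing from either quarter is skipped, mirroring the
# "insufficient numbers" caveat above.

from collections import defaultdict
from statistics import mean

CATEGORIES = [
    "knowledge", "history taking", "physical examination",
    "preoperative preparation", "operative skills", "postoperative care",
]

def average_by_category(forms):
    """forms: list of dicts mapping category -> rating for one quarter."""
    buckets = defaultdict(list)
    for form in forms:
        for category, rating in form.items():
            buckets[category].append(rating)
    return {c: mean(buckets[c]) for c in CATEGORIES if buckets[c]}

def quarterly_change(q1_forms, q2_forms):
    q1, q2 = average_by_category(q1_forms), average_by_category(q2_forms)
    return {c: q2[c] - q1[c] for c in CATEGORIES if c in q1 and c in q2}

# Example with made-up ratings for two quarters:
first = [{"knowledge": 30, "history taking": 40}, {"knowledge": 20}]
second = [{"knowledge": 50, "history taking": 60}]
print(quarterly_change(first, second))  # {'knowledge': 25, 'history taking': 20}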

DISCUSSION

Resident competency assessment has been gradual but continuous. A cultural change is being experienced as faculty shift from summative to formative evaluation methods. It is probably too early to see clear benefits, although the improvement in assessment scores is encouraging.

The baseline assessment pretests were most beneficial in identifying programmatic priorities. Textbook reviews, grand round presentations, and journal club articles were selected to address identified learning needs. Individual learning needs varied, with the 2002 residents scoring higher in each diagnosis than the 2003 residents. Testing in December may have contributed to the results. However, it was evident that both groups tested required additional training in breast care. As a result, junior residents were assigned to evaluate patients with breast problems presenting in the office setting. Under close faculty supervision, residents conducted the initial evaluation and discussed diagnostic and treatment plans. Chiefs were apprised of junior resident learning needs, and operative assignments were made accordingly. Retest of this group will occur in June 2004. A more refined assessment tool is being considered for the 2004 interns.

The "Point of Observation" assessments provided early recognition of individual learning needs. Residents with difficulty tying knots practiced on suture boards. Residents with difficulty reading mammograms were assessed regularly. Treatment options for breast cancer patients are being discussed with residents, and a breast care curriculum is being developed. As the faculty and residents become more familiar with the forms, the residents anticipate and prepare, and personal learning projects become specific to the resident's experiences. Implementation of generic assessment forms (1 for patient encounters and 1 for procedures) is also being considered; this would expand opportunities to evaluate all residents on a regular basis.

Both faculty and residents support assessments occurring at the point of observation. However, delivery of the appropriate form and timely completion by faculty are problematic. Use of handheld devices may enhance data collection, and efforts to build an interface between the personal digital assistant and a central database are under way.

Conducting monthly meetings with faculty and residents to discuss the competency activities enhanced faculty development. It was during these sessions that faculty presented their ideas and perceptions of the committee's efforts. Feedback resulted in modifications to the process, and as the suggested changes were made, faculty participation increased.

CONCLUSIONS

Implementation of competency assessment in surgery training is a gradual but necessary process. Acceptance will require a cultural change, and compliance will be demonstrated in small increments. Formative evaluations introduced in this department have guided curriculum changes and have identified individual learning needs. Feedback at the "point of observation" has been regarded as valuable, and changing from a paper format to an electronic one may enhance the process. Improvements in resident knowledge and skill level are being documented. Finally, preliminary efforts have improved our assessment of junior residents and have assisted in identifying knowledge deficits that must be addressed in the curriculum and through improved individualized training.

REFERENCES

1. ACGME Outcome Project. Available at: http://www.acgme.org. Accessed September 24, 2002.

2. Centre for Health Evidence. Evidence-based medicine: a new approach to teaching the practice of medicine based on users' guides to evidence-based medicine. JAMA [serial online]. 1992;268:2420-2425. Available at: http://www.cche.net/usersguides/ebm.asp. Accessed January 31, 2003.

3. Illinois Masonic Medical Center Surgical Critical Care Residency Program Fellow Handbook. Chicago, IL; 1996.

4. Rating the Strength of Scientific Research Findings. Available at: http://www.ahrq.gov/clinic/epcsums/strenfact.htm. Accessed October 2, 2002.

5. Dunnington GL, Williams RG. Addressing the new competencies for residents' surgical training. Acad Med. 2003;78:14-21.

6. Dreyfus SE, Dreyfus HL. A Five-Stage Model of the Mental Activities Involved in Directed Skill Acquisition. Berkeley, CA: U.S. Department of Commerce; 1980.

7. Faulkner H, Regehr G, Martin J, Resnick R. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med. 1996;71:1363-1365.

8. Holmboe E. Faculty and the observation of trainees' clinical skills: problems and opportunities. Acad Med. 2004;79:16-22.

9. Brennan BG, Norman GR. Use of encounter cards for evaluation of residents in obstetrics. Acad Med. 1997;72:43-44.

10. The Royal College of Physicians and Surgeons of Canada's Canadian Medical Education Directions for Specialists 2000 Project. September 1996.
