Standardized Direct Observation Assessment Tool: Using a Training Video




The Journal of Emergency Medicine, Vol. -, No. -, pp. 1–8, 2016. © 2016 Elsevier Inc. All rights reserved. 0736-4679/$ - see front matter

http://dx.doi.org/10.1016/j.jemermed.2016.12.002

Education

STANDARDIZED DIRECT OBSERVATION ASSESSMENT TOOL: USING A TRAINING VIDEO

Kathleen E. Kane, MD, Kevin R. Weaver, DO, Gavin C. Barr Jr., MD, Gary Bonfante, DO, Nicole L. Bendock, DO, Brian M. Berry, DO, Stephanie L. Goren-Garcia, DO, Marc B. Lewbart, DO, Allison L. Raines, DO, Gregory Smeriglio Jr., DO, and Bryan G. Kane, MD

Department of Emergency Medicine, Lehigh Valley Hospital and Health Network/USF MCOM, Allentown, Pennsylvania

Reprint Address: Kathleen E. Kane, MD, Lehigh Valley Hospital and Health Network, 5th Floor, Emergency Medicine Residency Suite, 2545 Schoenersville Road, Bethlehem, PA 18107

Abstract—Background: We developed a DVD training tool to educate physicians evaluating emergency residents on accurate Standardized Direct Observation Assessment Tool (SDOT) application. Objective: Our goal was to assess whether this training video improved attendings' and senior residents' SDOT use. Methods: Participants voluntarily completed SDOT evaluations based on a scripted "test" video. A DVD with "positive" and "negative" scenarios of proper SDOT use was then viewed; it included education on appropriate recording of 26 behaviors. The test scenario was viewed again and follow-up SDOTs were submitted. Performances by attendings and residents on the pre- and post-test SDOTs were compared. Results: Twenty-six attendings and 26 senior residents participated. Prior SDOT experience was noted for 8 attendings and 11 residents. For the 20 scorable anchors, participants recorded observed behaviors with a statistically significant difference on one anchor each of the pretest (no. 20; p = 0.034) and post-test (no. 14; p = 0.041) SDOTs. On global competency assessments, pretest medical knowledge (p = 0.016) differed significantly between groups. The training intervention changed one anchor (no. 5; p = 0.035) and one global assessment (systems-based practice; p = 0.031) more negatively for residents. Attendings recorded SDOTs with exact agreement 48.73% of the time pretest and 54.41% post-test; resident scores were 45.86% and 49.55%, respectively. DVD exposure slightly raised attending scores (p = 0.289) and significantly lowered resident scores (p = 0.046). Conclusions: Exposure to an independently developed SDOT training video tended to raise attending scores, though without significance, while significantly lowering senior resident scores. Emergency attendings' and senior residents' SDOT scoring rarely differed with significance; about half of anchor behaviors were recorded with exact agreement. This suggests senior residents, with appropriate education, may participate in SDOT assessment. © 2016 Elsevier Inc. All rights reserved.

Keywords—SDOT; training video

The study was reviewed and approved as exempt by our network's Institutional Review Board before any study procedures took place. An abstract of this report was presented in poster format at the Council of Emergency Medicine Residency Directors Academic Assembly in 2012 in Atlanta, GA. The project was presented in an oral format at the 2012 Pennsylvania College of Emergency Physicians Scientific Assembly in Gettysburg, PA, where it received a Spivey Research Award.

RECEIVED: 3 December 2015; FINAL SUBMISSION RECEIVED: 27 July 2016; ACCEPTED: 2 December 2016

INTRODUCTION

The Accreditation Council for Graduate Medical Education (ACGME) requires performance evaluation of all residents. Previously, the evaluation process was designed to address six areas referred to as the "Core Competencies," which were central to the prior ACGME evaluation system (1). These competencies consist of the following: patient care (PC), medical knowledge (MK), practice-based learning and improvement (PBL), interpersonal and communication skills (ICS), professionalism (PROF), and systems-based practice (SBP) (1). The Standardized Direct Observation Tool (SDOT) was developed with these competencies in mind (2). The initial validation was with video scenarios, and the use of senior residents as evaluators was not studied in that cohort. Recently, the American Board of Emergency Medicine and the ACGME have moved to the evaluation of emergency residents using the Next Accreditation System (NAS) "milestones," which are described in Table 1. The Council of Emergency Medicine Residency Directors (CORD) recommends direct observation as an evaluation tool for emergency residents for the following milestones: 1, 2, 3, 4, 5, 6, 7, 8, 10 (via checklist), 12, 13, 16, 17, 18, 19, 20, 21, 22, and 23 (3). Direct observation of some type is thus recommended in 19 of the 23 milestones; of these, the SDOT is specifically recommended for measurement in all but milestones 13, 18, and 23. The SDOT requires that an emergency physician observe all aspects of a resident–patient encounter, from initial contact to history and physical examination, reevaluation, and final disposition. The tool has standard definitions related to performance for each of the 26 evaluated areas, the six core competencies, a global assessment, and the ability to provide free-form feedback (4). These performance assessment definitions, or "anchors," as we denote them in this article, are translatable to measurable NAS behaviors. A previous study demonstrated that, with minimal training, the SDOT instrument can be used by attendings with a high degree of inter-rater reliability in a summative fashion for five of the six competencies (5). The use of senior residents as SDOT evaluators has not, to date, been studied. We sought to develop a training tool that educated both attendings and senior residents on the application of the SDOT instrument while they viewed a resident during a performance encounter modeling positive behaviors (exceeds expectations), negative behaviors (below expectations), and mixed behaviors (a combination of exceeds, meets, and below expectations). Accuracy was defined as recording, on a study SDOT, behavior from a video encounter with exact agreement to the script. The behaviors were scripted using the tool's definitions (4). This study's specific goal was to determine whether this brief training video could improve attendings' as well as postgraduate year (PGY) 3 and PGY4 residents' SDOT use, as measured by exact agreement.

Table 1. Emergency Medicine Milestones and Grading Rubric

No.  Category  Milestone
1    PC1       Emergency Stabilization
2    PC2       Performance of a Focused History and Physical Examination
3    PC3       Diagnostic Studies
4    PC4       Diagnosis
5    PC5       Pharmacotherapy
6    PC6       Observation and Reassessment
7    PC7       Disposition
8    PC8       Multi-tasking
9    PC9       General Approach to Procedures
10   PC10      Airway Management
11   PC11      Anesthesia and Acute Pain Management
12   PC12      Goal-directed Focused Ultrasound
13   PC13      Wound Management
14   PC14      Vascular Access
15   MK        Medical Knowledge
16   SBP1      Patient Safety
17   SBP2      Systems-based Management
18   SBP3      Technology
19   PBL1      Practice-based Performance Improvement
20   PROF1     Professional Values
21   PROF2     Accountability
22   ICS1      Patient Centered Communication
23   ICS2      Team Management

Grading rubric (applies independently to each milestone):
Level 1: The resident demonstrates milestones expected of an incoming resident.
Level 2: The resident is advancing and demonstrates additional milestones, but is not yet performing at a mid-residency level.
Level 3: The resident continues to advance and demonstrate additional milestones; the resident demonstrates the majority of milestones targeted for residency in this sub-competency.
Level 4: The resident has advanced so that he or she now substantially demonstrates the milestones targeted for residency. This level is designed as the graduation target.
Level 5: The resident has advanced beyond performance targets set for residency and is demonstrating "aspirational" goals, which might describe the performance of someone who has been in practice for several years. It is expected only a few exceptional residents will reach this level.

ICS = interpersonal and communication skills; MK = medical knowledge; PBL = practice-based learning and improvement; PC = patient care; PROF = professionalism; SBP = systems-based practice.

MATERIALS AND METHODS

This was a pre-/post-education intervention study using the standards for performance evaluations defined by the ACGME and CORD for performing SDOTs (4). The study was reviewed and approved as exempt by our network's Institutional Review Board before any study procedures took place. The study was conducted at an independent academic medical center, not affiliated with a local medical school, hosting a dually approved PGY1 through PGY4 emergency medicine (EM) residency with 14 residents per year. Employed at the time of the study were 46 attendings, 19 of whom were designated as teaching faculty. Because all attendings evaluate residents, no differentiation was made as to whether they were designated as teaching faculty on the Program Information Form (PIF). All emergency attending physicians and PGY3/PGY4 residents were given the material for improving faculty and senior resident knowledge. All were deemed eligible and were asked to participate in this study. Core faculty who were members of the study team were excluded. Each of the attendings and PGY3/PGY4 residents was asked to watch "test" and "educational" DVDs, which took approximately 70 min. EM core faculty and senior residents developed the DVDs. Ideal scores for the test case were determined by design and intentionally modeled in the video seen on the DVDs. The research team provided the DVDs to all potential participants. Consent to participate was implied by the return of evaluation forms; there was no penalty for those who chose not to participate.

Study participants viewed two DVDs of a resident–patient encounter. The first DVD, considered the pretest DVD, incorporated mixed behaviors (a combination of exceeds, meets, and below expectations behaviors) in a resident performance surrounding a single mock resident–patient encounter and resident–attending interaction. The participants viewed the pretest DVD and evaluated this scenario by completing the SDOT evaluation tool before receiving any education. The second, educational, DVD was given to study participants to view in its entirety after they submitted their initial completed SDOT evaluation form. This educational DVD was developed as the primary training tool; it modeled positive behaviors (exceeds expectations), negative behaviors (below expectations), and mixed behaviors (a combination of exceeds, meets, and below expectations) in resident performances surrounding three versions of a mock single-patient encounter with a chief complaint of chest pain. The first two encounters on the educational DVD were informative and demonstrated a scripted positive and then a scripted negative resident performance. After viewing each of these scripted scenarios, the participants were asked to complete practice SDOT evaluation forms, but were not required to return them to the researchers, as the DVD provided immediate feedback and ideal answers for educational purposes. After the third encounter, a repeat of the scenario on the first DVD that contained a combination of predetermined mixed behaviors, another SDOT evaluation form was completed. This final SDOT evaluation was submitted to the researchers as post-education data and compared with the pre-education SDOT data provided before viewing the educational video.


All details of the mock patient's case and the roles actors played were consistent in the scripted positive and negative cases. The only elements that changed related to the resident actor's performance, that is, whether it was overwhelmingly positive or negative. For the third encounter, used in both the pre- and post-test scenarios, details of the patient's history and examination remained the same, except that instead of the patient needing translation services, the patient was hearing impaired. Here the resident actor's performance was intentionally mixed. The test-case role actors were changed to prevent any affective bias from the first two cases. The intended scripted resident performance is shown in Table 2. The 1-through-5 Likert global assessment scales for the six core competencies (PC, MK, PBL, ICS, PROF, and SBP) were neither scripted nor taught in the educational video and, as such, their interpretation was left to the study participant. Also unscripted was the overall clinical competence score.

Pre- and post-education SDOT evaluation forms were matched by name by the Emergency Medicine Research Office. The study's research coordinator deidentified these forms, ultimately using a study identification number. Names were never released to study team members or the statistician. As part of the consent process, participants were informed that only the research coordinator was aware of their identity. At the conclusion of the final SDOT scenario, study participants were asked to complete a form collecting general demographic information and written comments regarding the viewed encounter. Participant names were later removed and replaced by their study identification number by the study's research coordinator.

Analysis included descriptive statistics (mean ± standard deviation) for continuous variables and n (%) for categorical variables for each of the evaluation cycles. The primary outcome was comparison of individual participant pre- and post-test scores. Secondary outcomes included comparisons of faculty and resident scores. Student's t-test, Pearson χ² (Fisher's exact test), Mann-Whitney U tests, and McNemar tests were used as appropriate. The two test scenario scores were compared using Wilcoxon two- and three-sample tests, as well as signed rank testing. "Not applicable" (NA) responses were excluded from some analyses based on the coding of 1 = needs improvement, 2 = meets expectations, and 3 = above expectations. Significance was determined at p < 0.05.
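To make the coding and paired testing concrete, the minimal Python sketch below (ours, not the authors'; the scores are hypothetical and scipy.stats is our choice of library, not one named in the study) applies a Wilcoxon signed-rank test to paired pre-/post-test anchor scores and a Mann-Whitney U test to an unpaired attending-versus-resident comparison, using the 1/2/3 coding described above.

    # Minimal sketch with hypothetical data; not the study's analysis code.
    # Anchor responses are coded 1 = needs improvement, 2 = meets expectations,
    # 3 = above expectations; "NA" responses are excluded before testing.
    from scipy import stats

    # One rater group's anchor scores, paired pre/post (hypothetical values)
    pretest = [2, 3, 2, 1, 2, 3, 2, 2, 1, 2]
    posttest = [2, 2, 2, 2, 2, 3, 1, 2, 2, 2]

    # Paired pre-/post-test comparison: Wilcoxon signed-rank test
    w, p_paired = stats.wilcoxon(pretest, posttest)
    print(f"Wilcoxon signed-rank: W = {w}, p = {p_paired:.3f}")

    # Unpaired attending-vs-resident comparison: Mann-Whitney U test
    attending = [2, 2, 3, 2, 1, 2]
    resident = [2, 3, 3, 2, 2, 3]
    u, p_groups = stats.mannwhitneyu(attending, resident, alternative="two-sided")
    print(f"Mann-Whitney U: U = {u}, p = {p_groups:.3f}")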


Table 2. Scripted Behaviors in Each Video

SDOT Anchor                          Good  Poor  Test  Examples
1. Respectful of privacy             ME    ME    ME    Encounter kept separate
2. Appears professional              AE    NI    ME    Appropriateness of attire
3. Uses translation services         ME    NI    NI    Ask of services needed
4. Efficient information gathering   ME    NI    AE    Organized and logical
5. Complaint oriented examination    AE    NI    ME    Focused on key elements
6. Explains pathophysiology          ME    NI    ME    Aware of disease mechanism
7. Presents structured case          AE    NI    AE    Organized presentation
8. Discusses differential            AE    NI    ME    Consider atypical presentation
9. Risks/benefits/indications        AE    NI    ME    Discussion of side effects
10. Critical actions                 ME    NI    ME    Knows key steps
11. Procedural competency            AE    NI    AE    Preparation and completion
12. Clear communication              ME    NI    ME    Courteous and consistent
13. Conflict avoidance/resolution    NA    NI    ME    Awareness of sensitive topics
14. Discusses care plan              ME    NI    NA    Communicates course
15. Clinical charting                ME    NA    ME    Timely/complete documentation
16. Patient prioritization           ME    NA    AE    Acuity takes priority
17. Contextual use of resources      AE    NI    AE    Calls in staff appropriately
18. Concern for social constraints   NA    NI    NA    Considers patient compliance
19. Controls distractions            AE    NI    ME    Excuses self when necessary
20. Informed decision making         ME    NI    ME    Respects patient wishes
21. Patient reevaluation             ME    NI    ME    Evaluates response to treatment
22. Documents reassessment           NI    NA    NA    Continued charting
23. Use of resources                 NA    NA    NA    Aware of supportive consultants
24. Discharge planning               NA    NA    NA    Anticipates patient needs
25. Completes discharge plan         NA    NA    ME    Communicates with patient
26. Arranges follow-up               NA    NA    NA    Ensures care after discharge

AE = above expectations; ME = meets expectations; NA = not applicable; NI = needs improvement; SDOT = Standardized Direct Observation Assessment Tool.

RESULTS

Twenty-six attendings, with a mean of 12.2 ± 7.1 years of EM experience, and 26 senior residents agreed to participate in this study. Some familiarity with the SDOT was noted by 8 of the attendings and 11 of the residents. Table 3 depicts the pre- and post-test SDOT anchor evaluations submitted by the study participants; the ideal score for each SDOT anchor (excluding those scripted as NA) is the Test value shown in Table 2. It is notable how many residents and faculty provided answers for items that could clearly not be evaluated appropriately based on the content of the video (items 14, 18, 22, 23, 24, and 26, which were scripted as NA).

The anchor scores recorded by attendings and senior residents were rarely significantly different. Only on anchor 20 on the pretest (p = 0.034) and anchor 14 on the post-test (p = 0.041) did the responses of the two groups differ statistically; in both cases, the attendings' scores were significantly higher than the residents'. Exposure to the training video did not change this relationship between the groups' recorded anchors with statistical significance, except for anchor 5, where the training lowered resident scores (p = 0.035).

Excluding the NA-scripted responses, attendings recorded the intended scripted behavior with exact agreement on average 48.73% of the time for each anchor on the pretest, improving to 54.41% on the post-test. Residents' exact agreement with intended scripted behavior improved, on average, for each anchor from 45.86% on the pretest to 49.55% on the post-test.
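The exact-agreement metric above can be read as follows: for each scorable anchor (those not scripted NA), the percentage of raters whose recorded category matches the scripted one, averaged across anchors. The short sketch below illustrates this computation; the function name and the toy responses are ours, not study data.

    # Illustrative computation of mean per-anchor exact agreement
    # (hypothetical data; not the study's analysis code).
    def exact_agreement(scripted, responses):
        """Mean per-anchor percentage of raters matching the scripted category."""
        percents = []
        for anchor, ideal in scripted.items():
            if ideal == "NA":  # anchors scripted NA are excluded from the average
                continue
            recorded = responses[anchor]
            matches = sum(r == ideal for r in recorded)
            percents.append(100.0 * matches / len(recorded))
        return sum(percents) / len(percents)

    # Three anchors, four raters each (made-up values)
    scripted = {1: "ME", 2: "AE", 3: "NA"}
    responses = {
        1: ["ME", "ME", "NI", "ME"],  # 3/4 match -> 75%
        2: ["AE", "ME", "ME", "AE"],  # 2/4 match -> 50%
        3: ["NA", "ME", "NA", "NA"],  # excluded (scripted NA)
    }
    print(f"{exact_agreement(scripted, responses):.2f}%")  # prints 62.50%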

Overall, after watching the instructional video, residents became more critical in their evaluations of the scripted performance on 14 of the 26 questions, less critical on 6, and unchanged on 6. Attendings were more critical on 10 of 26 questions, less critical on 12, and unchanged on 4. Exposure to the video raised faculty scores, although not in a statistically significant manner (p = 0.289), and lowered resident scores significantly (p = 0.046).

The recorded responses for the unscripted evaluation of global competency are provided in Table 4. The only significant difference between attending and resident assessments of competency was in the pretest scores for medical knowledge, where attendings were more likely than residents to rate global medical knowledge lower (p = 0.016). Exposure to the training significantly lowered resident assessment scores relative to attendings for SBP on the post-test (p = 0.031); this was the only significant change in the relationship between the two groups' scores as a result of the intervention.

Table 3. Frequency Tabulation by Anchor, Respondent, Observation Period, and Category

[Placeholder: for each of the 26 SDOT anchors, Table 3 tabulates the number (n) of Needs Improvement (score = 1), Meets Expectations (score = 2), and Above Expected (score = 3) responses recorded by faculty and by residents on the pretest and post-test. The individual cell counts were scrambled during extraction and are not reproduced here. In the original, results in bold type indicate ideal responses; item numbers 14, 18, 22, 23, 24, and 26 were not able to be assessed in the video.]

DISCUSSION

We hypothesized that an SDOT training video would educate both attendings and senior resident evaluators on the SDOT tool, improving their ability to accurately evaluate scripted behaviors. In this study, exact agreement of submitted SDOTs with the intended, scripted performance was approximately half for both groups, pretest and post-test. Exact agreement in a prior study averaged 65.99% per anchor (5). The video did improve exact agreement, but only by about 5%. Our agreement is lower than previously published and may be a result of this study's smaller sample size.

In our pretest/post-test cohort, there were few significant differences between attending and resident scoring of the SDOT. This lack of statistical difference was found both for the majority of the specific anchors, which were scripted and addressed in the training video, and for the global assessment of competency. Training, in the form of a locally developed instructional video, did not significantly change the relationship between resident and attending SDOT scores.

In general, residents tended to be "more critical" after watching the video, while the attendings were "less critical"; the attendings' changes after the video, however, presented more of a mixed picture. It is notable that after the educational video, the senior residents became significantly more negative in their SDOT evaluations. It may be inferred that after watching the video, residents were comfortable using the SDOT evaluation to provide helpful feedback and constructive criticism, rather than being concerned about critically evaluating their peers. The initial SDOT manuscript noted the challenge of providing negative feedback, so education around the SDOT tool may help residency programs identify specific areas of weakness for trainees (1). This may be especially true in programs where the faculty are known to be reluctant to provide negative feedback, as has been reported previously (6).

Internally, we performed SDOTs using faculty observers who were not providing clinical care, and we estimate that an SDOT covering the initial history and physical examination as well as a resident–attending presentation can take up to an hour. Remaining in the department through closure of the case to disposition would, in our experience, require further faculty resources. This concern for faculty time, in conjunction with the broad recommendations for the use of direct observation in the Emergency Medicine Milestones document, led this group to suggest that senior residents could be utilized to perform this labor-intensive task (3). In the original SDOT study, the authors note that using the instrument while contemporaneously working clinically may be difficult (1). The use of senior residents to conduct SDOTs may free up faculty resources for other programmatic needs.

Limitations

This study was completed within a single 4-year, dually approved residency program, which limits its external validity. Eight of the faculty and 11 of the residents who participated in the study had prior training in the evaluation of


Table 4. Pre- and Post-Test Frequency of Responses by Subject Type for Unscripted Global Competencies

Variable                   1   2   3   4   5
Patient care
  Faculty pretest          3  10  10   1   0
  Faculty post-test        0   3   5   5   0
  Resident pretest         4  10   4   4   0
  Resident post-test       7   9   4   4   0
Medical knowledge
  Faculty pretest          0   6  11   7   1
  Faculty post-test        0   0   6   7   0
  Resident pretest         0   0   7  11   5
  Resident post-test       0   2  10  11   1
Practice-based learning and improvement
  Faculty pretest          0   5  15   2   0
  Faculty post-test        0   0   7   6   0
  Resident pretest         0   7  11   5   0
  Resident post-test       1   4  13   6   0
Interpersonal and communication skills
  Faculty pretest          7  15   4   0   0
  Faculty post-test        1   9   3   0   0
  Resident pretest         6  14   2   0   1
  Resident post-test      10  10   3   1   0
Professionalism
  Faculty pretest          4  17   5   0   0
  Faculty post-test        1   5   7   0   0
  Resident pretest         7   9   5   0   1
  Resident post-test       6  12   5   1   0
Systems-based practice
  Faculty pretest          0   3  17   2   0
  Faculty post-test        1   0  11   1   0
  Resident pretest         0   4  17   2   0
  Resident post-test       2   8  13   1   0
Overall
  Faculty pretest          7  10   1   0   0
  Faculty post-test        1  10   0   0   0
  Resident pretest         8   8   1   0   0
  Resident post-test       4   6   0   0   0

Values are n.

resident performance; specific training may have included encounters with feedback based on core competency performance. A comprehensive survey outlining the specifics of prior training was not obtained, and it should be assumed that prior training may have biased participants' performance in this study and confounded the results. The small sample size precluded us from separating our analysis by PIF-designated teaching faculty versus clinical faculty. Further, many of the study participants were lost to follow-up: the post-test cohort of 13 attendings was substantially smaller than the pretest cohort of 26. This loss of study participants confounds the impact of the intervention and the interpretation of the postintervention relationship between the attendings and senior residents.

The largest limitation may be the internally developed video. While the study was a pilot, the quality of both the depiction of the scripted behaviors and the educational intervention itself has not been validated. This is likely the single largest reason why the exact agreement in this cohort differs from prior publications (5). Improvement would also be expected after watching the same video a second time, regardless of the educational impact of the training tool.

CONCLUSIONS

Exposure to an independently developed training video tended to raise attending scores of resident performance, though not significantly, while statistically significantly lowering scores provided by senior residents. This suggests that the senior residents may have been assigning more liberal pretraining scores to their peers. In our cohort, attending physicians and senior emergency residents rarely differed in their application of the SDOT with statistical significance, either before or after the training video. About half of scripted SDOT anchor behaviors were recorded with exact agreement by both groups, pre- and post-test. This lack of significant difference between the two groups, found for both scripted anchors and unscripted global assessments, suggests that senior residents may be able to function as SDOT assessors. If utilized as SDOT evaluators, senior residents may benefit from training in the use of the instrument.

Acknowledgments—The authors would like to acknowledge the research coordination and study supervision efforts of Kimberly M. Hamilton, BA, Valerie A. Rupp, RN, MSN, CRNP, Emese Futchko, MSN, RN, CCRC, Bernadette Glenn-Porter, BS, and Anita Kurt, PhD, RN. They would also like to thank Kristine Petre, MLS, CM, AHIP, for her senior librarian efforts, and Bruce Stouch, PhD, for his statistical consulting expertise. This study was funded in part by an unrestricted educational grant from PCOM MEDNet, Philadelphia, Pennsylvania.

REFERENCES

1. Chapman DM, Hayden S, Sanders AB, et al. Integrating the Accreditation Council for Graduate Medical Education core competencies into the model of the clinical practice of emergency medicine. Ann Emerg Med 2004;43:756–69.
2. Shayne P, Gallahue F, Rinnert S, et al. Reliability of a core competency checklist assessment in the emergency department: the Standardized Direct Observation Assessment Tool. Acad Emerg Med 2006;13:727–32.
3. Accreditation Council for Graduate Medical Education and the American Board of Emergency Medicine. Single accreditation system for AOA-approved programs. http://acgme.org/acgmeweb/Portals/0/PDFs/Milestones/EmergencyMedicineMilestones.pdf. Published December 2012. Accessed April 23, 2015.
4. Council of Emergency Medicine Residency Directors. CORDTest. http://www.cordtests.org/SDOT.htm. Published 2014. Accessed April 23, 2015.
5. LaMantia J, Kane B, Yarris L, et al. Real-time inter-rater reliability of the Council of Emergency Medicine Residency Directors Standardized Direct Observation Assessment Tool. Acad Emerg Med 2009;16(suppl 2):S51–7.
6. Yarris LM, Linden JA, Gene Hern H, et al. Attending and resident satisfaction with feedback in the emergency department. Acad Emerg Med 2009;16(suppl 2):S76–81.


ARTICLE SUMMARY

1. Why is this topic important?
Emergency attendings' and senior residents' Standardized Direct Observation Assessment Tool (SDOT) scores rarely differed, and about half of anchor behaviors were recorded with exact agreement. This suggests senior residents can participate in SDOT assessment.

2. What does this study attempt to show?
The study's aim was to assess whether a training video/DVD improved attendings' and senior residents' use of the SDOT.

3. What are the key findings?
Exposure to a training video significantly lowered resident scores, suggesting they may have been assigning more liberal pretraining scores to peers. Education to improve their objectivity may be useful.

4. How is patient care impacted?
The SDOT is a Council of Emergency Medicine Residency Directors-developed tool designed specifically to assess residents on the Accreditation Council for Graduate Medical Education's Core Competencies. Residents who are better trained in interpersonal and medical communication skills should provide improved, more professional patient care.