ACADEMIC PEDIATRICS ABSTRACTS
OBJECTIVE: To test the efficacy of a workshop intervention in changing resident knowledge, attitudes, and beliefs about progress notes. METHODS: An educational workshop was constructed by resident and faculty stakeholders based on a review of the literature, institutional best practices, and a previously designed note assessment tool. Residents from a mid-sized pediatric residency program attended a workshop consisting of best-practice didactics and small-group work using the tool to assess example progress notes. Participants completed a 22-question online survey (Qualtrics) before and after the workshop to evaluate knowledge of progress note components and attitudes regarding note importance. Pre-post analysis was performed with chi-square testing for true/false questions and Mann-Whitney testing for Likert-scale questions. RESULTS: Twenty-six pediatric residents (79% response rate) completed the pre-intervention online survey, and 23 (70%) completed the post-intervention survey. The correct response rate improved on 15 of 20 true/false content questions, with a statistically significant improvement in five (p<0.01). The overall percentage of correct answers increased from 78% to 91% (not statistically significant). Resident confidence in their ability to write a note and their opinion of note importance both increased (p=0.01 and p=0.04, respectively). DISCUSSION/CONCLUSION: This study suggests that a workshop intervention is an effective method of educating pediatric residents on progress note best practices. Further studies should assess the impact of the intervention on sustained resident knowledge and beliefs about progress notes and on subsequent note quality.
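The pre-post analysis described above (chi-square for true/false items, Mann-Whitney for Likert items) can be sketched in Python with scipy. All counts and Likert responses below are hypothetical illustrations, not the study's data:

```python
from scipy.stats import chi2_contingency, mannwhitneyu

# Hypothetical counts for a single true/false item: correct vs. incorrect
# responses, pre-workshop (n=26) and post-workshop (n=23).
pre_correct, pre_total = 18, 26
post_correct, post_total = 22, 23
table = [[pre_correct, pre_total - pre_correct],
         [post_correct, post_total - post_correct]]
chi2, p_item, dof, _ = chi2_contingency(table)
print(f"chi-square p = {p_item:.3f}")

# Hypothetical 5-point Likert ratings of note importance, pre vs. post.
pre_likert = [3, 3, 4, 2, 4, 3, 5, 4, 3, 4]
post_likert = [4, 5, 4, 4, 5, 3, 5, 4, 4, 5]
stat, p_likert = mannwhitneyu(pre_likert, post_likert, alternative="two-sided")
print(f"Mann-Whitney p = {p_likert:.3f}")
```

With one such 2x2 table per true/false question, this procedure would be repeated for each of the 20 content items.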
Exam Preparation
37. NEEDS ASSESSMENT OF BOARD PREPARATION CURRICULA AND CERTIFICATION RATES AMONG U.S. PEDIATRIC RESIDENCY PROGRAMS Miki Nishitani, MD, Nicola Orlov, MD, MPH, University of Chicago, Chicago, IL BACKGROUND: The Accreditation Council for Graduate Medical Education (ACGME) requires pediatric residency programs to achieve an overall 70% pass rate on the American Board of Pediatrics Certification Exam. Board preparation curricula vary widely among programs, and there is minimal evidence as to which are most effective in producing the highest certification rates. OBJECTIVE: To gain an understanding of the current board preparation landscape across pediatric residency programs and to evaluate the need for individualized and/or focused board preparation curricula. METHODS: A survey was distributed to all U.S. pediatric residency program directors through the Academic Pediatric Program Directors (n=209; response rate=35%). Programs were asked anonymously about their demographics, average in-training examination (ITE) scores and board pass rates, and board preparation styles, and whether these were individualized and/or required for certain residents. Survey results were analyzed using descriptive statistics and Fisher's exact tests. RESULTS: Overall, board preparation consisted of a combination of lectures/didactics (n=68, 100%), completion of board-style questions (n=70, 98.5%), and/or a formalized comprehensive review course (n=68, 26.5%). While almost all programs required didactics (91.2%), only about half (52.2%) of programs required
completion of self-directed questions for all residents. ITE scores were the most commonly used means of identifying which residents needed an individualized curriculum. Board pass rates were divided into <70% (below goal), 70-79% (at risk), and >80% (above goal). Seven programs (10%) were considered below goal, while 13 (18.6%) were at risk. Table 1 shows a statistically significant association between a requirement to complete self-directed questions and pass rates (p=0.03). There were no statistically significant associations between other forms of board preparation and certification rates. CONCLUSION: Future efforts should focus on personalizing each resident's board preparation curriculum, especially based on ITE scores.
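The association test this abstract reports (a requirement for self-directed questions vs. board pass rate category) is a Fisher's exact test on a small contingency table. A minimal sketch follows; the 2x2 counts below are hypothetical, not the survey's actual tabulation:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: programs requiring self-directed question
# completion vs. not, cross-tabulated against whether the program's
# board pass rate met the 70% ACGME goal.
#                     met goal, below goal
required_questions = [34, 1]
optional_questions = [29, 6]
odds_ratio, p = fisher_exact([required_questions, optional_questions])
print(f"Fisher's exact p = {p:.3f}, odds ratio = {odds_ratio:.1f}")
```

Fisher's exact test is the natural choice here because several cells (e.g., below-goal programs) have expected counts too small for a chi-square approximation.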
38. PREDICTORS OF PASSING BOARD CERTIFICATION EXAMS IN A MED/PEDS RESIDENCY PROGRAM Daniel R. Wells, MD, Shelley Ost, MD, Michael Kleinman, MD, Natascha Thompson, MD, University of Tennessee, Memphis, TN INTRODUCTION: Combined Internal Medicine and Pediatrics (MP) residencies have primarily relied on categorical program data to predict pass rates for both the American Board of Internal Medicine Certifying Exam (ABIM-CE) and the American Board of Pediatrics Certifying Exam (ABP-CE). However, there is insufficient literature on the best predictors of an MP resident passing each. We therefore reviewed prior exam scores in our large MP program to determine the best predictors of passing both the ABIM-CE and the ABP-CE. METHODS: Numeric scores from USMLE Steps 1 and 2 and In-Training Examinations in Internal Medicine (ITE-IM) and Pediatrics (ITE-P) for UTHSC MP residents over 10 years (2008-2017) were retrospectively reviewed. A total of 91 residents were included. First-time ABIM-CE and ABP-CE numeric scores (n=65 and n=71, respectively) and pass/fail results (n=91 and n=71, respectively) were collected. Linear and logistic regression were applied to determine whether numeric ABIM-CE and ABP-CE scores, or the odds of passing, could be predicted from prior exam scores. RESULTS: Each prior USMLE, ITE-IM, and ITE-P score had a linear relationship with both ABIM-CE and ABP-CE scores. In the linear regression, adjusted r2 values showed low to moderate predictive ability, ranging from 0.10 to 0.34; the strongest predictors of the ABIM-CE and ABP-CE were USMLE Step 1 (0.34) and first-year ITE-IM (0.32), respectively. Logistic regression showed odds ratios of passing board certifications ranging from