QUALITY IMPROVEMENT IN EDUCATION

Quality Improvement Educational Practices in Pediatric Residency Programs: Survey of Pediatric Program Directors

Keith J. Mann, MD, MEd; Mark S. Craig, MD; James M. Moses, MD, MPH

Children's Mercy Hospitals and Clinics, University of Missouri–Kansas City School of Medicine, MO (Dr Mann); Department of Pediatrics, University of Rochester, Rochester, NY (Dr Craig); and Department of Pediatrics, Boston University School of Medicine, Boston, Mass (Dr Moses)

Address correspondence to Keith J. Mann, MD, MEd, Children's Mercy Hospitals and Clinics, University of Missouri–Kansas City School of Medicine, 2401 Gillham Rd, Kansas City, MO 64110 (e-mail: [email protected]).

Received for publication May 7, 2012; accepted November 15, 2012.

ABSTRACT

BACKGROUND: The Accreditation Council for Graduate Medical Education requires residents to learn quality improvement (QI) methods to analyze, change, and improve their practice. Little is known about how pediatric residency programs design, implement, and evaluate QI curricula to achieve this goal. We sought to describe current QI educational practices, evaluation methods, and program director perceptions through a national survey.

METHODS: A survey of QI curricula was developed, pilot tested, approved by the Association of Pediatric Program Directors (APPD), and distributed to pediatric program directors (PPDs). Descriptive statistics were used to analyze the data.

RESULTS: The response rate was 53% (104 of 197). Most respondents reported the presence of a QI curriculum (85%, 88 of 104), including didactic sessions (83%) and resident QI projects (88%). Continuous process improvement was the most common methodology addressed (65%). The most frequent topics taught were "Making a Case for QI" (68%), "PDSA [plan–do–study–act] Cycles" (66%), and "Measurement in QI" (60%). Projects were most frequently designed to improve clinical care (90%), hospital operations (65%), and the residency (61%). Only 35% evaluated patient outcomes, and 17% had no formal evaluation. Programs had a mean of 6 faculty members (standard deviation 4.4, range 2–20) involved in teaching residents QI. Programs with more faculty involved were more likely to have had a resident submit an abstract to a professional meeting about their QI project (<5 faculty, 38%; 5–9, 64%; >9, 92%; P = .003). Barriers to teaching QI included time (66%), funding constraints (39%), and absent local QI expertise (33%). Most PPDs (65%) believed that resident input in hospital QI was important, but only 24% reported resident involvement. Critical factors for success included an experiential component (56%) and faculty with QI expertise (50%).

CONCLUSIONS: QI curricular practices vary greatly across pediatric residency programs. Although pediatric residency programs commit a fair number of resources to QI education and believe that resident involvement in QI is important, fundamental QI topics are overlooked in many programs, and evaluation of existing curricula is limited. Success as perceived by pediatric program directors appears to be related to the inclusion of a QI project and the availability of faculty mentors.

KEYWORDS: curriculum development; quality improvement; resident education

ACADEMIC PEDIATRICS 2014;14:23–28

WHAT'S NEW

There is great variability in the design, content, and evaluation of quality improvement (QI) curricula in pediatric residency programs. Most QI curricula integrate didactic learning with QI project work. Pediatric program directors are not satisfied with the current state of QI education and recognize room for improvement in QI curricula.

OVER A DECADE ago, 2 Institute of Medicine reports changed the health care system and medical education. To Err Is Human1 shed light on the frequency and human cost of medical errors, while Crossing the Quality Chasm2 laid the foundation for improvement of the health care system. Educational accrediting bodies soon followed, and in 2002, the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project was launched. Core competencies were defined to guide curriculum development and performance assessment activities for residency training programs with the intent of positively impacting patient outcomes. Two of these competencies are Practice-Based Learning and Improvement and Systems-Based Practice.3 The ACGME also emphasized the importance of teaching quality improvement (QI) through experiential learning and resident engagement in a QI project.3 Pointing to the increased interdependence among practitioners and the system within which they work, Berwick and Finkelstein compared the need for QI education and system-level thinking today to the need for standardized medical curricula at the time of Flexner's report in 1910.4

Copyright © 2014 by Academic Pediatric Association

Volume 14, Number 1, January–February 2014

Despite the recognized importance of teaching QI to trainees, best practices in QI education have not been solidly established. The literature discussing QI curricula consists of several systematic reviews or review articles,5–7 as well as other articles specific to individual program curricula.8–11 Several common themes have emerged from these publications: the majority of QI curricula have an experiential or project-based component6,7; a longitudinal curriculum that integrates into the residents’ schedule is an added benefit12,13; common barriers to success include lack of faculty expertise in QI methods,14 lack of institutional support for QI education, and lack of resident interest in QI15; and educational opportunities are lost when hospital-based QI projects are conducted without resident involvement.7,14 Although this literature provides insight about QI teaching, little is known about how pediatric residency programs (PRPs) teach QI to residents nationally. To close this knowledge gap, we surveyed all pediatric program directors (PPDs) in the United States with support from the Association of Pediatric Program Directors (APPD) to define the structure, content, evaluation methods, and outcomes of QI curricula in PRPs. This needs assessment is an important initial step in defining common curricular elements necessary for teaching QI to residents.

METHODS
An online questionnaire that focused on QI educational practices in pediatric residency training programs was developed by 2 of the authors (KJM, JMM). The questionnaire targeted 4 curricular domains: 1) curricular design and content, 2) curriculum support, 3) program evaluation, and 4) PPD perspectives. The original draft of the questionnaire was tested with several experts in QI and education, revised on the basis of their feedback, and sent to the APPD Research Task Force for approval. After further modifications suggested by the Research Task Force, a 46-question survey with skip logic (skip logic, also known as conditional branching, directs a respondent to the most appropriate set of questions on the basis of an earlier response) was distributed via the APPD electronic mailing list between September and October 2011 with a link to the survey on SurveyMonkey (http://www.surveymonkey.com/). Questionnaires were sent to 197 PRP directors who were members of the APPD. PPDs were asked to collaborate with colleagues at their institution within graduate medical education and QI to best complete the survey. We excluded pediatric fellowship program directors, medicine–pediatrics program directors, and associate PPDs to avoid duplicate responses from the same institution. Potential respondents were contacted a maximum of 4 times via e-mail. Voluntary participation, anonymity of responses, and the right to refuse to answer any question were fully explained in all e-mails and within the survey itself. No gift or reward was offered as an incentive. Demographic questions included PRP size as well as association with a freestanding children's hospital. These

Table 1. Sample Survey Questions by Domain

Curricular Design and Content
  Question: Which, if any, QI tools are taught to the residents in the didactic component? (please check all that apply)
  Answers: Driver diagrams; Conceptual flow diagrams; Fishbone diagrams; Spaghetti diagrams; Pareto diagrams; Checklists; 5 Whys; None; Other

Curriculum Support
  Question: Are there dedicated support staff to help residents with their QI projects? (please check all that apply)
  Answers: No support staff; Non-clinical QI experts; RN QI experts; Data analysts; Research assistants; Other

Program Evaluation
  Question: Do you score or grade resident QI projects?
  Answers: Yes (please describe); No

Program Director Perspectives
  Question: How satisfied are you with the QI curriculum you have in place?
  Answers: Likert scale, 1 (extremely satisfied) to 5 (not satisfied)

QI = quality improvement.

questions were specifically included because we hypothesized that resource allocation may vary on the basis of these factors. Specific questions regarding the 4 topic areas followed. A sample of survey questions is included in Table 1. Survey data were analyzed with SPSS v18 software (IBM, Armonk, NY). Descriptive statistics were summarized as percentages. To determine percentages for specific curricular elements, the denominator for each set of questions was the number of respondents who had that element in place. For example, the denominator for questions pertaining to the overall curriculum includes all respondents, while the denominator for questions about QI projects includes only those programs that have QI projects as part of their curriculum. Frequencies and chi-square tests were used to assess associations between demographic variables, curricular support, abstract submission, and curricular satisfaction. A P value of <.05 was considered significant in all statistical analyses. The Boston University School of Medicine and the Children's Mercy Hospital Pediatric Institutional Review Boards reviewed and approved the study as exempt.
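The chi-square analysis described above can be sketched in a few lines of Python. The counts below come from Table 2 (program size by presence of a QI curriculum); the helper functions are ordinary Pearson chi-square formulas written out with the standard library only, not the authors' actual SPSS code.

```python
import math

def chi2_stat(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

def chi2_sf_3df(x):
    """Survival function (P value) of chi-square with exactly 3 degrees of freedom."""
    return math.erfc(math.sqrt(x / 2)) + math.sqrt(2 * x / math.pi) * math.exp(-x / 2)

# Table 2 counts: rows = QI curriculum present (n = 88) vs absent (n = 12);
# columns = program size (0-30, 31-60, 61-90, >90 residents)
table = [
    [25, 42, 10, 11],
    [6, 5, 1, 0],
]

stat = chi2_stat(table)
p = chi2_sf_3df(stat)  # dof = (2 - 1) * (4 - 1) = 3
print(f"chi2 = {stat:.2f}, P = {p:.3f}")  # the paper reports P = .353 for this comparison
```

Against the .05 threshold stated above, this association is not significant, matching the P = .353 reported in Table 2.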

RESULTS
The response rate was 53% (104 of 197). A QI educational program was present in 85% (88 of 104) of residency programs that responded. Table 2 summarizes the demographic data. When asked what calendar year the QI educational program was implemented, responses indicated a slow but steady adoption of QI education into pediatric training programs since 2005 (Figure 1).


Table 2. Demographics of 104 Respondents*

Survey Question                                  QI Curriculum (n = 88)   No QI Curriculum (n = 12)   P
Program size                                                                                          .353
  0–30 residents                                 25                       6
  31–60 residents                                42                       5
  61–90 residents                                10                       1
  >90 residents                                  11                       0
Freestanding children's hospital within program                                                       .323
  Yes                                            35 (40%)                 3 (25%)
  No                                             53 (60%)                 9 (75%)

QI = quality improvement.
*Four programs were unsure whether they had a QI curriculum or skipped that question.

CURRICULUM DESIGN AND TEACHING METHODS
Didactics or formal lectures occur in 83% (73 of 88) of QI curricula. Various didactic formats were used to teach QI content. The most common didactic format was the traditional noon conference (37 of 73, 51%). Online self-directed modules were used to teach QI content in 26% of programs (19 of 73), and 19% (14 of 73) used a retreat or workshop format. The length of the curriculum also varied: 22% (16 of 73) of programs taught the content in 1 day or less, while 12% (9 of 73) spread that teaching over 3 years. QI projects are a required component of 88% (77 of 88) of programs with a QI curriculum. The targets of improvement varied among programs: 90% (69 of 77) included group projects to improve clinical care; 65% (50 of 77) included projects to improve hospital operations; 61% (47 of 77) to improve the residency program; and 22% (17 of 77) to improve personal performance (multiple responses were allowed).

CURRICULUM SUPPORT
Programs had a mean of 6 faculty members (standard deviation 4.4, range 2–20) involved in teaching residents QI. Programs with more faculty involved were more likely to have had a resident submit an abstract about their QI project to a professional meeting (<5 faculty, 38%; 5–9, 64%; >9, 92%; P = .003). Although the majority of QI curricula were grounded in project participation, fewer than half (42%, 32 of 77) provided support staff to aid in project management and completion, and only 13% (10 of 77) provided monetary

Figure 1. Year quality improvement educational program started, cumulative percentage of programs with curricula (n = 88).

Figure 2. Content of quality improvement curricula (n = 88).

project support (up to $1000). Residents in programs that provided financial support for projects were twice as likely to present their findings (odds ratio 2.11, 95% confidence interval 0.7852–5.6762, P = .136), although this finding was not statistically significant.

PROGRAM CONTENT
Continuous process improvement was taught most commonly (65%, 57 of 88), followed by the model for improvement (40%, 35 of 88) and Lean and Six Sigma (13%, 11 of 88). Specific QI content included in current educational programs on QI is summarized with frequencies in Figure 2. As shown in Figure 3, tools used in QI are not taught consistently in PRPs; 36% (32 of 88) did not teach any specific QI tools as part of their curriculum.

PROGRAM EVALUATION
Evaluation of QI educational programs was also inconsistent. Specific evaluation methods included improvement in patient outcomes (35%, 31 of 88), participant satisfaction with individual lectures (32%, 28 of 88), participant satisfaction with the entire curriculum (25%, 22 of 88), resident self-assessment of proficiency (24%, 21 of 88), formal scoring of a QI project using a rubric (22%, 17 of 77), and knowledge acquisition through a pretest–posttest design (11%, 10 of 88). Seventeen percent (15 of 88) of programs had no method to evaluate the success or impact of their curriculum.
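An odds ratio and Wald confidence interval like the one reported for financial support can be computed from a 2×2 table of counts. The paper does not report the underlying cell counts, so the numbers in this sketch are hypothetical placeholders; only the formulas (cross-product odds ratio, 95% CI on the log scale) are standard.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Cross-product odds ratio with a Wald 95% CI for a 2x2 table:
                 presented   did not present
      funded         a             b
      unfunded       c             d
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only (not taken from the paper).
or_, lo, hi = odds_ratio_ci(6, 4, 26, 41)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

When the confidence interval spans 1, as the paper's interval (0.7852–5.6762) does, the association is not statistically significant even though the point estimate exceeds 2.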

Figure 3. Tools taught in quality improvement curricula (n ¼ 88).


Figure 4. Barriers to quality improvement education.

In PRPs with a formal QI curriculum and required project involvement, at least 1 group of residents has presented their project at local (27%, 24 of 88), regional (18%, 16 of 88), and national (30%, 26 of 88) meetings.

PPDS' PERSPECTIVES
When PPDs were asked when residents should learn QI, 75% (66 of 88) stated longitudinally, over all 3 years. Of the factors reported by PPDs to be critical to successful QI training, an experiential component (a project) was considered the most important (56%, 49 of 88), followed by faculty with QI expertise (50%, 44 of 88). Though a majority of PPDs (65%, 57 of 88) believed residents' input in hospital-based QI projects to be important to extremely important, only 24% (21 of 88) reported that their residents were involved to extensively involved in hospital-wide QI projects. Of PPDs with a QI program, only 23% (20 of 88) reported being satisfied or extremely satisfied with their current curriculum, and 81% (71 of 88) believed that their residents complete training with an intermediate or lower level of QI proficiency. There was no difference in mean 5-point Likert scores for PPD satisfaction (2.77 vs 2.94; P = .38) or perceived trainee proficiency (2.85 vs 3.14; P = .09) when comparing programs with and without support staff dedicated to QI projects. The frequency of reported barriers to successful implementation of a QI curriculum is presented in Figure 4 (multiple responses were allowed).

DISCUSSION
Only 85% of PRPs teach QI to their residents despite the ACGME requirement that they do so, and among those with QI curricula, there is great variability in design, content, and evaluation. Core concepts such as system awareness, measurement, implementing change, and teamwork16 are notably absent from the QI curricular content of most PRPs. The majority of QI curricula are grounded in experiential learning through project participation. The resources dedicated to project support and the commitment to formal project evaluation were highly variable across residency programs. Despite this variability, a mean of 6 faculty members dedicated to teaching QI suggests a significant educational commitment from many PRPs. When resources were committed to the QI curriculum, residents were more likely to submit their project as an abstract to a local, regional, or national meeting. PPDs cite a common set of barriers to success, including lack of dedicated time, limited funding or resources, limited access to faculty with QI expertise, minimal integration of residents into hospital-based QI projects, and lack of interest by the residents. Most PPDs are not satisfied with the current state of QI education, as indicated by their perceptions of resident QI abilities at program completion, and they report that the current structure of residency programs does not provide adequate time for longitudinal QI experiences.

The goal of educating residents in QI science and engaging them actively in system-level improvement is widely supported in the literature. Resident physicians are uniquely positioned to understand and provide input into the complexities of the local health care system; they interact with nurses and other care providers, use electronic medical records, and care for patients daily across many hospital microsystems.17 A major consistent finding in the literature is that a QI project is a key component of any successful QI curriculum,6,7 and we did find that the majority of responding programs required QI projects of their residents. Beyond this finding, however, a literature search for best practices in QI education reveals only broad thematic suggestions for curricular design, so it is hardly surprising that we found wide variability nationwide in the design, content, evaluation, and support systems for QI curricula.8,9,11,18,19 We found that pediatric QI curricula were highly variable in length, from 1 day or less in 22% of programs to 3 years in 12%.

There is growing evidence that a longitudinal training experience improves QI education for residents, perhaps because it allows residents to embed QI into their daily experiences, making their learning more sustainable.12,13,17 PPDs appear to understand the benefits of a longitudinal curriculum but find it hard to achieve. They reported that the biggest barrier to successful curriculum implementation is a "lack of time in the current structure of resident education." New recommendations for QI curriculum content and design, once available, likely cannot be implemented effectively without addressing some of the key barriers reported by PPDs. Regardless, any identified solutions that address these barriers will have to be assimilated into the present-day pediatric residency curricular environment, which emphasizes individualized curricula, milestone-based evaluation, and the incorporation of entrustable professional activities into competency-based evaluation.20–22

Collection of evaluation data to assess curricular effectiveness, guide future modifications, and maximize learning and outcomes is a key element of ensuring curricular quality. Only 40% of programs surveyed assessed residents' knowledge acquisition and/or the impact of the curriculum on practice improvement. The lack of a standard


QI assessment tool may contribute to this deficiency. Although such tools are available, none has been specifically adapted for use in pediatric QI curricula.19,23,24 Hence, an important next step for PPDs is to develop and use assessment tools to measure curricular impact, along with project evaluation tools to ensure that QI projects across programs are designed to meet expected standards. A simple scoring rubric developed from the SQUIRE guidelines,25 for example, might help to fill this gap.

The tremendous variation in curriculum content, design, and evaluation within PRP QI curricula is likely leading to suboptimal outcomes. Defining a clear set of expectations for QI curricula, paired with a clear set of learning objectives, may be the first step toward strengthening QI education in residency training. Those expectations should include recommendations on content, design, support, and evaluation, with flexibility for individual programs to adapt them to their local context. MedEdPortal (http://www.mededportal.org) already has several examples of resident QI curricula and accompanying evaluation tools. Developing a comprehensive compendium of evidence-based best practices, educational content based on a standardized framework, and well-accepted QI tools would enhance the currently available resources. Such a resource may allow programs with fewer resources and less expertise to benefit from others further along in curricular development.

There are several limitations to this study. A cross-sectional study provides no information about past or future QI efforts and does not allow for causal associations between curriculum qualities and learning outcomes. In addition, the survey collected self-report data, which can be subject to social desirability bias, especially because QI education is an ACGME requirement.
However, many programs (15%) did admit to being noncompliant with the ACGME mandate, and there was tremendous variability (both positive and negative) in responses, which leads us to believe that we received a representative sample. Although PPDs were asked to collaborate with those most knowledgeable about the QI curriculum within their institution when filling out the survey, we did not ask the PPDs to confirm whether that collaboration occurred. Some PPDs may not have known all the details of the QI curriculum asked about in the survey (eg, specific QI tools taught) and may not have taken the time to seek out the answers to all of the detailed questions. Finally, although all PPDs received a survey, only 53% completed it. Those without a QI curriculum may have been more inclined to be nonrespondents. The results might be more widely generalizable if the response rate had been more robust.

CONCLUSION
We found great variability in the design, content, and evaluation of QI curricula in PRPs. The majority of curricula, however, are grounded in experiential learning through QI project work. The level of staff support and funding for projects and the commitment to project evaluation are inconsistent across the sampled QI curricula. Overall, PPDs report dissatisfaction with the current state of QI education in their programs, and they note that the current residency program structure fails to provide adequate time for optimal longitudinal experiences. Despite these barriers, the majority of programs are teaching QI through experiential learning, and many have residents submitting abstracts to local, regional, and national meetings.

ACKNOWLEDGMENTS
We thank the APPD for allowing us to use their resources to build the survey, for the expertise of their Research Task Force in reviewing it, and for access to their membership through their electronic mailing list for survey dissemination. We also thank the PPDs for their thoughtful responses.

REFERENCES
1. Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 2000.
2. Institute of Medicine, Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.
3. Accreditation Council for Graduate Medical Education. Outcome project: competencies.
4. Berwick DM, Finkelstein JA. Preparing medical students for the continual improvement of health and health care: Abraham Flexner and the new "public interest". Acad Med. 2010;85(9 suppl):S56–S65.
5. Neuspiel DR, Hyman D, Lane M. Quality improvement and patient safety in the pediatric ambulatory setting: current knowledge and implications for residency training. Pediatr Clin North Am. 2009;56:935–951.
6. Boonyasai RT, Windish DM, Chakraborti C, et al. Effectiveness of teaching quality improvement to clinicians: a systematic review. JAMA. 2007;298:1023–1037.
7. Patow CA, Karpovich K, Riesenberg LA, et al. Residents' engagement in quality improvement: a systematic review of the literature. Acad Med. 2009;84:1757–1764.
8. Canal DF, Torbeck L, Djuricich AM. Practice-based learning and improvement: a curriculum in continuous quality improvement for surgery residents. Arch Surg. 2007;142:479–482.
9. Djuricich AM, Ciccarelli M, Swigonski NL. A continuous quality improvement curriculum for residents: addressing core competency, improving systems. Acad Med. 2004;79(10 suppl):S65–S67.
10. Mohr JJ, Randolph GD, Laughon MM, et al. Integrating improvement competencies into residency education: a pilot project from a pediatric continuity clinic. Ambul Pediatr. 2003;3:131–136.
11. Tomolo AM, Lawrence RH, Aron DC. A case study of translating ACGME practice-based learning and improvement requirements into reality: systems quality improvement projects as the key component to a comprehensive curriculum. Postgrad Med J. 2009;85(1008):530–537.
12. Vinci LM, Oyler J, Johnson JK, et al. Effect of a quality improvement curriculum on resident knowledge and skills in improvement. Qual Saf Health Care. 2011;19:351–354.
13. Philibert I. Involving residents in quality improvement: contrasting "top down" and "bottom up" approaches. Accreditation Council for Graduate Medical Education and Institute for Healthcare Improvement 90-day project. 2008.
14. Cooke M, Ironside PM, Ogrinc GS. Mainstreaming quality and safety: a reformulation of quality and safety education for health professions students. BMJ Qual Saf. 2011;20(suppl 1):i79–i82.
15. Wittich CM, Beckman TJ, Drefahl MM, et al. Validation of a method to measure resident doctors' reflections on quality improvement. Med Educ. 2010;44:248–255.
16. Langley GJ, Moen RD, Nolan KM, Nolan TW, et al. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. San Francisco, Calif: Jossey-Bass; 2009.


17. Moses J, Shore P, Mann KJ. Quality improvement curricula in pediatric residency education: obstacles and opportunities. Acad Pediatr. 2011;11:446–450.
18. Buckley JD, Joyce B, Garcia AJ, et al. Linking residency training effectiveness to clinical outcomes: a quality improvement approach. Jt Comm J Qual Patient Saf. 2010;36:203–208.
19. Varkey P, Natt N, Lesnick T, et al. Validity evidence for an OSCE to assess competency in systems-based practice and practice-based learning and improvement: a preliminary investigation. Acad Med. 2008;83:775–780.
20. Accreditation Council for Graduate Medical Education. Common program requirements. Available at: http://www.acgme-nas.org/assets/pdf/CPR-Categorization-TCC.pdf. Accessed April 16, 2012.
21. Hicks PJ, Englander R, Schumacher DJ, et al. Pediatrics milestone project: next steps toward meaningful outcomes assessment. J Grad Med Educ. 2010;2:577–584.
22. Mulder H, Ten Cate O, Daalder R, et al. Building a competency-based workplace curriculum around entrustable professional activities: the case of physician assistant training. Med Teach. 32:e453–e459.
23. Leenstra JL, Beckman TJ, Reed DA, et al. Validation of a method for assessing resident physicians' quality improvement proposals. J Gen Intern Med. 2007;22:1330–1334.
24. Morrison LJ, Headrick L, Ogrinc G, Foster T. The Quality Improvement Knowledge Application Tool: an instrument to assess knowledge application in practice-based learning and improvement. Paper presented at: Society of General Internal Medicine; May 7, 2003; Vancouver, BC.
25. Ogrinc G, Mooney SE, Estrada C, et al. The SQUIRE (Standards for QUality Improvement Reporting Excellence) guidelines for quality improvement reporting: explanation and elaboration. Qual Saf Health Care. 2008;17(suppl 1):i13–i32.