
Impact of Electronic Resident Evaluation Systems on the Timeliness, Rate, and Quality of Responses: Survey of Association of Program Directors in Radiology Members

Phillip M. Boiselle, MD, Martha B. Mainiero, MD

Department of Radiology, Beth Israel Deaconess Medical Center, Boston, Mass; Department of Radiology, Harvard Medical School, Boston, Mass; Department of Radiology, Rhode Island Hospital and Brown University School of Medicine, Providence, RI.

Corresponding author and reprints: Phillip M. Boiselle, MD, Beth Israel Deaconess Medical Center, Department of Radiology, 330 Brookline Avenue, Boston, MA 02215; e-mail: [email protected].

© 2006 American College of Radiology. DOI 10.1016/j.jacr.2006.02.032

Objective: To determine the impact of electronic resident evaluation systems on the timeliness, rate, and quality of responses.

Methods: Surveys were mailed electronically to the membership of the Association of Program Directors in Radiology, which included the directors of 158 residency programs. Respondents were instructed to send 1 response from each program. Information gathered included the use of electronic compared with paper-based evaluation systems and the overall level of satisfaction with such systems (rated on a 5-point scale with 1 = dissatisfied and 5 = satisfied). Questions were asked regarding the impact of electronic systems on the timeliness, rate, and quality of responses.

Results: Seventy-seven responses were received, for an estimated response rate of 49% on a per-program basis. Of these 77 respondents, 45 (58%) used electronic systems and 32 (42%) used paper-based systems. The median level of satisfaction was equivalent (4 = somewhat satisfied) for both groups. Of the 45 respondents using electronic systems, 26 (58%) reported increased response rates, 16 (36%) reported no change, and 3 (7%) reported decreased response rates compared with paper-based systems. Regarding the timeliness of responses, 31 (69%) reported faster responses, 10 (22%) reported no change, and 4 (9%) reported slower responses compared with paper-based systems. Regarding the quality of responses, 25 (56%) reported no change, 12 (27%) reported improved quality, and 8 (18%) reported worse quality compared with paper-based systems.

Conclusion: Electronic systems are generally associated with an improved response rate and enhanced timeliness of responses in comparison with paper-based systems, without adversely affecting the quality of responses.

Key Words: Radiology resident evaluation, electronic evaluation

J Am Coll Radiol 2006;3:807-811. Copyright © 2006 American College of Radiology

INTRODUCTION

The regular evaluation and feedback of residents are essential components of residency training [1-3].


Evaluations are not only desirable but are required by the Accreditation Council for Graduate Medical Education (ACGME) [4], which states that “evaluations of each resident’s progress and competence should be conducted preferably at the end of each rotation, but not less than four times yearly.” Traditionally, such evaluations have been performed using paper-based systems, which are limited by their labor-intensiveness and challenges related to distribution, compliance, confidentiality, and the compilation and reporting of data [5]. Recently, web-based, electronic evaluation systems have been introduced, offering the potential to overcome some of these limitations.


To date, however, there are only limited data from single institutions regarding the impact of electronic systems [5-7].

We undertook this survey to assess the current practices of the members of the Association of Program Directors in Radiology (APDR) regarding the use of electronic resident evaluation systems. We were specifically interested in learning about the impact of adopting an electronic system on the response rate, the timeliness of responses, and the quality of responses. We were also interested in determining whether those respondents using electronic evaluation systems were following the ACGME’s recommendations with regard to maintaining hard-copy records of certain types of resident evaluations [8]. In this article, we discuss the results of our survey and review the ACGME’s recommendations for electronic evaluation systems.

MATERIALS AND METHODS

In October 2005, we electronically mailed a questionnaire regarding resident evaluation systems to the electronic addresses of the membership of the APDR, which included the directors of 158 residency programs. Respondents were instructed to send 1 response from each program. The survey was electronically mailed a second time in November 2005 to enhance the response rate. The specific survey questions are listed in the Appendix.

The initial question asked whether a respondent’s program used an electronic or paper-based evaluation system. Those who replied “paper-based” were directed to the final 3 questions (8, 9, and 10) of the survey, which asked the respondents to rate their overall levels of satisfaction with their evaluation systems using a 5-point scale (1 = dissatisfied, 2 = somewhat dissatisfied, 3 = neutral, 4 = somewhat satisfied, and 5 = satisfied) and to identify their types of practice setting (academic, private, or mixed) and the sizes of their residency programs.

For those who replied “electronic” to the first question, subsequent questions gathered information about the types of system used (commercial systems or systems created by their universities, hospitals, or departments) and the perceived impact of adopting an electronic system on the response rate (defined as the percentage of completed evaluation forms received), the timeliness of responses (defined as the speed with which the forms are completed and returned), and the quality of responses (defined as the amount of detailed, descriptive information provided about residents’ performance) in comparison with when the respondents used paper-based systems. Additional questions gathered information regarding the maintenance of hard-copy records of evaluations. Finally, these respondents were also asked to complete the 3 common questions (8, 9, and 10) for all respondents at the end of the survey.

The 2 groups of respondents (those using electronic systems and those using paper-based systems) were compared with regard to practice setting and size of residency program. Fisher’s exact test was used to determine the statistical significance of these differences, with P values less than .05 considered statistically significant.
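To illustrate the statistical comparison described above, the following Python sketch applies Fisher’s exact test to the practice-setting counts reported in Table 1. It assumes the 3 × 2 table was collapsed to a 2 × 2 table (academic vs. nonacademic); that collapse is our assumption for illustration, not a detail stated in the article, and scipy.stats.fisher_exact handles only 2 × 2 tables.

    # A minimal sketch of the group comparison, assuming the practice-setting
    # categories were collapsed to academic vs. nonacademic (an assumption;
    # the article does not specify how the 3 x 2 table was analyzed).
    from scipy.stats import fisher_exact

    # Rows: academic, nonacademic (private + mixed); columns: electronic, paper.
    # Counts taken from Table 1 of the article.
    table = [
        [38, 19],          # academic: 38 of 45 electronic, 19 of 32 paper
        [2 + 5, 3 + 10],   # nonacademic: private + mixed
    ]

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.3f}")

    # The article reports P = .018 for practice setting; P values below .05
    # were considered statistically significant.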

RESULTS

Seventy-seven completed responses were received, for an estimated response rate of 49% on a per-program basis. Of these 77 respondents, 45 (58%) used electronic evaluation systems and 32 (42%) used paper-based evaluation systems.

When the 45 respondents using electronic systems were asked to estimate the impact of changing from paper-based systems to electronic systems on their evaluation response rates, 26 (58%) reported increases, 16 (36%) reported no change, and 3 (7%) reported decreases compared with when they used paper-based systems. Regarding the timeliness of responses, 31 (69%) reported faster responses, 10 (22%) reported no change, and 4 (9%) reported slower responses compared with when they used paper-based systems. Regarding the quality of responses, 25 (56%) reported no change, 12 (27%) reported improved quality, and 8 (18%) reported worse quality compared with when they used paper-based systems.

Of the 45 respondents using electronic systems, 34 (76%) used commercial systems, 6 (13%) used systems developed by their hospitals or universities, and 5 (11%) used systems developed by their departments. Of the 34 respondents using commercial systems, the most commonly reported vendors were New Innovations, Inc. (Uniontown, Ohio) (n = 17) and E*Value (Advanced Informatics, LLC, Minneapolis, Minn) (n = 10). Regarding the length of time the 45 respondents had been using electronic evaluation systems, 31 (69%) reported more than 2 years, 5 (11%) reported 1 to 2 years, and 9 (20%) reported 6 months to 1 year.

Regarding the maintenance of paper (hard-copy) records of electronic evaluations, 26 of 45 respondents (58%) reported that they maintained paper records for all resident rotation evaluations, 34 (76%) reported that they maintained paper records for rotation evaluations of residents with academic or other performance problems, and 42 (93%) reported that they maintained paper records for residents’ final evaluations at the completion of training.

The median level of satisfaction was equivalent (4 = somewhat satisfied) for the groups of respondents using electronic and paper-based systems. The practice settings and sizes of the residency programs for these 2 groups are compared in Tables 1 and 2, respectively.


Table 1. Practice settings of respondents

Practice Setting    Electronic System (n = 45)    Paper System (n = 32)
Academic                     38 (85%)                     19 (59%)
Private                       2 (4%)                       3 (9%)
Mixed                         5 (11%)                     10 (31%)

Those using electronic systems were more likely to practice in academic settings than those using paper-based systems (P = .018). Although there was a trend for those using electronic systems to be associated with larger residency programs than those using paper-based systems, this difference did not reach statistical significance (P = .07).

DISCUSSION

Evaluation and feedback are essential components of residency training and requirements of the ACGME [1-4]. Although paper-based systems have traditionally been used for resident evaluations, our results show that a slight majority of APDR members are now using electronic evaluation systems. Our results also suggest that the use of electronic systems is generally associated with an improved response rate and enhanced timeliness of responses compared with paper-based systems, without adversely affecting quality.

Gale et al [9] reported the use of a computerized approach to radiology resident evaluations in 1997. Despite subsequent reports in the literature about the potential benefits of electronic evaluation systems [5-7], radiology residency programs have been relatively slow to adopt electronic systems during the past decade. Collins et al [10] surveyed radiology residency programs in January 2003 and found that 19 of 99 responding programs reported using electronic systems at that time. The actual prevalence of electronic evaluation system use at that time is difficult to ascertain, however, because fewer than one third of respondents completed the survey questions related to the use of electronic systems. Our survey, which was performed 22 months later, found that 45 of 77 responding programs were using electronic systems.

Considering that almost half of the programs responding to our survey were still using paper-based systems, there is a need for data regarding the results that one can reasonably expect from adopting an electronic evaluation system. In the following paragraphs, we discuss our results regarding the reported impact of adopting an electronic evaluation system on 3 important endpoints: the timeliness, rate, and quality of responses.

Paper-based evaluation systems are limited by inherent delays related to the distribution and return of surveys, which are often exacerbated by inefficient internal mailing systems in many hospitals [5].

In contrast, electronic systems allow for rapid notification that evaluations are due and for the electronic submission of evaluations [5]. Thus, one would anticipate enhanced timeliness of responses with electronic systems. In our survey, the majority (69%) of respondents using electronic systems reported enhanced timeliness of responses compared with when they used paper-based systems. It is important to note that the enhanced timeliness of evaluations has the potential to aid in the prompt identification of and response to resident performance concerns [5].

Paper-based systems have also traditionally been limited by suboptimal compliance [5]. A previous study by Williamson et al [6] reported a marked increase in evaluation response rates at their institution after the introduction of a web-based evaluation system. In our survey, a slight majority (58%) of respondents using electronic evaluations reported increases in response rates compared with when they used paper-based systems. It is interesting to note, however, that a substantial minority of our respondents did not perceive increases in response rates when they changed to electronic systems. Because faculty members’ compliance with resident evaluation systems is likely related to multiple factors (including individual motivation, perceived significance, rewards, and costs), it is not surprising that the use of an electronic system does not guarantee improved compliance at all institutions.

Quality is a critical issue in any evaluation process, and the importance of detailed, descriptive information about residents’ performance should not be overlooked when assessing an evaluation system. A previous study of an online faculty appraisal instrument by Boiselle et al [7] showed that a detailed narrative evaluation could be successfully implemented electronically. In that study, the vast majority of residents and faculty members found the narrative components of evaluations more helpful than the quantitative aspects. In our survey, a slight majority (56%) of respondents using electronic evaluation systems reported no change in the quality of responses compared with when they used paper-based systems, whereas roughly one quarter reported improved quality. Thus, electronic evaluation systems can be used without sacrificing quality.
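To make the notification mechanism concrete, the following Python sketch shows one way an evaluation system might flag overdue evaluations and e-mail reminders to faculty members. This is purely illustrative: the record structure, field names, addresses, and SMTP host are our own hypothetical choices, not details of any system surveyed in this study.

    import smtplib
    from datetime import date
    from email.message import EmailMessage

    # Hypothetical records of pending evaluations; in a real system these
    # would come from the evaluation database.
    pending = [
        {"faculty_email": "[email protected]", "resident": "Resident A",
         "due": date(2005, 10, 15)},
        {"faculty_email": "[email protected]", "resident": "Resident B",
         "due": date(2005, 11, 1)},
    ]

    def send_reminders(evaluations, today=None, smtp_host="localhost"):
        """E-mail a reminder for every evaluation past its due date."""
        today = today or date.today()
        with smtplib.SMTP(smtp_host) as server:
            for ev in evaluations:
                if ev["due"] < today:
                    msg = EmailMessage()
                    msg["Subject"] = f"Overdue rotation evaluation: {ev['resident']}"
                    msg["From"] = "[email protected]"
                    msg["To"] = ev["faculty_email"]
                    msg.set_content(
                        f"The rotation evaluation for {ev['resident']} was due "
                        f"on {ev['due']:%B %d, %Y}. Please complete it online."
                    )
                    server.send_message(msg)

    send_reminders(pending)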

Table 2. Sizes of residency programs of respondents

Size of Program     Electronic System (n = 45)    Paper System (n = 32)
<30 residents                16 (36%)                     27 (84%)
>30 residents                29 (64%)                      5 (16%)


Electronic evaluation systems have also been reported to have additional advantages over paper-based systems, including easier compilation and tracking of data, the ability to electronically send reminders and reports to residents and faculty members, and a reduced need for administrative time [5]. We did not assess these specific measures in our survey. Regarding potential disadvantages of electronic systems, because most programs use commercial software, there is a direct cost involved. Although this cost is likely offset by reduced costs for administrative support, we emphasize that a cost-benefit analysis is beyond the scope of this article and that the balance will likely differ among programs depending on residency program size and the type of electronic system used.

In recognition of the increased use of electronic evaluation systems by residency programs, the ACGME and the Council of Review Committee Chairs have provided recommendations for those using electronic systems [8]. They state that paper records should always be maintained for final resident evaluations at the end of training. On the basis of our survey results, the vast majority (93%) of programs using electronic evaluations were compliant with this recommendation. The ACGME also recommends maintaining hard-copy records for residents with academic or other performance problems, because electronic evaluations may not be sufficient in cases in which remediation, probation, or dismissal needs to be documented [8]. On the basis of our survey results, nearly one quarter of the respondents using electronic evaluations were not compliant with this recommendation. We anticipate that many of those who were not compliant may be unaware of these recommendations.

Presently, the majority of APDR members who are using electronic evaluation systems are in academic settings. Considering the benefits reported with electronic systems, we anticipate that a growing number of program directors in all practice settings will adopt an electronic format for evaluations in the near future. It is important to note that the use of electronic systems does not entirely replace paper records. Thus, residency program directors using electronic systems should be familiar with current ACGME recommendations regarding hard-copy records of certain evaluations.

In summary, electronic systems are used by a slight majority of APDR members and are generally associated with an improved response rate and enhanced timeliness of responses in comparison with paper-based systems, without adversely affecting the quality of responses.

REFERENCES

1. Collins J. Evaluation of residents, faculty, and program. Acad Radiol 2003;10(suppl 1):S35-43.
2. Gunderman RB. Feedback in radiologic education. Acad Radiol 2002;9:446-50.
3. Boiselle PM. A remedy for resident evaluation and remediation. Acad Radiol 2005;12:894-900.
4. Appendix 1: ACGME program requirements for residency education in diagnostic radiology. Acad Radiol 2003;10(suppl 1):S102-8.
5. Rosenberg ME, Watson K, Paul J, et al. Development and implementation of a web-based evaluation system for an internal medicine residency program. Acad Med 2001;76:92-5.
6. Williamson KB, Jackson VP, Shuman LA, Stiefel MD, Gunderman RB. Online evaluation in radiology residency programs. Acad Radiol 2003;10:83-6.
7. Boiselle PM, Jennette R, Donohoe K. Evaluation of an online faculty appraisal instrument: comparison of resident and faculty perceptions. Acad Radiol 2004;11:1071-7.
8. Accreditation Council for Graduate Medical Education. Electronic evaluation systems. Available at: http://www.acgme.org/acWebsite/fieldStaff/fs_electEvalSystem.asp. Accessed January 17, 2006.
9. Gale ME. Resident evaluations: a computerized approach. Am J Roentgenol 1997;169:1225-8.
10. Collins J, Herring W, Kwakwa F, et al. Current practices in evaluating radiology residents, faculty, and programs: results of a survey of radiology residency program directors. Acad Radiol 2004;11:787-94.

APPENDIX

Questionnaire

1. Which ONE of the following best describes your current evaluation system (check one)?
   a. Paper-based system.
   b. Electronic system.
   If you answered “a” (paper-based system), go directly to question 8. If you answered “b” (electronic system), go to question 2.

2. Check the ONE item that best describes your electronic evaluation system:
   a. Commercial system (purchased from a vendor). List manufacturer (if known): ____________
   b. Hospital- or university-wide system (developed by your hospital or university but not specifically by your radiology department).
   c. Developed by your radiology department.
   d. Other (please describe: __________________________________).

3. Approximately how long have you been using an electronic evaluation system? (Check the ONE option that most closely applies):
   a. <6 months
   b. 6 months to 1 year
   c. 1 to 2 years
   d. >2 years


4. Compared to when you used a paper-based evaluation system, what has been the impact of the change to an electronic evaluation system on your evaluation response rate (percentage of completed evaluation forms)? (Check the ONE option that most closely applies):
   a. No change.
   b. Increase. Estimated percentage increase in response rate: ______
   c. Decrease. Estimated percentage decrease in response rate: ______

5. Compared to when you used a paper-based evaluation system, what has been the impact of the change to an electronic evaluation system on the timeliness of responses (the speed with which the forms are completed and returned to you)? (Check the ONE option that most closely applies):
   a. No change.
   b. Faster.
   c. Slower.

6. Compared to when you used a paper-based evaluation system, what has been the impact of the change to an electronic evaluation system on the quality of responses (eg, the amount of detailed, descriptive information that you receive about resident performance)? (Check the ONE option that most closely applies):
   a. No change.
   b. Better quality.
   c. Worse quality.

7. For each of the following three items (a, b, c), please answer “yes” or “no” regarding whether you maintain paper (hard-copy) records for the following:

   a. All resident rotation evaluations.
   b. Rotation evaluations of residents with academic or other performance problems.
   c. Final evaluation at the completion of training.

8. What is your overall level of satisfaction with the evaluation system that you are using? Choose a number between 1 and 5 on the following scale:
   1 - Dissatisfied
   2 - Somewhat dissatisfied
   3 - Neutral
   4 - Somewhat satisfied
   5 - Satisfied

9. What is the approximate size of your residency program (total number of residents in the entire program, rather than number of residents per year)?
   a. <12
   b. 13-20
   c. 21-30
   d. 31-40
   e. >40

10. Which ONE of the following MOST CLOSELY describes the practice setting of your program?
   a. Academic/university hospital.
   b. Private practice.
   c. Mixed academic and private practice.

Please feel free to add additional narrative comments in the following text box: