Faculty evaluations: Streamlining the process


9-65 Faculty Evaluations: Streamlining the Process
Bruce R. Baumgartner, MD, Emory University Hospital, Atlanta, GA

Purpose: Resident evaluation of the faculty is required by the RRC. Assessment of these evaluations is a time-intensive process in a large residency program. We investigated a new method to make this process more efficient and thorough.

Materials and Methods: A written evaluation form that includes all faculty in a given division is completed by each resident after each rotation. The number of faculty per division ranges from two to eight. 42 residents with 12 monthly rotations would generate 504 evaluations per year. The University Testing Center developed a computer-scannable form. The residents complete these forms by blackening the appropriate circle for each of eight questions for each individual faculty member with whom they worked during the rotation. The answer scale ranges from 1 to 5.

Results: After scanning, the results are available in ASCII format. An average evaluation score can be determined for each faculty member and for each division from all evaluating residents. The average scores for each question can also be calculated for divisions and individual faculty.

Conclusion: The use of a scannable evaluation form facilitates faculty evaluations by the residents.

Learning Objectives: 1. To provide more thorough internal assessment of residency program strengths and weaknesses. 2. To provide more complete and timely feedback to the faculty about their evaluations.
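The aggregation step described in the Results (per-faculty and per-division averages computed from the scanner's ASCII output) can be sketched as follows. This is a minimal illustration, not the authors' actual software; the record layout, names, and scores are hypothetical assumptions.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical parsed records from the scanner's ASCII output:
# (resident ID, division, faculty member, eight question scores on a 1-5 scale).
# The actual field layout of the Testing Center's output is not specified
# in the abstract; this structure is assumed for illustration.
records = [
    ("R01", "Body", "FacA", [4, 5, 4, 4, 5, 4, 4, 5]),
    ("R01", "Body", "FacB", [3, 3, 4, 3, 3, 4, 3, 3]),
    ("R02", "Body", "FacA", [5, 5, 5, 4, 5, 5, 4, 5]),
]

# Pool all question scores per faculty member and per division.
by_faculty = defaultdict(list)
by_division = defaultdict(list)
for resident, division, faculty, scores in records:
    by_faculty[faculty].extend(scores)
    by_division[division].extend(scores)

# Average evaluation score for each faculty member and each division.
faculty_avg = {f: round(mean(s), 2) for f, s in by_faculty.items()}
division_avg = {d: round(mean(s), 2) for d, s in by_division.items()}
```

With 504 evaluations of eight questions each per year, this kind of scripted aggregation replaces the manual tallying the abstract describes as time-intensive.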

9-66 Radiology Faculty Evaluation: Bi-Institutional Implementation of an Experimental Appraisal Instrument
Jannette Collins, MD, University of Wisconsin Hospital, Madison, WI; Mark A. Albanese, PhD; Kathleen A. Scanlan, MD; Pamela A. Prupeck, MD; Valerie P. Jackson, MD

Purpose: To determine the reliability and validity of an experimental radiology faculty appraisal instrument.

Materials and Methods: In a previous study, we developed a 53-item behavioral experimental faculty appraisal instrument using critical incident interviewing. In this study, 20/20 residents from the University of Wisconsin evaluated 29/33 faculty members, and 37/40 residents from Indiana University evaluated 31/32 faculty members using the experimental instrument. Residents also evaluated faculty using their institution's existing appraisal instrument.

Results: Correlations between the old and experimental forms were .69 and .87 for the University of Wisconsin and Indiana University, respectively. Existing form reliabilities were .89 and .94, and experimental form reliabilities were .98 and .98. The experimental form was shortened to 30 items by eliminating the questions correlating least with section scores. Reliabilities of scores on the shortened form were .97 and .98, and these scores correlated .65 and .88 with scores on the old form.

Conclusion: Ratings obtained with the existing forms correlated substantially with the experimental form, attesting to the experimental form's validity. The shortened experimental form's high internal-consistency reliability and its correlations with the old form indicate that shortening the form had minimal effects on the reliability and validity of the data obtained.

Financial Disclosure Statement: This study was funded, in part, by a grant from the Office of Medical Education Research and Development, University of Wisconsin Medical Science Center.

Learning Objectives: 1. Initial implementation of a previously reported experimental radiology faculty appraisal instrument shows the instrument to be valid and reliable. 2. The experimental instrument can be significantly shortened without compromising the data obtained.


9-67 Evaluating Didactic Teaching in a Radiology Training Program: Responsiveness of Lecturers to Resident Feedback
Bruce H. Lin, MD, University of Chicago, Chicago, IL; Tamar E. Ben-Ami, MD

Purpose: To identify universal factors that residents deem most valuable in didactic conferences and to assess lecturers' responsiveness to resident feedback.

Methods: Residents' evaluations of faculty didactic course lectures were reviewed for 3 consecutive years. 23 faculty members from 7 sections consistently gave lectures each year. These were quantitatively evaluated on the following factors: fulfillment of purpose, audiovisual material, handouts, organization/presentation, appropriateness to clinical experience, overall educational value, and additional comments. Tabulated results were returned to the respective lecturers.

Results: Absence or quality of handouts, time management/pace of lectures, and level of material were the 3 most commented-on features. After the first year, 8 (88%) lecturers had declining scores on the quality of handouts; 7 (88%) of them showed improvement the following year. Similarly, 5 of 8 (63%) responded to declining scores on presentation/organization, and 7 of 7 (100%) lecturers adjusted the content of their material to better fulfill the purpose of the lecture. Interestingly, only 1 of 21 lecturers improved on decreasing scores in audiovisual material.

Conclusions: Residents identified pace/organization of lectures, handouts, level of material, and audiovisual aids as important universal factors in presenting an effective didactic conference. Faculty were generally receptive to such specific quantitative feedback except in the category of audiovisual material.

Learning Objectives: 1. Identify specific features of didactic teaching sessions which are more important to residents. 2. Factor-specific quantitative feedback from residents can be effectively used to improve didactic teaching conferences.
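The responsiveness analysis above amounts to comparing each lecturer's factor scores across consecutive years: flag a decline after year 1, then check whether year 3 shows recovery. A minimal sketch of that comparison, with invented lecturer names and scores (the abstract does not publish per-lecturer data):

```python
# Hypothetical per-lecturer average scores on one factor (e.g., handout
# quality) across three consecutive academic years.
scores = {
    "LecturerA": [4.2, 3.8, 4.1],   # declined after year 1, then improved
    "LecturerB": [3.9, 3.5, 3.4],   # declined, no recovery
}

# Lecturers whose score dropped after the first year of feedback.
declined = {lec for lec, (y1, y2, _) in scores.items() if y2 < y1}

# Of those, lecturers who improved again the following year,
# i.e., who responded to the quantitative feedback.
recovered = {lec for lec, (y1, y2, y3) in scores.items() if y2 < y1 and y3 > y2}
```

Running this per factor (handouts, organization, audiovisual, etc.) yields exactly the style of fractions reported in the Results, such as 5 of 8 responding on presentation/organization.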

9-68 Radiology Residency Programs and the Americans with Disabilities Act (ADA)
Karen A. Kurdziel, MD, Medical College of Virginia, Richmond, VA

Purpose: The responsibilities of a radiology residency program with regard to the Americans with Disabilities Act (ADA) lie between those of an employer and those of a training program. This paper explores the current level of understanding of the ADA among the directors of university-affiliated radiology residency programs. It also provides information useful in further educating residency programs on how to comply with the ADA.

Materials and Methods: 119 university-affiliated radiology residency programs in the U.S. were surveyed on the perceived accessibility of the hospital and workplace, the program's past experience with disabled residents, and the director's familiarity with the ADA.

Results: There was a 45% response rate, with results as follows:
Workplace is accessible: 91%
Accessible public transportation and/or handicap parking: 71%
Automated fluoroscopy controls, or support staff available to assist: 82%
Familiarity with the ADA: 78%
Past experience with a disabled resident/staff member: 33%

Conclusion: Most programs reported their workplaces to be accessible and reported some familiarity with the ADA. However, a more in-depth understanding of the ADA and its role in residency programs is needed. This paper explores some of the pertinent aspects of the ADA and discusses the responsibilities of both the training program and the individual with a disability.

Learning Objectives: 1. Understand the importance of being aware of the ADA and how it affects radiology residency programs now and in the future. 2. Understand key components of the ADA and your role in compliance. 3. Understand that the goal of the ADA is to create equal training opportunities for all residents, including qualified individuals with a disability.