Key Criteria for Selection of Radiology Residents: Results of a National Survey

Hansel J. Otero, MD, Sukru M. Erturk, MD, Silvia Ondategui-Parra, MD, MPH, Pablo R. Ros, MD, MPH

From the Department of Radiology, Brigham and Women's Hospital, 75 Francis Street, Boston, MA 02110. Received April 17, 2006; accepted June 20, 2006. Address correspondence to: H.J.O. e-mail: [email protected]

Acad Radiol 2006; 13:1155-1164. doi:10.1016/j.acra.2006.06.012
Rationale and Objectives. We sought to identify the criteria that academic radiology departments in the United States consider in selecting their residents.

Materials and Methods. In a cross-sectional study, a validated survey was sent to all program directors of radiology residency programs. A total of 25 variables were studied. Descriptive statistics and correlations were calculated by the χ² test. Nonparametric correlations were calculated with the Kruskal-Wallis rank test. Statistical significance was set at the 5% α-error level (P < .05).

Results. We had a response rate of 53.1% (77 of 145). All responders participate in the National Resident Matching Program (NRMP), and 93.5% fill all their positions through the NRMP. The preinterview selection criteria showed no significant difference by size, region, or affiliation with a medical school. An "interviewing body" carries out the interview process in 87.3% of cases. Residents and fellows are part of the interviewing body in 76.5% of programs, the body has the final word in accepting candidates in 62.9% of programs, 55.4% of programs use score sheets during interviews with candidates, and only 6.5% of programs perform panel interviews. Programs associated with a medical school are significantly more likely to have more members in their interviewing body and to use score sheets when evaluating candidates, and panel interviews (more than one candidate or interviewer) are significantly more common among programs in the Northeast region.

Conclusion. All preinterview selection criteria and some interview structural characteristics are independent of a program's size, region, or affiliation with a medical school. More research regarding optimal preselection and interview processes is needed, and closer attention should be paid to the NRMP process if current practices are to be maintained.

Key Words. Radiology residency; candidate selection.

© AUR, 2006
Directors of radiology residency programs share the ultimate goal of choosing the most competent and proficient candidates for their programs by using selection measures that will predict residency performance (1) and identify residents who are the best matches for the department and for whom the department is the best choice (2). The selection process usually includes a preselection review of medical school performance and an interview phase carried out before deciding which candidates will be accepted. Several studies seeking to identify which selection criteria best predict the in-training performance of radiology residents (1, 3-5) have consistently found a lack of correlation between academic performance during medical school or preclinical training and later performance during residency training (1, 6-10), suggesting that the assessment of noncognitive abilities is key to successful recruitment (1, 11, 12).

Radiology is a specialty in high demand; each year residency programs receive hundreds of applications for a few residency posts, and great financial and human resources are used to screen and evaluate those applicants (13, 14). The success of the recruitment process determines the success of the program if we assume that "a program is only as good as its residents" (2). Surveys used to evaluate current practices and to gain a better understanding of recruiting trends suggest that the process for recruiting radiology residents is highly variable and somewhat correlated with the location and type of program (15). However, to the best of our knowledge, no current information about this process has been published, and new trends have not been described in more than a decade.

We sought to identify the criteria for preselection of candidates, to describe the interview process, and to determine how the final decision-making is organized in radiology residency programs, while attempting to identify significant differences in these practices among regions and among departments and hospitals of different sizes across the United States.
MATERIALS AND METHODS

Survey

We conducted a cross-sectional, multi-institutional survey study among academic radiology departments across the United States from August 2005 to January 2006 to identify the criteria they consider in selecting their residents. Our institutional review board approved the study. The 145 program directors listed by Practice Sight, Inc. received an email explaining the purpose of our study, with an embedded link to complete the survey as well as a link to remove their names from the list of recipients if they did not wish to participate (16). The survey was available online through a web-based commercial site (surveymonkey.com) (17). The process was automated, and questionnaires were sent again automatically to nonresponding hospitals for a total of six rounds at 2-week intervals. The responses were kept confidential and anonymous. The survey took approximately 10-15 minutes to complete. Because none of the questions were mandatory, not all of the returned questionnaires were answered completely, a fact taken into consideration in the results section.

The questionnaire initially assessed the general organizational characteristics of the institutions surveyed, including region, number of beds, and volume of radiologic examinations. The remaining questions were divided into three sections: 1) preresidency indicators, 2) intraresidency indicators, and 3) postresidency indicators. For this article, we analyzed the 25 variables studied under the preresidency indicators section, including eight open-ended questions. The questionnaire is included as Appendix 1.
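The reminder cycle described above (six mailing rounds at 2-week intervals, sent only to directors who had neither responded nor opted out) can be sketched as a simple polling loop. The snippet below is a minimal illustration only, not the commercial survey site's actual automation; `send_invitation` and `simulate_responses` are hypothetical stand-ins.

```python
import random

ROUND_INTERVAL_DAYS = 14  # reminders went out every 2 weeks
MAX_ROUNDS = 6            # six mailing rounds in total

def send_invitation(email: str) -> None:
    # Hypothetical placeholder for the survey site's mailing step.
    print(f"  emailed survey link to {email}")

def simulate_responses(pending: set[str]) -> set[str]:
    # Hypothetical stand-in for polling the site for new completions:
    # here each pending director responds with 50% probability per round.
    return {e for e in pending if random.random() < 0.5}

def run_survey_rounds(all_directors: set[str]) -> set[str]:
    responded: set[str] = set()
    for round_number in range(1, MAX_ROUNDS + 1):
        pending = all_directors - responded
        if not pending:
            break  # every director has responded
        print(f"Round {round_number}: sending {len(pending)} invitations")
        for email in sorted(pending):
            send_invitation(email)
        responded |= simulate_responses(pending)
        # In the real workflow, the next round ran ROUND_INTERVAL_DAYS later.
    return responded

directors = {"[email protected]", "[email protected]", "[email protected]"}
print(f"{len(run_survey_rounds(directors))} of {len(directors)} responded")
```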
Table 1. Sample Demographics

                                      Number of Programs    Percentage
U.S. region
  Pacific                             11                    14.3%
  Southwest                           6                     7.8%
  Midwest                             19                    24.7%
  Northeast                           28                    36.4%
  South                               13                    16.9%
Hospital size (operational beds)
  Less than 200 beds                  5                     6.5%
  Between 200 and 500 beds            24                    31.2%
  More than 500 beds                  48                    62.3%
Department size (exams/year)
  Less than 200,000                   17                    22.08%
  Between 200,000 and 400,000         37                    48.05%
  More than 400,000                   23                    29.87%
Association with a medical school
  Yes                                 64                    91.43%
  No                                  6                     8.57%
Statistical Methods

First, a descriptive analysis was performed to assess the general characteristics of the responding institutions and the organization, characteristics, and mechanisms used to select residents. Second, departments were grouped according to geographic region (Pacific, Southwest, Midwest, Northeast, and South), number of operational beds (<200, 200-500, and >500 beds), radiologic examination volume (<200,000, 200,000-400,000, and >400,000 examinations performed per year), and association or lack of association with a medical school (see sample demographics in the results section, Table 1) and compared by a Pearson χ² test. Third, nonparametric correlation analysis with the Kruskal-Wallis rank test was performed to assess the importance assigned to each of the candidate selection criteria. Statistical significance was set at the 5% α-error level (P < .05).
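As an illustration of this analysis pipeline, the sketch below runs a Pearson χ² test on a program-characteristic contingency table and a Kruskal-Wallis test on importance scores grouped by medical school affiliation. The numbers are invented for demonstration; only the choice of tests mirrors the methods described above.

```python
import numpy as np
from scipy.stats import chi2_contingency, kruskal

# Hypothetical contingency table: use of score sheets (rows: yes / no)
# by medical school affiliation (columns: affiliated / not affiliated).
table = np.array([[40, 2],
                  [24, 4]])
chi2, p_chi2, dof, _expected = chi2_contingency(table)
print(f"Pearson chi-square: chi2={chi2:.2f}, dof={dof}, P={p_chi2:.3f}")

# Hypothetical 0-10 importance scores for one criterion (e.g., USMLE scores),
# grouped by affiliation; Kruskal-Wallis compares the rank distributions.
affiliated = [9, 8, 10, 9, 7, 8, 9]
unaffiliated = [7, 8, 6, 9, 7]
h_stat, p_kw = kruskal(affiliated, unaffiliated)
print(f"Kruskal-Wallis: H={h_stat:.2f}, P={p_kw:.3f}")

# Apply the study's significance threshold (5% alpha-error level).
ALPHA = 0.05
print("significant" if p_kw < ALPHA else "not significant")
```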
RESULTS

General Characteristics

A total of 77 (53.1%) of the 145 surveyed program directors responded to the questionnaire.
Of the responding programs, 27 (36%) were in the Northeast, 18 (24%) were in the Midwest, 13 (17.3%) were in the South, 11 (14.7%) were in the Pacific region, and 6 were in the Southwest. The response rates by region were as follows: Northeast, 27 of 50 (54%); Midwest, 18 of 36 (50%); South, 13 of 28 (46%); Pacific, 11 of 17 (70%); and Southwest, 6 of 14 (43%). Of the responders, 48 (64%) belonged to hospitals with more than 500 operational beds; 23 (30.7%), to hospitals with 200-500 beds; and 4 (5.3%), to hospitals with fewer than 200 beds. Thirty-six (48%) of the programs are part of a radiology department that performs between 200,000 and 400,000 examinations per year, 22 (29.3%) perform more than 400,000 examinations per year, and 17 (22.7%) perform fewer than 200,000 examinations per year. Of the programs, 71 (93.5%) were associated with a medical school. General demographics of the sample are summarized in Table 1.

Candidates' Preselection Process

[Figure 1. Relative importance (0 = lowest; 10 = highest) of individual criteria for preselection of candidates before interview.]

Responding programs assigned a 0-10 score to indicate the importance given to each criterion in selecting candidates for interview (Fig. 1). According to respondents, USMLE scores were the most important criterion, with a score of 8.65, followed by the dean's letter (7.52), class ranking during medical school (7.50), letters of recommendation (7.36), and honor society membership (7.24). The least important criteria were volunteer and employment experience, with scores of 5.02 and 5.07, respectively. Other criteria mentioned were a prior rotation at the institution, social engagement, and GPA during medical school. All these criteria are weighed in deciding whom to select for interview.

Interviewing Applicants

[Figure 2. Average number of one-on-one interviews that each candidate undergoes before being accepted.]

Of the responding programs, 87.3% use an interviewing body for the interview process; 76.5% of these bodies include residents and fellows, 44.6% have five or fewer members, and only 10.7% have more than 11 members. The number of members is significantly higher (P < .05) among programs associated with a medical school. All members of the interviewing body vote in ranking candidates in 88.1% of the programs; having only some of the members vote is significantly (P < .05) more common among midsize departments (200,000-400,000 examinations per year). In 54.2% of the programs, all members of the interviewing body interview all applicants. The interview includes the use of a formal checklist in 60% of the programs and score sheets to rate performance in 55.4%. The use of score sheets was significantly higher (P < .05) among programs associated with a medical school (58%). Only five (6.5%) of the respondents' programs use panel interviews (multiple applicants or multiple interviewers at the same time); this practice is significantly more common (P < .05) among programs in the Northeast. In 87.1% of the programs, applicants undergo three or more one-on-one interviews (Fig. 2).

Final Decision

In an open-ended question, 15 directors stated that the "fit" of candidates in the program and a "gut" feeling were the most important criteria for deciding admission. The interviewing body is responsible for making the final ranking in 62.9% of the programs, while the program director has the final word in 33.8%.
This variable was independent of hospital and department size, association with a medical school, and regional location of the program.

All the surveyed programs participate in the National Resident Matching Program (NRMP, or "Match"), and the great majority (93.5%) fill all their positions through the Match. This process couples the preselection screening with the interview process.
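For context on how the Match turns these rank lists into placements: the NRMP runs an applicant-proposing stable-matching algorithm (a Roth-Peranson extension of Gale-Shapley deferred acceptance that also handles couples and other constraints). The sketch below is a simplified single-applicant deferred-acceptance illustration with invented names, not the NRMP's production algorithm.

```python
def deferred_acceptance(applicant_prefs, program_prefs, quotas):
    """Applicant-proposing deferred acceptance (simplified Gale-Shapley).

    applicant_prefs: dict applicant -> ordered list of programs
    program_prefs:   dict program -> ordered list of applicants (rank order list)
    quotas:          dict program -> number of positions
    """
    # Lower index = more preferred on the program's rank order list.
    rank = {p: {a: i for i, a in enumerate(prefs)} for p, prefs in program_prefs.items()}
    next_choice = {a: 0 for a in applicant_prefs}   # next program each applicant will try
    tentative = {p: [] for p in program_prefs}      # applicants tentatively held by each program
    free = list(applicant_prefs)                    # applicants still seeking a position

    while free:
        a = free.pop()
        if next_choice[a] >= len(applicant_prefs[a]):
            continue  # applicant exhausted their list: remains unmatched
        p = applicant_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:
            free.append(a)  # program did not rank this applicant; try the next program
            continue
        tentative[p].append(a)
        tentative[p].sort(key=lambda x: rank[p][x])
        if len(tentative[p]) > quotas[p]:
            bumped = tentative[p].pop()  # displace the least-preferred applicant over quota
            free.append(bumped)
    return tentative

# Tiny worked example with invented applicants and programs.
match = deferred_acceptance(
    applicant_prefs={"A": ["P1", "P2"], "B": ["P1"], "C": ["P1", "P2"]},
    program_prefs={"P1": ["B", "A", "C"], "P2": ["C", "A"]},
    quotas={"P1": 1, "P2": 2},
)
print(match)  # {'P1': ['B'], 'P2': ['C', 'A']}
```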
DISCUSSION

We identified the criteria used by radiology residency programs in the United States to select their residents and the mechanism by which they carry out the selection process. By using a user-friendly web-based service, we achieved a response rate that we believe represents, both geographically and by size, a good sample and that allowed us to make some generalizations about the residency selection process. This study provides the most comprehensive current empirical data regarding the factors that directors of diagnostic radiology programs consider most important during the residency selection process.

We found that USMLE performance was assigned the highest score in selecting applicants for an interview. Several medical specialties have reported this as a common practice (4, 18-22). The evidence is controversial, however, because some studies have reported that USMLE scores correlate with residents' performance and others have reported the opposite (4, 23, 24); one 2000 study found that NBME/USMLE scores were not predictive of success in the American Board of Radiology (ABR) examination (4), while another 2002 study found that they were (3).

Other criteria ranked as very important were the dean's letter, letters of recommendation, and honor society membership. Honor society membership has been reported as not predictive of residents' performance during postgraduate training (1). Moreover, behavioral and other noncognitive skills have been shown to be more important in predicting resident success than cognitive skills (1, 13). One research group found letters of recommendation to be frequently deficient in data regarding noncognitive variables and proposed standardized statements of recommendation as an effective substitute for traditional letters (25). In an editorial letter many years ago, Friedman described letters of recommendation, including deans' letters, as products of fantasy (5). More recent radiology-specific opinions appear not to disagree with that idea (2). Program directors gave high importance to class rank (7.50), but we did not find previous studies correlating this specific criterion with later performance. We believe this points to an opportunity for future research.

The interview process is highly variable among programs. We did not find a pattern of interview structure with regard to region, hospital or departmental size, association with a medical school, or number of residents to be accepted, although early work by Gong and Galazka suggested that the interview has a major influence on the interviewer's ranking of an applicant and the training program's Match list (6, 26). The validity of the personal interview as part of the selection process has been questioned (27), but it is seen as an opportunity to evaluate important noncognitive abilities. Once selected for interview, candidates must be considered equally in a fair and unbiased process (2), which can be ensured by promoting the use of score sheets and checklists (28, 29). The idea of standardized interviews is consistent with our finding that 60% of the programs used a checklist and that 55.4% compiled score sheets during the interviews. A higher proportion of programs associated with a medical school used score sheets than programs that were not, and the interviewing bodies of medical school-associated programs had more members. The use of score sheets thus might be a response to the need for comparable data among candidates, the result of an increased number of candidates, progress toward standardization and accountability, or the reflection of a larger bureaucracy than that of their counterparts. Although more common in the Northeast, panel interviews are barely used (6.5% of all programs). This may seem at odds with the movement toward standardization and the interest in developing a fair and unbiased process, given that panel interviews allow for various perspectives on the required competencies and each candidate's qualifications, thus providing a more objective measurement of candidates while helping to assure fairness (30, 31).
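As a concrete illustration of what a structured score sheet might capture, the sketch below defines a simple weighted rating record and aggregates interviewer scores into a candidate ranking. The criteria and weights are invented for demonstration; the survey did not report any standard instrument.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class ScoreSheet:
    """One interviewer's structured ratings (0-10) for one candidate."""
    candidate: str
    interviewer: str
    communication: int
    motivation: int
    professionalism: int

    def weighted_total(self) -> float:
        # Invented weights, for illustration only.
        return 0.4 * self.communication + 0.3 * self.motivation + 0.3 * self.professionalism

def rank_candidates(sheets: list[ScoreSheet]) -> list[tuple[str, float]]:
    """Average each candidate's weighted totals across interviewers; best first."""
    by_candidate: dict[str, list[float]] = {}
    for s in sheets:
        by_candidate.setdefault(s.candidate, []).append(s.weighted_total())
    return sorted(((c, mean(v)) for c, v in by_candidate.items()),
                  key=lambda item: item[1], reverse=True)

sheets = [
    ScoreSheet("Lee", "Dr. X", 9, 8, 9),
    ScoreSheet("Lee", "Dr. Y", 8, 9, 8),
    ScoreSheet("Kim", "Dr. X", 7, 9, 8),
]
print(rank_candidates(sheets))  # approx. [('Lee', 8.5), ('Kim', 7.9)]
```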
Once the applicants complete the application and the interview process, a decision on admission is made. The decision is the responsibility of the interviewing body in 62.9% of programs and of the program director in 33.8%. A significant number of respondents stated that a "gut feeling" or the right fit of candidates in the program was the single most important factor determining admission. These statements might indicate either that subjective methods are still decisive in the selection of candidates or that all the elements considered are so intertwined that the decision cannot be derived from a simple summation of the criteria considered.

A final ranking that determines acceptance of candidates is produced and submitted to the NRMP. All the surveyed programs participate in this process, and programs are committed to filling their residency positions through the NRMP once they participate. The NRMP is a private not-for-profit organization that aims to provide "an impartial venue for matching applicants' and programs' preferences for each other consistently" (32). A small minority of programs (6.5%) are not able to fill all their posts through the Match and are allowed to choose other candidates outside the process. It is important to highlight that an applicant's rank is not correlated with subsequent performance in rotations or on the ABR written examination (33).

This study has several limitations. First, inherent to every survey, is volunteer bias: results may not be representative of reality, given that the participants may be fundamentally different from those who chose not to participate (34). Second, the information provided by the program directors may not be accurate enough to generalize and produce conclusions. Third, the distribution of the sample may underestimate some practices, negatively affecting generalizations about programs across the country.

In conclusion, the preinterview selection criteria and interview structure for radiology residents in the United States are for the most part independent of a program's size, region, or affiliation with a medical school. Radiology residency programs must encourage research to develop optimal preselection and interview processes. Program directors share the mandate of establishing and implementing formal written criteria and processes for the selection of residents (35), and results in that arena would allow them to implement the most appropriate ones. Moreover, given the increasing amount of knowledge to be taught, the requirements to comply with, and the higher quality standards to be assured by residency programs, organizations such as the Accreditation Council for Graduate Medical Education, the Association of University Radiologists, the Association of Program Directors in Radiology, and even the Radiological Society of North America should become more involved in the resident selection process and perhaps reconsider a proposal for a centralized processing facility under the auspices of an appropriate authority (14). The idea was proposed 15 years ago under the title of a "radical proposal," and it may still be both valid and radical.
REFERENCES

1. Wood PS, Smith WL, Altmaier EM, Tarico VS, Franken EA Jr. A prospective study of cognitive and noncognitive selection criteria as predictors of residents' performance. Invest Radiol 1990; 25(7):855-859.
2. Longmaid HE. Resident recruitment. Acad Radiol 2003; 10(suppl 1):S4-S9.
3. Boyse TD, Patterson SK, Cohan RH, et al. Does medical school performance predict radiology residents' performance? Acad Radiol 2002; 9:437-445.
4. Gunderman RB, Jackson VP. Are NBME examination scores useful in selecting radiology resident candidates? Acad Radiol 2000; 7:603-606.
5. Friedman RB. N Engl J Med 1983; 308:651-653.
6. Gong H Jr, Parker NH, Apgar FA, Shank C. Influence of the interview on ranking in the residency selection process. Med Educ 1984; 18:366-369.
7. Altmaier EM, Smith WL, O'Halloran CM, Franken EA Jr. The predictive utility of behavior-based interviewing compared with traditional interviewing in the selection of radiology residents. Invest Radiol 1992; 27:385-389.
8. Gardner B. A multivariate computer analysis of students' performance as a predictor of performance as a surgical intern. J Surg Res 1972; 12:216-219.
9. McCollister RJ. The use of Part 1 National Board scores in the selection of residents in ophthalmology and otorhinolaryngology. JAMA 1988; 259:240-242.
10. Gay SB, Hillman BJ, McNulty BC, Altmaier EM, Smith WL. Joseph E. Whitley Award. The effect of preradiology clinical training on the performance of radiology residents. Invest Radiol 1993; 28:1090-1094.
11. Tarico VS, Altmaier E, Smith WL, et al. Development and validation of an accomplishment interview for radiology residents. J Med Educ 1986; 61:845-847.
12. Tarico VS, Smith WL, Altmaier E, et al. Critical incident interviewing in evaluation of residents' performance. Radiology 1984; 152:327-329.
13. Vydareny KH. Editor's note: The dilemma of residency selection. Invest Radiol 1992; 27:400.
14. Simon M. Radiology resident selection: A radical proposal. Invest Radiol 1992; 27:400-402.
15. Grantham JR. Radiology resident selection: Results of a survey. Invest Radiol 1993; 28:99-101.
16. Available at: http://www.practicesight.com. Accessed January 2006.
17. Available at: http://www.surveymonkey.com. Accessed April 2006.
18. Bell JG, Kanellitsas I, Shaffer L. Selection of obstetrics and gynecology residents on the basis of medical school performance. Am J Obstet Gynecol 2002; 186:1091-1094.
19. Garden FH, Smith BS. Criteria for selection of physical medicine and rehabilitation residents: A survey of current practices and suggested changes. Am J Phys Med Rehabil 1989; 68:123-127.
20. McCaffrey JC. Medical student selection of otolaryngology-head and neck surgery as a specialty: Influences and attitudes. Otolaryngol Head Neck Surg 2005; 133:825-830.
21. Taylor ML, Blue AV, Mainous AG, et al. The relationship between the National Board of Medical Examiners' prototype of the Step 2 Clinical Skills Exam and interns' performance. Acad Med 2005; 85:496-501.
22. Bernstein AD, Jazrawi LM, Elbeshbeshy B, et al. An analysis of orthopedic residency selection criteria. Bull Hosp Joint Dis 2002-2003; 61(1-2):49-57.
23. Black KP, Abzug JM, Chinchilli VM. Orthopedic in-training examination scores: A correlation with USMLE results. J Bone Joint Surg Am 2006; 88:671-676.
24. Carmichael KD, Westmoreland JB, Thomas JA, et al. Relation of residency selection factors to subsequent orthopedic in-training examination performance. South Med J 2005; 98:528-532.
25. O'Halloran CM, Altmaier EM, Smith WL, Franken EA Jr. Evaluation of resident applicants by letters of recommendation: A comparison of traditional and behavior-based formats. Invest Radiol 1993; 28:274-277.
26. Galazka SS, Kikano GE, Zyzanski S. Methods of recruiting and selecting residents for US family practice residencies. Acad Med 1994; 69:304-306.
27. Komives E, Weiss ST, Rosa RM. The applicant interview as a predictor of resident performance. J Med Educ 1984; 59:425.
28. Chew FS, Ochoa ER, Relyea-Chew A. Spreadsheet application for radiology resident match rank list. Acad Radiol 2005; 12:379-384.
29. Bandiera G, Regehr G. Reliability of a structured interview scoring instrument for a Canadian postgraduate emergency medicine training program. Acad Emerg Med 2004; 11(1):27-32.
30. Office of Human Resources, University of California, Berkeley. Guide to Managing Human Resources: A Resource for Managers and Supervisors. Available at: http://hrweb.berkeley.edu. Accessed April 2006.
31. University of Hertfordshire. How to Get the Best From Your Interviews: A Quick Guide to Safe and Effective Interviewing. Available at: www.herts.ac.uk. Accessed April 2006.
32. Available at: www.nrmp.org. Accessed April 2006.
33. Adusumilli S, Cohan RH, Marshall FW. How well does applicant rank order predict subsequent performance during radiology residency? Acad Radiol 2000; 7:635-640.
34. Medical University of South Carolina. Bias. Available at: http://www.musc.edu/dc/icrebm/bias.html. Accessed April 2006.
35. ACGME Institutional Requirements. Available at: www.acgme.org/irc. Accessed April 2006.