Clinical Imaging 39 (2015) 334–338
The use of ACR Appropriateness Criteria: a survey of radiology residents and program directors☆,☆☆,★
Daniel K. Powell a,⁎, James E. Silberzweig b
a New York Presbyterian-Columbia Campus, Department of Radiology, 622 West 168th Street, PB1-301, New York, NY 10032
b Mount Sinai Beth Israel, Department of Radiology, 16th Street @ 1st Avenue, 2 Karpas, New York, NY 10003
Article info
Article history: Received 2 August 2014; Received in revised form 3 October 2014; Accepted 20 October 2014
Keywords: Appropriateness criteria; Decision support; Diagnostic Radiology Milestones Project; Resident education
Abstract
Purpose: To assess the utilization of American College of Radiology Appropriateness Criteria (ACR-AC) among radiology residency program directors (PDs) and residents.
Methods: Survey of radiology PDs and residents.
Results: Seventy-four percent (46/62) of PDs promote ACR-AC in education (P<.05), and 84% (317/376) of residents have read at least a few (P<.05). Seventy-four percent (74/100) of first-year residents, compared to 56.8% (157/276) of second- to fourth-year residents, report at least occasional faculty reference of ACR-AC (P<.05). ACR-AC are well regarded (P<.05), but 40% believe that they are perplexing.
Conclusion: There is widespread resident awareness of ACR-AC and integration into resident training. However, faculty are only beginning to teach with them, and radiologists are not citing them with clinicians.
© 2015 Elsevier Inc. All rights reserved.
☆ No funding was provided for this study.
☆☆ The authors have no conflicts of interest or significant financial disclosures.
★ This material has not been published or presented elsewhere.
⁎ Corresponding author. New York Presbyterian – Columbia Campus, Department of Radiology, 622 West 168th Street, PB1-301, New York, NY 10032. Tel.: +1 212 305 7094; fax: +1 212 305 8177. E-mail address: [email protected] (D.K. Powell).
http://dx.doi.org/10.1016/j.clinimag.2014.10.011

1. Introduction

The American College of Radiology Appropriateness Criteria (ACR-AC) “are evidence-based guidelines to assist referring physicians and other providers in making the most appropriate imaging or treatment decision for a specific clinical condition” [1]. They were developed in the 1990s in response to concerns about unnecessary utilization of health care resources and wide regional variations in practice [2]. Subsequently, the ACR-AC were promoted for clinical decision support [3], being encoded for incorporation into radiology information systems, searchable databases [4], and electronic order entry systems [2]. Multiple studies have demonstrated their ability to increase the percentage of appropriate imaging examinations [2]. Moreover, the Protecting Access to Medicare Act of 2014 will require consultation of appropriateness criteria decision support tools (not specifically the ACR-AC) before ordering advanced medical imaging, a requirement that will go into effect in 2017 [5].

The ACR-AC are also considered valuable learning tools for trainees [6]. The Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Radiology (ABR) consider the routine use of ACR-AC the most basic achievement of consultant competency by residents [7]. This requirement was recently introduced in the Diagnostic Radiology Milestones Project for use in resident self-assessment [7], part of an outcomes-based evaluation of resident training and competency [8].

Early studies show poor use among radiologists; for instance, only 30% of surveyed musculoskeletal radiologists reported using them [9]. Moreover, less than 2% of surveyed clinicians reported using them as a first source for selecting the best imaging technique, and less than 1% reported using them as a second source [10]. Perceptions among radiology program directors (PDs) and residents about the ACR-AC may provide valuable information about the obstacles to, and areas for improvement in, the implementation and expansion of ACR-AC education. Therefore, we sought to assess radiology PDs’ and residents’ attitudes toward the ACR-AC and how the criteria are being formally and informally incorporated into resident education. We hypothesize that they are largely well respected but underutilized.
2. Subjects and methods

The present study was approved by the local institutional review board. We sent an email requesting survey participation to 340 PDs, assistant PDs, and associate PDs from a database managed by the Association of Program Directors in Radiology (APDR), excluding program coordinators. The APDR could not specify the exact number of programs included in this list; based on prior surveys, we estimated that approximately 186 programs [11] were contacted, comprising approximately 4652 residents [11]. A question was included asking respondents for the name of their institution, to avoid duplication of results from PDs (with associate or assistant PDs at the same institution) and to compare
residents at the same institutions. The survey did not differentiate variations of PD responsibility. The email contained a link to a SurveyMonkey.com survey for the PDs to complete. In addition, it contained a link to a separate survey for distribution to residents by the PDs (the survey was not sent directly to residents by the authors). The email was sent at the beginning of the final third of the academic year to ensure that first-year residents (R1s) had accumulated some education experience. We sent an email reminder after 1 week and kept the survey open for 1 month, by which time responses had tapered to none over the last week.

Fisher’s exact tests were used to analyze categorical data by grouping responses into binomial categories. The one-sample t test was also used to evaluate response rates in subgroups. Statistical analysis was limited for a few multiple-response questions because these variables were not independent.

The PD survey consisted of 10 questions, and the resident survey consisted of 9 questions. Both surveys addressed level of familiarity with the ACR-AC, opinions about the ACR-AC, use of the ACR-AC when justifying study recommendations to referring clinicians, and referring clinician opinions about the ACR-AC. The resident survey also asked about the frequency of ACR-AC use by faculty in lectures and on clinical rotations and about requirements for their use in education at the resident’s program. The PD survey asked about faculty teaching requirements for ACR-AC use, practice quality improvement (PQI) projects relating to the ACR-AC, and whether PDs disagree with specific ACR-AC. The last question of each survey asked for a specific program affiliation.
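To make this grouping-and-testing step concrete, a minimal sketch is shown below. It is illustrative only: the authors did not publish analysis code, the scipy-based implementation and the exact 2×2 binning are assumptions, and the counts are taken from the R1 versus R2–4 familiarity comparison reported later (Table 3).

```python
# Illustrative sketch only -- not the authors' published analysis code.
# Assumes responses were collapsed into a 2x2 table ("read at least one
# ACR-AC" vs. "read none") before applying Fisher's exact test, as
# described in Section 2. Counts are taken from Table 3.
from scipy.stats import fisher_exact

r1_counts = [27 + 47 + 5, 20 + 1]      # R1s: 79 read at least one, 21 have not
r2_4_counts = [96 + 147 + 12, 21 + 0]  # R2-4s: 255 read at least one, 21 have not

odds_ratio, p_value = fisher_exact([r1_counts, r2_4_counts])
print(f"OR = {odds_ratio:.2f}, two-sided P = {p_value:.4f}")
# Expected to land near the P = .0006 reported for this comparison.
```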
3. Results

3.1. Program directors

Of the 186 programs contacted, 62 PDs responded (33.3%). A significant 71% majority (44/62) were from university programs, with 16.1% (10/62) from university-affiliated and 12.9% (8/62) from community programs (P<.0001). When the eight PDs from community programs were separated out (and subanalyses were performed based on years of practice or subspecialty), the group numbers were small, and it was difficult to assess whether the differences were meaningful. Three respondents declined to list their institution, and redundant responses from officials with shared PD responsibility cannot be excluded. Similarly, two survey responses came from institutions that share elements of their names and were assumed to come from different affiliates. When these five responses were removed, results were not significantly different.

A significant 74.2% majority (46/62) of PDs reported that they suggest or require use of the ACR-AC in radiology resident education (P<.0001) (Table 1). One reported a first-year program that requires navigation of and familiarization with the ACR-AC in order to complete regular quizzes. Another PD assigns them to residents as reading.
Moreover, a third (5/16) of the 25.8% (16/62) who are not requiring or suggesting their use in education plan to do so. PQI projects based on ACR-AC resident education have already been performed by 30.6% (19/62) of the PDs, while 19.4% (12/62) are planning projects and 50% (31/62) are not. For instance, one project is tracking ordering patterns after presentation of the ACR-AC to clinical house staff and placement of the ACR-AC logo on all hospital computers.

PD opinions about the ACR-AC were largely positive: 45.2% (28/62) of PDs consider the ACR-AC an “excellent resource” (Table 2). However, 38.7% (24/62) report that they would be improved with a decision-tree diagram, 30.6% (19/62) find them helpful but hard to digest, and 21% (13/62) believe that they would be improved by grading the quality of supportive evidence. Interestingly, if a PD reported the ACR-AC as an excellent resource, the most likely additional comment was that they could be improved by a decision tree (although more than half of these responses were not accompanied by another qualifying opinion). The 77.1% (84/109) positive total responses (summing the first four choices, including areas for improvement) were significantly greater than the 22.9% (25/109) negative ones (last three choices) in this multiresponse question (P<.0001).

Despite this high regard, 48.4% (30/62) of respondents rarely or never refer to the ACR-AC when recommending studies to referring clinicians (20/62 and 10/62 respondents, respectively), and 40.3% (25/62) occasionally do so. On the other hand, only 11.3% (7/62) routinely do so (P<.0001). Interestingly, a significant 79.2% (38/48) of those referencing the ACR-AC in discussions with referrers reported positive or neutral attitudes from clinicians (11/48 and 27/48 respondents, respectively) rather than a dismissive attitude (P<.0001).

Unfortunately, 33.9% (21/62) of PDs reported not being familiar enough with the ACR-AC to say whether they disagree with them. None of the other respondents reported strong disagreement with any ACR-AC, and a significant 65.9% (27/41) do not disagree at all with any ACR-AC (P=.008). However, 4.9% (2/41) disagree moderately with specific ACR-AC, and 29.3% (12/41) do so mildly.

3.2. Residents

Of the approximately 4652 radiology residents in the United States, 376 responded (8.1%). However, as we relied upon PDs to distribute the surveys, we might assume that only the 33.3% of PDs who responded distributed the survey to their residents; crudely taking 33.3% of the total estimated number of radiology residents as our denominator, we estimate that our resident response rate was closer to 24% (376/1549). There was a roughly equal distribution across training years: 22.6% (100/376),
Table 1
Program director survey question 1: Have you suggested or required use of ACR-AC for resident education (check all that apply)?

Answer options                                                Response percent   Response count
Yes, in faculty lectures                                      56.5%              35
Yes, to be emphasized during clinician consultation           41.9%              26
Yes, to be emphasized during interdisciplinary conferences    33.9%              21
Yes, in presentations given by residents                      22.6%              14
Yes, in radiology reports                                     9.7%               6
Yes, other (please describe)                                  6.5%               4
No                                                            17.7%              11
No, but we are planning to do so                              8.1%               5
Answered question                                                                62

Table 2
Program director survey question 3: What is your opinion of the ACR-AC (check all that apply)?

Answer options                                                                  Response percent   Response count
Excellent resource for appropriate imaging decision support                    45.2%              28
Would be improved by providing a decision-tree diagram                         38.7%              24
Helpful but hard to digest                                                      30.6%              19
Would be improved by grading the quality of supportive evidence                21.0%              13
Consensus document, rather than evidence-based medicine                        14.5%              9
Confusing and inaccessible                                                      12.9%              8
Tool for administrative analysis of ordering compliance, not a clinical tool   9.7%               6
I disagree with some of the ACR-AC documents (comment below)                   3.2%               2
Answered question                                                                                 62
first year; 29.3% (110/376), second year; 22.1% (83/376), third year; and 22.1% (83/376), fourth year (SD 13.3, P=.0008).

A significant majority of residents report familiarity with the ACR-AC: 84.3% (317/376) have read at least a few ACR-AC [32.7% (123/376) multiple and 51.6% (194/376) a few], versus only 11.2% (42/376) who have read none or are not aware of their existence (P<.0001). However, only 14.1% (53/376) reported that the ACR-AC are routinely referenced by faculty during lectures, with 47.3% (178/376) reporting occasional reference versus 30.3% (114/376) rare reference and 9.3% (35/376) no reference (P<.0001). One resident commented that junior faculty members are much more likely to include the ACR-AC in lectures.

When the 100 first-year residents (R1s) were separated from the rest of the residents [second- to fourth-year residents (R2–4s)], there were significant differences in familiarity with the ACR-AC (Table 3) and in the frequency with which faculty referenced the ACR-AC in lectures (Table 4). For instance, R1s are less likely to have read ACR-AC: 79% (79/100) of R1s have read at least one ACR-AC compared to 92.4% (255/276) of R2–4s (P=.0006) (Table 3). However, 74% (74/100) of R1s compared to 56.8% (157/276) of R2–4s report at least occasional reference to the ACR-AC in faculty lectures, and 17% (17/100) of R1s versus 35.1% (97/276) of R2–4s report rare reference to them (P=.0029), suggesting that formal teaching of the ACR-AC has increased just this year (Table 4).

Faculty reference of the ACR-AC on clinical rotations was no different from that in lectures [only 10.4% (39/376) routinely and 43.9% (165/376) occasionally, versus 33.8% (127/376) rarely and 12.8% (48/376) never referencing them] (P=.02). In fact, a significant 88.6% (333/376) of residents reported not being aware of any requirements specifically from their program for use of the ACR-AC in their education (P<.0001). Among the 11.4% (43/376) who were, specific requirements reported included wording of the goals and objectives, a dedicated lecture series, and quizzes.

Resident opinions about the ACR-AC (Table 5) were very similar to those of the PDs: 48.4% (182/376) consider them an excellent resource, 40.2% (151/376) believe that they would be improved by a decision-tree diagram, and 29.5% (111/376) find them helpful but hard to digest. Interestingly, only 11.7% (44/376) of residents, compared to 21% (13/62) of PDs, believe that the ACR-AC would be improved by grading the quality of supportive evidence; however, this difference was not significant. Disregarding the 43 residents who had no exposure to the ACR-AC, the 80.5% (488/606) positive opinions (first four choices) were significantly greater than the 19.5% (118/606) negative ones (last three choices) in this multiresponse question (P<.0001).

Residents reference the ACR-AC when recommending studies to referrers at rates similar to those of PDs, with 49.2% (185/376) rarely or never and 41% (154/376) occasionally referencing them, versus only 9.8% (37/376) routinely doing so (P<.0001). As with the PDs, a significant 74.3% (214/288) of residents (who reference the ACR-AC) reported positive or neutral referring clinician attitudes about the ACR-AC (66/288 and 148/288 respondents, respectively) rather than a dismissive attitude (P<.0001).
Table 4
Resident survey question 2, with R1 responses separated from the rest: Are the ACR-AC referenced during lectures at your institution?

Answer options       R1 response percent   R1 response count   R2–4 response percent   R2–4 response count
Routinely            17.0%                 17                  13.0%                   36
Occasionally         57.0%                 57                  43.8%                   121
Rarely               17.0%                 17                  35.1%                   97
Never                10.0%                 10                  9.1%                    25
Comments                                   1                                           4
Answered question                          100                                         276
No other significant differences in responses were elicited when separating residents by year, nor were there enough resident respondents from single programs to find differences in response patterns based on training institution. Moreover, there were no significant differences in resident responses compared to those of PDs.

4. Discussion

Coinciding with a mandate for the use of decision support when ordering medical imaging [5], familiarity with the ACR-AC was made a requirement of radiology resident education [8]. Our results suggest widespread support for this didactic role among radiology residents and PDs, with 74% of PDs requiring or suggesting the use of ACR-AC in resident education and 84% of residents having read multiple or a few criteria. However, there are no guidelines to date about how the ACR-AC should be taught and reinforced. The general hope that enforcing use of the ACR-AC will increase their value is logical and consistent with a study of medical students showing that exposure to the ACR-AC readily increased their interest and encouraged future use [12].

Our self-reported data suggest that radiology resident familiarity with the ACR-AC has improved when compared to a 2010 report in which only 60% of tested radiology residents knew how to locate ACR-AC for imaging support [13] and a 2012 report showing a 65% correct response rate among residents questioned about thoracic imaging ACR-AC [14]. However, our suspicion that the ACR-AC are still underutilized was also borne out, and greater emphasis on their use and better guidance on how to teach with them appear to be needed. Somewhat distressingly, 34% of PDs reported not being familiar enough with the ACR-AC to comment on them. Moreover, only 10%–14% of residents reported routine faculty reference to the ACR-AC in lectures or on clinical rotations, and 9%–13% reported never hearing them referenced.

However, our data also suggest that mere regulatory emphasis on the ACR-AC may be changing this trend. First-year radiology resident responses suggest a significant recent increase in the degree of faculty lecturing on the ACR-AC, three quarters of PDs are promoting ACR-AC use in education, and half of PDs are conducting or planning PQI studies on the ACR-AC. Thus, a more robust centralized effort to disseminate the ACR-AC and make them accessible will likely be successful. Two useful local strategies to increase exposure were reported: integration of the ACR-AC into electronic image protocoling and order entry systems (increasing exposure on both the radiology and referrer sides) and a directed navigation program for first-year radiology residents that assigns ACR-AC readings in order to complete quizzes.
Table 3
Resident survey question 1, with R1 responses separated from the rest: What is your level of familiarity with the ACR-AC?

Answer options                                               R1 response percent   R1 response count   R2–4 response percent   R2–4 response count
I have read and used multiple criteria                       27.0%                 27                  34.8%                   96
I have read a few                                            47.0%                 47                  53.3%                   147
I have read one                                              5.0%                  5                   4.3%                    12
I am aware of their existence, but I have never read them    20.0%                 20                  7.6%                    21
I am not aware of the ACR-AC                                 1.0%                  1                   0.0%                    0
Answered question                                                                  100                                         276
Table 5
Resident survey question 5: What is your opinion of the ACR-AC (check all that apply)?

Answer options                                                                  Response percent   Response count
Excellent resource for appropriate imaging decision support                    48.4%              182
Would be improved by providing a decision-tree diagram                         40.2%              151
Helpful but hard to digest                                                      29.5%              111
Consensus document, rather than evidence-based medicine                        13.3%              50
Would be improved by grading the quality of supportive evidence                11.7%              44
Tool for administrative analysis of ordering compliance, not a clinical tool   9.8%               37
Confusing and inaccessible                                                      8.2%               31
No/limited experience with them                                                 11.4%              43
Answered question                                                                                 376
We recommend that faculty take it upon themselves to routinely reference the ACR-AC with residents and in referrer consultation. While PDs are encouraged to create dedicated education for first-year resident introduction to the ACR-AC, we would encourage the ACR to create a learning module, in addition to their “interactive view” [1], for residents.

Mandated use of appropriateness criteria for decision support by clinicians [5] will change ordering practices, but we have an opportunity to make that transition smooth and to encourage efficient imaging utilization with the ACR-AC. Prior studies indicate how much room there is for change. For instance, in one study, pediatric residents who were asked to select appropriate imaging performed poorly and indicated suboptimal reliance on radiology consultation [15]. Unfortunately, our results indicate that while some institutions are actively encouraging ACR-AC decision support through lectures, disseminated graphical reminders, and incorporation into order entry systems, only a minority of PDs and residents explicitly refer to the ACR-AC when recommending imaging to referrers. Moreover, in our study, more than 20% of referrers have dismissive attitudes and over 50% have neutral attitudes about the ACR-AC, despite representatives from 20 specialty societies serving on the Expert Panels that author the criteria [16]. This may represent a missed opportunity to influence clinicians to use and support these guidelines.

The ACR-AC have been criticized for not emphasizing evidence-based practice over expertise, for lacking strict criteria to grade the quality of supporting literature, and for overemphasizing a consensus approach [17]. They have also been criticized for lacking “user friendly output,” a clear hierarchical temporal ordering of imaging choices, or a single recommended examination and imaging alternative [17]. These criticisms of the ACR-AC were of variable importance to our survey respondents. Overall, the ACR-AC are well regarded: just under half of residents and PDs consider them an excellent resource. However, about a third of each group find them hard to digest, two out of five believe that they would be improved with a decision-tree diagram, and approximately 10%–20% were interested in grading of the quality of supportive evidence. Going forward, PDs who see areas for improvement in the ACR-AC might include residents in educational research projects, for instance, assigning groups to design a decision tree for a given AC or to address instances where there is unsatisfactory evidence or disparity with specialty society guidelines.

Our study is limited by relatively low response rates (approximately 8% of residents and 33.3% of PDs). However, we cannot be certain that the survey reached all radiology residents because we relied on PDs to distribute surveys to the residents (thus, our resident response rate is likely closer to 24%, assuming distribution to residents by 33.3% of PDs), we do not know the exact number of programs contacted, and we relied on outside data to estimate the number of programs and residents available for survey. Some of our data were not appropriate for statistical analysis due to multiple answer choices, which allowed us to collect more data but limited interpretation. Response data were often considered in retrospectively assigned binomial categorical groupings when tested statistically. However, for certain response sets, the question of chance or statistical error was not relevant, and the qualitative nature of the data is supported by the numerical content; for example, that half of PDs are performing or planning PQI studies with the ACR-AC carries no question of chance or type I error but seems more meaningful than 10% or 0% doing so. Additional limitations include possible responder selection bias, limited distribution to community training programs, and a purposefully short survey design (to elicit the greatest response rate), which omitted questions on methods of teaching, justifications for limited formal education, and means for improvement. We relied on indirect reporting and perceptions for certain data, such as the frequency of faculty teaching of the ACR-AC and referring clinicians’ attitudes toward the ACR-AC. We did not question residents and PDs about use of other decision support guidelines or about ACR-AC incorporation into their ordering systems.

5. Conclusion
The vast majority of residents are aware of the ACR-AC, and PDs are integrating them into resident education. Unfortunately, few formal systems are in place, and most faculty members do not appear to be emphasizing them in lectures or clinical teaching. However, R1s noted an encouraging, small, but significant increase in faculty teaching of the ACR-AC over the past year, which may in part stem from a recent emphasis on their importance by the ABR/ACGME. Program directors, faculty, and the ACR are all encouraged to continue efforts to increase the exposure and accessibility of the ACR-AC. About a third of respondents believe that the ACR-AC could be made more digestible, in particular with a decision-tree diagram or improved searchability and indexing. Moreover, only half of PDs and residents ever explicitly reference the ACR-AC during clinician consultation, which is arguably the most important informal opportunity to support the upcoming required transition to the use of appropriateness criteria when ordering imaging (set to take effect in 2017). Recruiting clinician support for the ACR-AC could influence which sets of criteria the Secretary of Health and Human Services will sanction for this role in 2015 [5], as well as how seriously this requirement will be taken by referrers.
References

[1] ACR Appropriateness Criteria®. Available at: http://www.acr.org/Quality-Safety/Appropriateness-Criteria. Accessed June 27, 2014.
[2] Sistrom CL. In support of the ACR Appropriateness Criteria. J Am Coll Radiol 2008;5:630–5 [discussion 636–7].
[3] Allen B. Five reasons radiologists should embrace clinical decision support for diagnostic imaging. J Am Coll Radiol 2014;11:533–4.
[4] Sistrom CL, Honeyman JC. Relational data model for the American College of Radiology Appropriateness Criteria. J Digit Imaging 2002;15:216–25.
[5] H.R. 4302: Protecting Access to Medicare Act of 2014. Available at: https://www.govtrack.us/congress/bills/113/hr4302/text. Accessed May 13, 2014.
[6] Mainiero MB. Incorporating ACR practice guidelines, technical standards, and appropriateness criteria into resident education. J Am Coll Radiol 2004;1:277–9.
[7] Vydareny KH, Amis ES Jr, Becker GJ, Borgstede JP, Bulas DI, Collins J, Davis LP, Gould JE, Itri J, Laberge JM, Meyer L, Mezwa DG, Morin RL, Nestler SP, Zimmerman R. Diagnostic radiology milestones. J Grad Med Educ 2013;5(Suppl 1):74–8.
[8] Swing SR, Beeson MS, Carraccio C, Coburn M, Iobst W, Selden NR, Stern PJ, Vydareny K. Educational milestone development in the first 7 specialties to enter the next accreditation system. J Grad Med Educ 2013;5:98–106.
[9] Tigges S, Sutherland D, Manaster BJ. Do radiologists use the American College of Radiology Musculoskeletal Appropriateness Criteria? AJR Am J Roentgenol 2000;175:545–7.
[10] Bautista AB, Burgos A, Nickel BJ, Yoon JJ, Tilara AA, Amorosa JK. Do clinicians use the American College of Radiology Appropriateness Criteria in the management of their patients? AJR Am J Roentgenol 2009;192:1581–5.
[11] Accreditation Council for Graduate Medical Education data resource book, academic year 2012–2013. Available at: https://www.acgme.org/acgmeweb/Portals/0/PFAssets/PublicationsBooks/2012-2013_ACGME_DATABOOK_DOCUMENT_Final.pdf. Accessed June 27, 2014.
[12] Dillon JE, Slanetz PJ. Teaching evidence-based imaging in the radiology clerkship using the ACR Appropriateness Criteria. Acad Radiol 2010;17:912–6.
[13] Logie CI, Smith SE, Nagy P. Evaluation of resident familiarity and utilization of the ACR musculoskeletal study appropriateness criteria in the context of medical decision support. Acad Radiol 2010;17:251–4.
[14] Chiunda AB, Mohammed TL. Knowledge of ACR thoracic imaging Appropriateness Criteria® among trainees: one institution's experience. Acad Radiol 2012;19:635–9.
[15] Hirschl DA, Ruzal-Shapiro C, Taragin BH. Online survey of radiologic ordering practices by pediatric trainees. J Am Coll Radiol 2010;7:360–3.
[16] Medical specialty organizations with representatives on the ACR Appropriateness Criteria® Expert Panels. Available at: http://www.acr.org/~/media/ACR/Documents/AppCriteria/MedicalSpecialtyOrgRep.pdf. Accessed June 27, 2014.
[17] Blackmore CC, Medina LS. Evidence-based radiology and the ACR Appropriateness Criteria. J Am Coll Radiol 2006;3:505–9.