ORIGINAL ARTICLE
Knowledge of the Costs of Diagnostic Imaging: A Survey of Physician Trainees at a Large Academic Medical Center

Arvind Vijayasarathi, MD, MPH, MBA; Richard Duszak Jr, MD; Rondi B. Gelbard, MD; Mark E. Mullins, MD, PhD

Abstract

Purpose: To study the awareness of postgraduate physician trainees across a variety of specialties regarding the costs of common imaging examinations.

Methods: During early 2016, we conducted an online survey of all 1,238 physicians enrolled in internships, residencies, and fellowships at a large academic medical center. Respondents were asked to estimate Medicare national average total allowable fees for five commonly performed examinations: two-view chest radiograph, contrast-enhanced CT abdomen and pelvis, unenhanced MRI lumbar spine, complete abdominal ultrasound, and unenhanced CT brain. Responses within 25% of published amounts were deemed correct. Respondents were also asked about specialty, postgraduate year of training, previous radiology education, and estimated number of imaging examinations ordered per week.

Results: A total of 381 of 1,238 trainees returned complete surveys (30.8%). Across all five examinations, only 5.7% (109/1,905) of responses were within the correct 25% range. A total of 76.4% (291/381) of all respondents incorrectly estimated every examination's cost. Estimation accuracy was not associated with number of imaging examinations ordered per week or year of training. There was no significant difference in cost estimation accuracy between those who participated in medical school radiology electives and those who did not (P = .14). Only 17.5% of trainees considered their imaging cost knowledge adequate. Overall, 75.3% desire integration of cost data into clinical decision support and/or computerized physician order entry systems.

Conclusions: Postgraduate physician trainees across all disciplines demonstrate limited awareness of the costs of commonly ordered imaging examinations. Targeted medical school education and integration of imaging cost information into clinical decision support / computerized physician order entry systems seems indicated.

Key Words: Costs of imaging, resident and fellow education, medical student education, health care economics

J Am Coll Radiol 2016;-:---. Copyright 2016 American College of Radiology
INTRODUCTION
In an era of rising health care expenditures, physicians are increasingly expected to provide leadership in efforts to deliver high-quality and low-cost care [1,2]. Numerous prior studies focusing on laboratory tests, medications, diagnostic imaging examinations, and subspecialty consultations, however, have demonstrated
that physicians possess only a limited understanding of the costs of services they both order and provide [3-8]. Some of these services are relatively inexpensive, but others such as diagnostic imaging can carry substantial costs [9-11]. As physicians across a wide spectrum of specialties request imaging services for their patients— and in the academic setting many of these orders are
a Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, Georgia. b Department of Surgery, Emory University School of Medicine, Atlanta, Georgia. Corresponding author and reprints: Arvind Vijayasarathi, MD, MPH, MBA, Emory University School of Medicine, Department of Radiology and Imaging Sciences, Emory University Hospital, 1364 Clifton Road, NE, Suite D125A, Atlanta, GA 30322; e-mail: Arvind.Vijayasarathi@Gmail.com. The authors have no conflicts of interest related to the material discussed in this article. Mark Mullins is a former president of the Alliance of Medical Student Educators in Radiology (AMSER), an organization whose work has been cited herein. An abstract based on this work was submitted to the Radiological Society of North America 2016 National Meeting.

© 2016 American College of Radiology 1546-1440/16/$36.00 http://dx.doi.org/10.1016/j.jacr.2016.05.009
placed by trainees [12]—it is reasonable to expect interns, residents, and fellows to possess a working understanding of the costs of diagnostic imaging examinations they commonly request. To date, the only large-sample-size study focusing on physician awareness of diagnostic imaging costs was restricted to a radiology trainee population [13]. No large-scale studies on the costs of imaging examinations have focused on a broader graduate medical educational trainee population. Such information could be helpful in developing, implementing, and improving appropriate medical school, residency, and fellowship educational programs for trainees, and possibly for practicing physicians as well. Accordingly, the aim of our study was to investigate the awareness of postgraduate physician trainees across a variety of specialties regarding the costs of commonly performed diagnostic imaging examinations. We hypothesized that trainee awareness of the costs of common imaging examinations would be poor, given prior literature demonstrating limited physician knowledge of health care costs in general [3-8].
METHODS
This investigation was evaluated by Emory University's Institutional Review Board and was granted exempt status before survey deployment.

Study Population
The Graduate Medical Education (GME) Office at Emory University School of Medicine maintains a database containing e-mail addresses of all physicians enrolled in all ACGME-accredited residency and fellowship programs. Our study population included all trainees in that database.

Survey Instrument and Data Collection
Our survey instrument was largely based upon a recently published survey distributed to a radiology trainee-specific audience [13] that drew upon elements employed by Rock et al [6] and Graham et al [8] to query pediatricians and hospitalists, respectively, about the costs of health care services. That survey asked participants to estimate CMS Medicare Part B Physician Fee Schedule national average allowable fees for five commonly performed diagnostic imaging examinations representing several modalities. We adapted that prior instrument, which was intended for a radiology-specific audience, to apply to a broader
audience by asking about global allowable fees rather than more nuanced professional (ie, physician interpretive) and technical (ie, facility) fees. Based on prior studies, we used the Medicare national average allowable fee as a surrogate of the financial implications of these imaging examinations to society. This value reflects the actual dollar amount that changes hands for a given diagnostic imaging examination (total Medicare payments and patient co-insurance payments) in a large segment of the US population, and has been used as a surrogate of cost in prior published studies [13-15]. Within the medical literature, there is considerable variation in how "cost" is defined. True health care costs are often very challenging to ascertain, as they differ widely across institutions based on negotiated agreements between facilities and payers. As such, respondents were asked to specifically consider the national average Medicare allowable fee value when making their estimates, as this approximates the societal costs of these services. We asked respondents about the following demographic information: level and year of postgraduate training, training program specialty, estimated number of diagnostic imaging examinations requested per week, whether or not they had received formal education related to the cost of diagnostic imaging during medical school or postgraduate training, and whether they participated in a medical student radiology elective. Using a Likert scale of 1 to 5, we asked recipients to rate their perception of the adequacy of their awareness of the costs of diagnostic imaging and their interest in having cost-related information integrated into computerized physician order entry (CPOE) or clinical decision support (CDS) tools.
We then asked them to provide their best estimate in US dollars of national average Medicare global allowable payments (taking into account both radiologist and facility components of fees) for five commonly performed examinations (two-view chest radiography, contrast-enhanced CT abdomen and pelvis, unenhanced CT brain, unenhanced MRI lumbar spine, and complete abdominal ultrasound). Our online survey was created and distributed using the web-based SurveyMonkey professional platform (SurveyMonkey LLC, Palo Alto, CA). An initial recruitment e-mail with a hyperlink to our survey was distributed via the GME e-mail distribution list to all subscribed trainees in mid-January 2016. Reminder e-mails were sent through the GME office once per week for a total of four weeks, until the survey was closed in mid-February 2016. Additionally, we sent unique
recruitment e-mails to individual program directors to inform them of our study and ask for their support in survey distribution. To incentivize participation, we offered respondents the opportunity to enroll in a raffle to win one of ten $50 Amazon gift cards (Amazon Inc, Seattle, WA), which were purchased with departmental research funds. Survey responses were anonymous. Incomplete surveys were excluded from analysis.
Examination Selection and Cost Data
Selection of imaging examinations to survey and abstraction of cost data for those studies were identical to the methodology recently employed for a radiology-specific audience [13]. Specifically, to optimize the ability to compare with prior work, we chose the same five imaging examinations for comparison (Table 1). Current Procedural Terminology codes for the five selected services were mapped to 2016 posted national
Table 1. Response rate and background characteristics

Survey Response Categories                 Total, n (%)
Total survey recipients                    1,238
Returned surveys                           402 (32.5%)
Incomplete surveys                         21 (1.7%)
Completed surveys                          381 (30.8%)

Level of Training                          Number of Respondents (%)
PGY 1                                      78 (20.5%)
PGY 2                                      62 (16.3%)
PGY 3                                      80 (21.0%)
PGY 4                                      61 (16.0%)
PGY 5                                      59 (15.5%)
PGY 6                                      31 (8.1%)
PGY 7                                      8 (2.1%)
PGY 8                                      2 (0.5%)

Training Program                           Number of Respondents        Number of Trainees in Each Program
                                           (% of Total Respondents)     (% Specialty Response Rate)
Internal medicine & subspecialties         108 (28.3%)                  335 (32.2%)
Surgical specialties                       57 (15.0%)                   260 (21.9%)
Radiology                                  50 (13.1%)                   84 (59.5%)
Pediatrics & subspecialties                39 (10.2%)                   154 (25.3%)
Emergency medicine                         23 (6.0%)                    74 (31.1%)
Pathology                                  19 (5.0%)                    49 (38.8%)
Anesthesiology                             17 (4.5%)                    66 (25.8%)
Transitional year                          15 (3.9%)                    29 (51.7%)
Obstetrics & gynecology                    17 (4.5%)                    44 (38.6%)
Psychiatry                                 11 (2.9%)                    46 (23.9%)
Physical medicine & rehabilitation         9 (2.4%)                     21 (42.9%)
Neurology                                  7 (1.8%)                     44 (15.9%)
Radiation oncology                         5 (1.3%)                     14 (35.7%)
Dermatology                                4 (1.0%)                     18 (27.8%)

Medical School & Residency Education       Number of Respondents (%)
Participated in radiology elective         227 (59.6%)
Imaging cost education: preclinical years  41 (10.8%)
Imaging cost education: clinical years     71 (18.6%)
Imaging cost education: postgraduate       66 (17.3%)

Number of Examinations Ordered per Week    Number of Respondents (%)
0                                          87 (22.8%)
1-5                                        121 (31.8%)
6-10                                       81 (21.3%)
>11                                        92 (24.1%)
Note: PGY = postgraduate year.
average allowable Medicare professional and technical facility fees [16]. These were used as the benchmark for correct costs of the surveyed imaging examinations.
Statistical Analysis
Survey response data were summarized using standard descriptive statistics (mean, standard deviation, range, median) for continuous variables (cost estimates). Categorical variables (eg, postgraduate year [PGY], training program, and self-assessed awareness of costs) were summarized by frequencies. The statistical analysis we employed was largely derived from the methodology of a similar survey distributed to a radiology-specific audience [13]. The same predefined accuracy range (within 25% of each total posted Medicare value) was employed, and the absolute percentage estimation errors for each examination were calculated in a similar fashion. Individual and mean cost knowledge scores were calculated across all respondents, in keeping with the prior analysis. Multiple subgroups were analyzed individually. First, the overall cost knowledge scores of imaging trainees (radiology residents and fellows, nuclear medicine residents and fellows) were compared against those of nonradiologists (all other specialties). Additionally, cost knowledge scores of those reporting participation in a medical school radiology elective were compared with those of respondents who did not report elective participation. Similarly, respondents indicating any prior dedicated cost-of-imaging education (preclinical or clinical years of medical school, residency, fellowship) were compared against those who reported no cost-related education. Statistical significance of potential differences in subgroup average
cost knowledge scores was assessed using Student's two-sample t tests. The Pearson correlation coefficient (r) was calculated to assess the potential association between the number of diagnostic imaging examinations respondents reported ordering per week and their mean absolute percentage error across all five examinations. Analysis of variance models were constructed to assess any potential association between PGY status and the degree of estimation error across the five diagnostic imaging examinations. Data analysis was performed using Excel for Mac 2011 (Microsoft, Redmond, WA) and statistical analysis was performed using StatPlus:mac LE (AnalystSoft Inc, Alexandria, VA).
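The scoring logic described above is simple to reproduce. The following sketch is illustrative only (the authors' actual analysis used Excel and StatPlus; the function and variable names here are our own), applying the within-25% accuracy criterion, the absolute percentage error, and the 0-5 cost knowledge score to the benchmark Medicare fees reported in Table 2:

```python
# Illustrative reimplementation of the survey scoring (not the authors' code).
# Benchmark fees are the 2016 Medicare national average total allowables (Table 2).

MEDICARE_FEES = {  # USD, whole dollars
    "two_view_chest_radiograph": 31,
    "ct_abdomen_pelvis_with_contrast": 342,
    "ct_brain_unenhanced": 128,
    "mri_lumbar_spine_unenhanced": 245,
    "us_abdomen_complete": 136,
}

def is_correct(estimate: float, actual: float, tolerance: float = 0.25) -> bool:
    """An estimate is 'correct' if it falls within +/- tolerance of the actual fee."""
    return abs(estimate - actual) <= tolerance * actual

def abs_pct_error(estimate: float, actual: float) -> float:
    """Absolute percentage estimation error for one examination."""
    return abs(estimate - actual) / actual * 100

def cost_knowledge_score(estimates: dict) -> int:
    """Number of the five examinations estimated within 25% (score of 0-5)."""
    return sum(
        is_correct(estimates[exam], fee)
        for exam, fee in MEDICARE_FEES.items()
        if exam in estimates
    )
```

For example, a respondent estimating $30 for the chest radiograph (within 25% of $31) but $2,000 for the lumbar spine MRI (far outside 25% of $245) would earn a cost knowledge score of 1 of 5.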
RESULTS
Of all 1,238 trainees enrolled in internship, residency, and fellowship programs at Emory University School of Medicine, 381 physicians-in-training submitted completed surveys, for a response rate of 30.8%. The demographic characteristics of the study population are summarized in Table 1. Respondents were spread quite evenly from the PGY 1 to PGY 5 levels. Approximately half of respondents were residents (54.9%), with the remainder (45.1%) split closely between interns and fellows. Analyzing the study population as a whole demonstrates that only 5.7% (109/1,905) of estimates across all five diagnostic imaging examinations were correct (within 25% of actual Medicare allowable amounts) (Table 2). The percentage of correct responses by examination ranged from a low of 1.8% for the unenhanced MRI lumbar spine to a high of 10.2% for
Table 2. Cost estimates by imaging examination

Examination                              Medicare Actual Allowable    Mean Estimate in       Mean Absolute Error    Estimates Within   Estimates Within
                                         Total Fee (Whole Dollars)    Dollars (SD)           in Dollars (%)         25%, % (n)         50%, % (n)
Two-view chest radiography               31                           166.23 (141.19)        136.93 (441.7%)        2.9 (11)           8.4 (32)
Contrast-enhanced CT abdomen & pelvis    342                          1,251.72 (1,876.11)    947.08 (276.9%)        10.2 (39)          26.0 (99)
Unenhanced CT of the brain               128                          1,115.42 (1,849.42)    989.43 (773.0%)        5.3 (20)           6.0 (23)
Unenhanced MRI of the lumbar spine       245                          3,095.12 (13,071.99)   2,851.98 (1,164.1%)    1.8 (7)            3.2 (12)
Complete abdominal ultrasound            136                          313.23 (249.37)        202.77 (149.1%)        8.4 (32)           38.6 (147)
All examinations                         -                            -                      -                      5.7 (109)          16.4 (313)
the contrast-enhanced CT abdomen and pelvis. Even when the definition of "correct" is expanded to a range of 50% above or below the Medicare value, only 16.4% of estimates across all five examinations fall into this range. When assessing each respondent's performance using previously defined imaging cost knowledge scores, 76.4% (291/381) incorrectly estimated the cost of all five examinations (ie, cost knowledge score of 0). Only one respondent estimated the cost of more than two of five examinations correctly. The mean cost knowledge score across all respondents was 0.29/5. The degree of estimation error for each examination is rather large, with mean absolute estimation errors ranging from $137 for two-view chest radiography to $2,852 for unenhanced MRI lumbar spine. Almost 87% (1,653/1,905) of all responses were incorrect overestimates, and 7.5% (143/1,905) were incorrect underestimates (Fig. 1). The degree of absolute estimation error across all five examinations was not associated with the number of examinations that the respondent requested per week (r = 0.02), within the limits of their subjective response. Similarly, the PGY level of the trainee was not correlated with overall estimation accuracy (r = -0.045). Nearly 60% of the study population (227/381) reported participating in a radiology elective during medical school. Two-thirds of all respondents (252/381) reported no previous formal education related to the costs of imaging during medical school or postgraduate training. There was no significant difference in cost knowledge
scores between those who participated in a medical school radiology elective and those who did not (P = .14) or between those who reported no prior education related to imaging costs and those who did report prior education (P = .77). Radiology trainees demonstrated a slightly higher rate of correct cost estimates, averaging 0.53 correct responses out of 5, compared with 0.23 correct responses out of 5 for non-radiology trainees (P = .0034). Additionally, when comparing the non-radiology group in this study with the recent nationwide survey of radiology trainees performed by our group, there was a noticeable difference in the overall percentage of correct responses: 5.7% and 17.1%, respectively [13]. Only 17.5% (67/381) of all respondents considered their imaging cost knowledge adequate. Along the same lines, the majority of respondents (75.3%; 287/381) desire integration of basic cost data into their CDS and/or CPOE systems.
DISCUSSION
Trainee awareness of the costs of commonly performed diagnostic imaging examinations was assessed via a single-site survey at a large academic medical center with 1,238 interns, residents, and fellows. Garnering a 30.8% response rate, our survey demonstrates a striking knowledge gap with regard to common imaging costs, with 94.3% of all trainee estimates falling outside of the correct range (within 25% of Medicare allowable fees). Approximately three-fourths of respondents (76.4%) incorrectly
Fig 1. Cost estimates by examination type.
estimated the costs of all five diagnostic imaging examinations. Only one trainee was able to correctly estimate the cost of more than two of five examinations. The mean absolute percentage error across all examinations was 561%. Estimation accuracy was not correlated with the number of imaging examinations ordered per week by trainees, nor was it associated with PGY level. Participation in a medical school radiology elective or receipt of dedicated cost-related education did not lead to higher cost knowledge scores. Encouragingly, trainees appear aware of this knowledge gap: 82.5% of respondents considered their imaging cost knowledge inadequate, and three-fourths of trainees support integration of cost information into CDS and/or CPOE tools. Our results both comport with and complement prior literature. Previous studies in the medical student and practicing physician populations have demonstrated significant knowledge gaps with regard to costs of imaging [3-8]. One recent study surveyed a much smaller number of exiting senior residents at a single time point regarding their knowledge of a few laboratory and diagnostic imaging examinations and demonstrated an overall estimation accuracy rate of 25% [17]. Responses from 1,066 radiology-only trainees in a recent nationwide survey demonstrated a substantial knowledge gap in regard to the costs of commonly performed diagnostic imaging examinations, with only 17% of all responses across five examinations falling within 25% of Medicare allowable fees [13]. Although the lack of awareness among radiology trainees regarding the costs associated with their own specialty is concerning, it may be less relevant than the results of our analysis, because radiologists rarely order diagnostic imaging examinations and, as such, may exert less control over diagnostic imaging spending than physicians from other specialties.
To our knowledge, no larger-scale study has specifically assessed the imaging cost awareness of physician trainees across all specialties and years of postgraduate training. Many opportunities for improved education and training seem to exist. In a recent multicenter survey of fourth-year medical students, significant knowledge deficits were identified in the areas of costs of imaging, radiation risk knowledge, appropriateness of imaging examinations, and patient safety considerations [18]. These findings suggest that graduating medical students may be unprepared to order the right imaging examinations for their patients in the safest, most appropriate, and most judicious manner. Theoretically, trainees across a variety of fields could gain these
requisite skills over time via focused teaching, interactions with radiologists, and self-study. However, the opportunity for physicians of all specialties to develop a strong educational foundation in pertinent radiology-related areas begins during medical school. The Alliance of Medical Student Educators in Radiology National Medical Student Curriculum in Radiology offers a structured framework that can be implemented longitudinally throughout medical school, covering the aforementioned material essential for providers to care for their patients [18,19]. Complementing this type of medical school educational initiative is the opportunity to integrate cost-related information into emerging CDS and/or CPOE platforms. This option is particularly timely, given recently passed legislation mandating CDS consultation for Medicare reimbursement of advanced diagnostic imaging examinations [20]. There are several limitations to our study. This survey was administered at a single academic medical center, which potentially limits the external applicability of our findings to other practice settings or geographic locations. Although our study population was restricted to trainees, it is possible that the cost awareness of practicing physicians is not significantly different [3-8]. Also, there is the potential for nonresponder bias, given an overall completed response rate of 30.8%, a limitation common to most voluntary surveys. Along the same lines, the voluntary nature of the study could have resulted in a skewed sample, as trainees with a specific interest in imaging and/or health care economics could have self-selected to participate. Similar to prior studies [13-15,17], we used CMS Medicare Part B Physician Fee Schedule national average allowable fees as a surrogate for true cost to the health care system at large.
Although we explicitly instructed respondents to take this into consideration when making their estimates, given the preponderance of substantial overestimates, it is possible that many trainees based their estimates on hospital charges or list prices of diagnostic imaging examinations rather than actual payments for those services. In conclusion, trainees across a variety of specialties in our health care system demonstrated very limited awareness of the costs associated with commonly performed diagnostic imaging examinations. The combination of early focused imaging cost education and ongoing access to relative costs through clinical decision support systems could, it is hoped, afford providers both the perspective and the necessary information in real time to help them choose the highest-value appropriate examination for their patients.
TAKE-HOME POINTS

- Physician trainees demonstrate limited awareness of the costs of commonly performed diagnostic imaging examinations.
- Focused cost-related education at the medical school level is needed to prepare trainees to identify the highest-value imaging examinations appropriate for their patients.
- Integration of cost information into clinical decision support tools may help physicians become lifelong stewards of high-value imaging.
REFERENCES
1. Keehan SP, Cuckler GA, Sisko AM, et al. National health expenditure projections, 2014-24: spending growth faster than recent trends. Health Aff (Millwood) 2015;34:1407-17.
2. Tilburt JC, Wynia MK, Sheeler RD, et al. Views of US physicians about controlling health care costs. JAMA 2013;310:380-9.
3. Okike K, O'Toole RO, Pollak AN, et al. Survey finds few orthopedic surgeons know the costs of the devices they implant. Health Aff (Millwood) 2014;33:103-9.
4. Broadwater-Hollifield C, Gren LH, Porucznik CA, Youngquist ST, Sundwall DN, Madsen TE. Emergency physician knowledge of reimbursement rates associated with emergency medical care. Am J Emerg Med 2014;32:498-506.
5. Allan MG, Lexchin J. Physician awareness of diagnostic and nondrug therapeutic costs: a systematic review. Int J Technol Assess Health Care 2008;24:158-65.
6. Rock TA, Rui X, Fieldston E. General pediatric attending physicians' and residents' knowledge of inpatient hospital finances. Pediatrics 2013;131:1072-80.
7. Sehgal R, Gorman P. Internal medicine physicians' knowledge of healthcare charges. J Grad Med Educ 2011;3:182-7.
8. Graham JD, Potyk D, Raimi E. Hospitalists' awareness of patient charges associated with inpatient care. J Hosp Med 2010;5:295-7.
9. Lee DW, Levy F. The sharp slowdown in growth of medical imaging: an early analysis suggests combination of policies was the cause. Health Aff (Millwood) 2012;31:1876-84.
10. Smith-Bindman R. Rising use of diagnostic medical imaging in a large integrated health system. Health Aff (Millwood) 2008;27:1491-502.
11. Dodoo MS, Duszak R Jr, Hughes DR. Trends in the utilization of medical imaging from 2003 to 2011: clinical encounters offer a complementary patient-centered focus. J Am Coll Radiol 2013;10:507-12.
12. Iwashyna TJ, Fuld A, Asch DA, Bellini LM. The impact of residents, interns, and attendings on inpatient laboratory ordering patterns: a report from one university's hospitalist service. Acad Med 2011;86:139-45.
13. Vijayasarathi A, Hawkins CM, Hughes DR, Mullins ME, Duszak R Jr. How much do common imaging studies cost? A nationwide survey of radiology trainees. AJR Am J Roentgenol 2015;205:1-7.
14. Pickhardt PJ, Hassan C, Laghi A, Kim DH. CT colonography to screen for colorectal cancer and aortic aneurysm in the Medicare population: cost-effectiveness analysis. AJR Am J Roentgenol 2009;192:1332-40.
15. Carlos RC, Axelrod DA, Ellis JH, Abrahamse PH, Fendrick AM. Incorporating patient-centered outcomes in the analysis of cost-effectiveness: imaging strategies for renovascular hypertension. AJR Am J Roentgenol 2003;181:1653-61.
16. Centers for Medicare and Medicaid Services. National Physician Fee Schedule Search. Available at: http://www.Medicare.gov/apps/physician-fee-schedule/search/search-criteria.aspx. Accessed June 9, 2014.
17. Long T, Silvestri MT, Dashevsky M, Halim A, Fogerty RL. Exit survey of senior residents: cost conscious but uninformed. J Grad Med Educ 2016;8(2):248-51.
18. Prezzia C, Vorona G, Greenspan R. Fourth-year medical student opinions and basic knowledge regarding the field of radiology. Acad Radiol 2013;20:272-83.
19. AMSER National Medical Student Curriculum in Radiology. Available at: http://www.aur.org/Affiliated_Societies/AMSER/amser_curriculum.cfm. Accessed March 3, 2016.
20. Protecting Access to Medicare Act of 2014, Pub L No 113-93, 128 Stat 1040. 113th Congress, H.R. 4302, 2014.