ORIGINAL ARTICLE
Clinical Access and Utilization of Reports and Images in Neuroradiology

Matthew D. Alvin, MD, MBA, MS, MA (a), Mona Shahriari, MD (a), Evan Honig (b), Li Liu, MS (a), David M. Yousem, MD, MBA (a)

(a) Russell H. Morgan Department of Radiology and Radiological Science, Johns Hopkins Medical Institution, Baltimore, Maryland. (b) University of Pennsylvania, Philadelphia, Pennsylvania. Corresponding author and reprints: Matthew D. Alvin, MD, MBA, MS, MA, 600 N Wolfe St, Baltimore, MD 21287; e-mail: [email protected]. Funding was obtained from the Division of Neuroradiology, Johns Hopkins Department of Radiology. The authors have no conflicts of interest related to the material discussed in this article. Matthew D. Alvin, MD, MBA, MS, MA, and Mona Shahriari, MD, are co-first authors.

Abstract

Background: The radiology report serves as the primary means of communication between radiologist and clinician. However, the value clinicians place on imaging and reports is variable, with the images or reports of many studies never being viewed. This has implications for the perceived value of the radiologist in the imaging chain. We hypothesized that neurologists, neurosurgeons, and otolaryngologists would view neuroradiology images most frequently and neuroradiology reports least frequently of all medical specialties.

Materials and Methods: Ordering data were collected on all neuroradiology studies over a 1-month period. Date and time stamps were obtained for (1) when imaging study orders were placed, (2) when the patient underwent the imaging study, (3) when the imaging studies were viewed, and (4) when the radiology reports were accessed and by whom. Each data point included provider names, locations, departments, and level of training.

Results: There were 7,438 neuroradiology imaging studies ordered and completed. Overall, 85.7% (6,372) of reports and 53.2% (3,956) of imaging studies were viewed, and 13.1% (977) of studies had neither images nor reports viewed. Inpatient neurosurgeons and neurologists viewed both imaging and reports significantly more than primary care specialties (P < .001). In the outpatient setting, this trend held for neurosurgeons but not for neurologists (P < .001). Outpatient study imaging and reports were both viewed the least (48.6%), and inpatient study reports were viewed the most (95.2%; P < .001).

Conclusion: Viewing of imaging and reports varies, with neurosurgeons viewing neuroradiology studies more than all other medical specialties. Overall, the reports were viewed significantly more than the images, suggesting that the radiologist and his or her interpretation are more valuable than the study's images. The radiologists' value, as measured by reports viewed, was maximal with obstetricians and gynecologists and psychiatry clinicians.

Key Words: Utilization, value-based care, value, viewing reports, viewing imaging

J Am Coll Radiol 2018;-:---. Copyright 2018 American College of Radiology
INTRODUCTION
The ready availability of images provided via radiology PACS networks is thought to add value to clinicians and radiologists alike. At the same time, the electronic medical record (EMR) provides immediate access to
radiologists' reports, thereby improving health care quality and reducing delays that can lead to added costs [1]. However, the value of the report and of the images themselves may vary by referrer, and if referring clinicians do not find benefit in one or the other, it calls into question the role of the radiologist and the function of the report. Prior studies have demonstrated that although most internists believe the radiology report is important, fewer than one-half of specialists feel the same [1-3]. This importance may differ by specialty and with the emergence of new techniques with which clinicians may not be comfortable (eg, ultrasound elastography, MR tractography, kurtosis maps, or CT perfusion). Alternatively, some specialists,
such as neurosurgeons, may be more comfortable than radiologists with the surgical planning ramifications of intracranial imaging. Neurologists may combine the imaging with their clinical findings to arrive at diagnoses different from those of neuroradiologists (who rely solely on the imaging studies). Emergency medicine physicians may refer only to the final report for advanced imaging, given the time constraints in the ED that limit review of all images in multiseries modalities [4].

As a means of assessing a radiologist's value in the patient care chain, we set out to determine the extent to which referring clinicians view the images and reports of the neuroradiology studies they order. Determining clinicians' utilization of the PACS and EMR makes it possible to evaluate whether clinicians value the radiology department based on the images (PACS) rather than the reports (EMR). We hypothesized that (1) general medicine and non-neurosciences specialties, as well as outpatient providers, would rely on (view) the reports more often than neuroscience subspecialty services and inpatient providers and (2) members of the neurosciences (neurology, neurosurgery, psychiatry) and otolaryngology specialties, by virtue of their familiarity with the anatomy and pathology of the region, would view the images more frequently, without referring to the reports.
METHODS
This study was performed with institutional review board approval (#00102719) and was HIPAA compliant. Through our institutional clinical data research center, data were retrospectively collected on all neuroradiology studies from all patient settings ordered over a 1-month period (September 1, 2016, through September 30, 2016). The median follow-up for the images performed and studies reported was 197 days (interquartile range 187-205). Study modalities included MRI and MR angiography (brain, orbits, face, neck, spine) with and without intravenous contrast, CT and CT angiography (brain, spine, maxillofacial, neck, sinus) with and without intravenous contrast, fluoroscopy (lumbar puncture, myelogram), and ultrasound (neonatal brain and spine). All studies were interpreted by fellowship-trained neuroradiologists. De-identified patient data obtained included age, gender, and ethnicity.

Imaging study date and time stamps were obtained for (1) when imaging study orders were placed, (2) when the patient actually underwent the imaging study, (3) when the imaging was accessed (the first time), and (4) when the radiology reports were accessed (the first time). Imaging and report access was
tracked through 8 months (April 2017) after the orders took place (September 2016) to allow for delays in patients undergoing their imaging studies and in providers viewing the reports. This access information was obtained from the viewing logs of the EMR (Epic Systems, Verona, Wisconsin) and, in a separate analysis, of the institutional PACS (Carestream Health, Rochester, New York).

Each imaging study data point included ordering and viewing provider names and locations. Locations were classified into one of four categories: inpatient, emergency department, outpatient, or community, based on the ordering provider's location at the time the order was placed. The outpatient category encompasses providers in clinics directly connected to the hospital who see patients in both the inpatient and outpatient settings. The community category encompasses providers who are part of outpatient community practices and who otherwise have minimal connection to the main hospital. Providers were classified by department and type. The medicine department category included all general internal medicine providers as well as subspecialties, such as nephrology, pulmonology, and cardiology. The surgery category included all general and subspecialty surgeons aside from orthopedics, neurosurgery, and otolaryngology. Provider types included faculty (staff), trainee (medical student, resident, fellow), and physician assistant or nurse practitioner. Our institutional credentialing office provided a list of providers with departments and types that partially completed the data; the remaining providers were identified manually through our institution's online paging system or Internet searches.

Our Neuroradiology Division is unique in that the report turnaround time between 8 AM and 11 PM averages less than 1 hour from study completion. By virtue of this rapid turnaround, very few clinicians review cases with the neuroradiology team in person during these hours. Six neuroradiology attendings at our institution were surveyed regarding the average number of studies they review in person in the reading room with clinicians from neurology or neurosurgery in a week. Responses ranged from 0 to 2 studies per week, or 0 to 8 per month, which equates on average to <0.1% of the imaging in our sample whose in-person views may escape our methodology for recording views of images and reports by clinicians. Between 11 PM and 7 AM, residents read cases by themselves, creating preliminary reports, which are signed by faculty neuroradiologists the next morning between 7 AM and 9 AM.
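To make the viewing categories used throughout the Results concrete, the sketch below (Python, purely illustrative; the function and field names are ours, not those of the Epic or Carestream systems) shows how a completed study can be assigned to one of the four mutually exclusive categories reported in the tables, based on whether its images and report were ever accessed.

```python
from typing import Optional
from datetime import datetime

def viewing_category(first_image_view: Optional[datetime],
                     first_report_view: Optional[datetime]) -> str:
    """Assign a completed study to one of four viewing categories
    from the first-access timestamps of its images and report."""
    if first_image_view and first_report_view:
        return "both viewed"
    if first_image_view:
        return "imaging only"
    if first_report_view:
        return "reports only"
    return "neither viewed"

# Example: a study whose report was opened but whose images never were.
print(viewing_category(None, datetime(2016, 9, 14, 10, 5)))  # "reports only"
```

The table columns "Imaging (Only + With Reports)" and "Reports (Only + With Imaging)" then simply combine the "both viewed" category with the corresponding "only" category.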
Frequency tables with χ² tests were used to compare viewing rates between groups. Statistical significance was defined as P < .05. All analyses were done using Stata 14 (StataCorp LP, College Station, Texas).
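As an illustration of these frequency-table comparisons (the study used Stata 14; the sketch below uses Python with SciPy, and the counts are hypothetical, not taken from our tables), a 2 × 2 χ² test compares the report-viewing rates of two departments:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows are departments, columns are
# [report viewed, report not viewed] among the studies each ordered.
observed = [
    [900, 100],  # department A
    [700, 300],  # department B
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, dof = {dof}, P = {p:.3g}")
# A P value below .05 would indicate the viewing rates differ.
```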
RESULTS
Demographics
There were 8,268 imaging studies specific to neuroradiology ordered in September 2016. Of those ordered, 7,438 (90%) were completed; 830 studies (1 inpatient, 190 outpatient, and 639 community provider-ordered studies) were excluded because they were ordered but never performed. The studies were ordered for 5,185 unique patients, 4,499 (87%) of whom actually had the studies completed.

Results by Modality
The distribution of studies by type is shown in Table 1 and Figure 1. The most commonly ordered studies were MRI brain (2,351; 31.6%) and CT head (1,700; 22.9%), and the least commonly ordered were fluoroscopic imaging (103; 1.4%) and ultrasound (120; 1.6%). Across all referrers, the most commonly viewed imaging studies were CT head (64.4%) and MRI brain (59.8%), and the least commonly viewed were fluoroscopy (8.7%), MRI neck (25.0%), and CT spine (39.6%). The most commonly viewed reports were CT neck (92.2%) and ultrasound (91.7%), and the least commonly viewed reports were fluoroscopy (72.8%), MRI spine (79.3%), and CT spine (82.1%). Clinicians viewed both the images and reports of CT
head and MRI brain studies most commonly (62.6% and 58.6%, respectively), and for these studies clinicians were least likely to view only the reports (26.4% and 28.8%, respectively). Overall, 85.7% (6,372) of reports were viewed, 53.2% (3,956) of imaging studies were viewed, and 13.1% (977) of all orders had neither imaging nor reports viewed. Finally, only 89 (1.2%) of all orders had only the imaging viewed (ie, images viewed but the report never viewed).
Results by Specialty: Images Viewed
Table 2 and Figures 2 and 3 demonstrate the distribution of orders, imaging viewed, and reports viewed among departments, provider settings, and provider types. Anesthesiology (77.9%), neurosurgery (72.3%), and orthopedics (69.0%) had the highest percentages of imaging viewed, and urology (17.6%), neurology (38.6%), and physical medicine and rehabilitation (PM&R) (42.2%) had the lowest percentages of imaging viewed (P < .001). Neurology viewed significantly (P < .001) less imaging than the overall (all specialties) image viewing rate (56.0%), which disagreed with our original hypothesis. Furthermore, we found that only five specialties viewed images alone (without reports) in more than 1% of studies: PM&R (2.2%), emergency medicine (1.6%), orthopedics (1.2%), neurology (1.1%), and otolaryngology (1.1%). Neurosurgery viewed images without reports in 0.7% and psychiatry in 0% of studies.

Results by Specialty: Reports Viewed
Surgery, obstetrics and gynecology, anesthesiology, urology, and psychiatry always viewed imaging in conjunction with a report; they never viewed the images alone.
Table 1. Study type

| Study Type | Orders | Imaging Only | Imaging (Only + With Reports) | Reports Only | Reports (Only + With Imaging) | Both Viewed | Neither Viewed |
| CT head | 1,700 | 30 (1.8%) | 1,095 (64.4%) | 449 (26.4%) | 1,514 (89.1%) | 1,065 (62.6%) | 156 (9.2%) |
| CT maxillofacial | 558 | 9 (1.6%) | 253 (45.3%) | 232 (41.6%) | 476 (85.3%) | 244 (43.7%) | 73 (13.1%) |
| CT neck | 395 | 1 (0.3%) | 216 (54.7%) | 149 (37.7%) | 364 (92.2%) | 215 (54.4%) | 30 (7.6%) |
| CT spine | 687 | 7 (1.0%) | 272 (39.6%) | 299 (43.5%) | 564 (82.1%) | 265 (38.6%) | 114 (16.9%) |
| Fluoroscopy | 103 | 0 (0%) | 9 (8.7%) | 66 (64.1%) | 75 (72.8%) | 9 (8.7%) | 28 (27.2%) |
| MRI brain | 2,351 | 28 (1.2%) | 1,406 (59.8%) | 677 (28.8%) | 2,055 (87.4%) | 1,378 (58.6%) | 268 (11.4%) |
| MRI neck | 132 | 1 (0.8%) | 33 (25.0%) | 78 (59.1%) | 110 (83.3%) | 32 (24.2%) | 21 (15.9%) |
| MRI spine | 1,392 | 12 (0.9%) | 609 (43.8%) | 507 (36.4%) | 1,104 (79.3%) | 597 (42.9%) | 276 (19.8%) |
| Ultrasound | 120 | 1 (0.8%) | 63 (52.5%) | 48 (40.0%) | 110 (91.7%) | 62 (51.7%) | 9 (7.5%) |
| Total overall | 7,438 | 89 (1.2%) | 3,956 (53.2%) | 2,505 (33.7%) | 6,372 (85.7%) | 3,867 (52.0%) | 977 (13.1%) |

"Imaging (Only + With Reports)" is inclusive of "Imaging Only" and "Both Viewed"; "Reports (Only + With Imaging)" is inclusive of "Reports Only" and "Both Viewed."
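For clarity, the footnote's column definitions can be written as explicit relations among the counts (our notation, not the journal's), which can be used to cross-check the tables:

Imaging (Only + With Reports) = Imaging Only + Both Viewed
Reports (Only + With Imaging) = Reports Only + Both Viewed
Orders = Imaging Only + Reports Only + Both Viewed + Neither Viewed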
Fig 1. Each imaging study comprises images and a report, of which one, both, or neither was viewed.
Obstetrics and gynecology (100%), psychiatry (98.0%), and anesthesiology (93.6%) had the highest percentages of reports viewed, and urology (64.7%), neurology (72.6%), and PM&R (82.2%) had the lowest percentages of reports viewed (P < .001). In contrast to neurology, neurosurgery was one of the highest report-viewing departments (89.8%; P < .001). With the exception of neurology, this differed from our hypothesis that the neurosciences and ear, nose, and throat specialists would view reports the least. Of those orders with neither imaging nor reports viewed, the highest percentages were found for urology (35.3%), neurology (27.2%), and orthopedics (17.1%), and the lowest percentages for obstetrics and gynecology (0%), psychiatry (2.0%), and anesthesiology (6.4%) (P < .001). Neurologists thus viewed the images less (see previous text) and also read the reports less (72.6% versus 85.7% overall; P < .001) than other specialties.
Study Site Location
Regarding the site of the ordered study, community patient study imaging was viewed the least (31.3%), and outpatient study reports were viewed the least (78.3%; P < .001). Inpatient study imaging (63.3%) and reports (95.2%) were both viewed the most (P < .001). Community clinicians never viewed images without also looking at the reports and had the highest rate of viewing only reports (50.0%; P < .001). Of the 977 orders with neither imaging nor reports viewed, 672 (68.8%) were outpatient or community patient studies.

Finally, imaging ordered by trainees had both the highest imaging (58.9%) and report viewing (94.1%) rates in comparison with orders by physician assistants or nurse practitioners (55.1% and 85.6%, respectively; P < .001) and faculty (47.2% and 78.4%, respectively; P < .001). Orders by trainees also had the lowest rate of neither images nor reports viewed (5.4%).

Table 3 compares inpatient (inpatient plus emergency) versus outpatient (outpatient plus community) studies by department. Neurosurgery outpatient studies were more likely to have only reports viewed (22.0%) than inpatient neurosurgery studies (13.8%; P < .001). This was also seen for pediatrics, anesthesiology, neurology, and otolaryngology. Other specialties, such as obstetrics and gynecology, orthopedics, medicine, surgery, and PM&R, had similar viewing patterns between inpatient and outpatient studies.
DISCUSSION
The radiology report is a critical piece in the value chain for imaging [1-12]. However, very few studies have evaluated whether radiology reports are viewed, and none have gone beyond surveying physicians about their report viewing and trusting their responses [3-5]. These studies rely on self-reporting, not raw data. Through the use of the Epic EMR and HIPAA-compliance requirements, we are now able to ascertain when imaging or a report is viewed and by whom, at least based on who is logged in on the computer through which the study or report is opened.
Table 2. Department, patient setting, and provider type

| Category | Orders | Imaging Only | Imaging (Only + With Reports) (TI) | Reports Only | Reports (Only + With Imaging) (TR) | Both Viewed | TR-TI (%) | Neither Viewed |
| Department |
| Medicine | 1,992 | 10 (0.5%) | 902 (45.3%) | 835 (41.9%) | 1,727 (86.7%) | 892 (44.8%) | 41.4 | 255 (12.8%) |
| EM | 1,408 | 26 (1.9%) | 639 (45.4%) | 593 (42.1%) | 1,206 (85.7%) | 613 (43.5%) | 40.3 | 176 (12.5%) |
| Surgery | 209 | 0 (0%) | 109 (52.2%) | 78 (37.3%) | 187 (89.5%) | 109 (52.2%) | 37.3 | 22 (10.5%) |
| OB-GYN | 54 | 0 (0%) | 24 (44.4%) | 30 (55.6%) | 54 (100%) | 24 (44.4%) | 55.6 | 0 (0%) |
| Pediatrics | 377 | 3 (0.8%) | 236 (62.6%) | 119 (31.6%) | 352 (93.4%) | 233 (61.8%) | 30.8 | 22 (5.8%) |
| Orthopedics | 252 | 4 (1.6%) | 174 (69.0%) | 46 (18.3%) | 216 (85.7%) | 170 (67.5%) | 16.7 | 32 (12.7%) |
| Neurosurgery | 1,460 | 28 (1.9%) | 1,055 (72.3%) | 284 (19.5%) | 1,311 (89.8%) | 1,027 (70.3%) | 17.5 | 121 (8.3%) |
| Anesthesiology | 140 | 0 (0%) | 109 (77.9%) | 22 (15.7%) | 131 (93.6%) | 109 (77.9%) | 15.7 | 9 (6.4%) |
| Neurology | 1,077 | 13 (1.2%) | 416 (38.6%) | 379 (35.2%) | 782 (72.6%) | 403 (37.4%) | 34.0 | 282 (26.2%) |
| Urology | 17 | 0 (0%) | 3 (17.6%) | 8 (47.1%) | 11 (64.7%) | 3 (17.7%) | 47.1 | 6 (35.3%) |
| Otolaryngology | 357 | 4 (1.1%) | 242 (67.8%) | 71 (19.9%) | 309 (86.6%) | 238 (66.7%) | 18.8 | 44 (12.3%) |
| PM&R | 45 | 1 (2.2%) | 19 (42.2%) | 19 (42.2%) | 37 (82.2%) | 18 (40.0%) | 40.0 | 7 (15.6%) |
| Psychiatry | 50 | 0 (0%) | 28 (56.0%) | 21 (42.0%) | 49 (98.0%) | 28 (56.0%) | 42.0 | 1 (2.0%) |
| Provider setting |
| Emergency | 1,690 | 30 (1.8%) | 795 (47.0%) | 697 (41.2%) | 1,462 (86.5%) | 765 (45.3%) | 39.5 | 198 (11.7%) |
| Inpatient | 2,390 | 8 (0.3%) | 1,512 (63.3%) | 771 (32.3%) | 2,275 (95.2%) | 1,504 (62.9%) | 31.9 | 107 (4.5%) |
| Outpatient | 3,166 | 51 (1.6%) | 1,589 (50.2%) | 941 (29.7%) | 2,479 (78.3%) | 1,538 (48.6%) | 28.1 | 636 (20.1%) |
| Community | 192 | 0 (0%) | 60 (31.3%) | 96 (50.0%) | 156 (81.3%) | 60 (31.3%) | 50.0 | 36 (18.8%) |
| Provider type |
| Faculty | 3,104 | 55 (1.8%) | 1,464 (47.2%) | 1,024 (33.0%) | 2,433 (78.4%) | 1,409 (45.4%) | 31.2 | 616 (19.9%) |
| Trainee | 2,700 | 13 (0.5%) | 1,591 (58.9%) | 963 (35.7%) | 2,541 (94.1%) | 1,578 (58.4%) | 35.2 | 146 (5.4%) |
| PA or NP | 1,634 | 21 (1.3%) | 901 (55.1%) | 518 (31.7%) | 1,398 (85.6%) | 880 (53.9%) | 30.4 | 215 (13.2%) |
| Total overall | 7,438 | 89 (1.2%) | 3,956 (53.2%) | 2,505 (33.7%) | 6,372 (85.7%) | 3,867 (52.0%) | 32.5 | 977 (13.1%) |

EM = emergency medicine; NP = nurse practitioner; OB-GYN = obstetrics and gynecology; PA = physician assistant; PM&R = physical medicine and rehabilitation; TR-TI = total reports minus total imaging, indicating how much more often reports were viewed than images. Department = which department placed the order for the imaging study; provider setting = where the imaging study order was placed; provider type = who placed the order for the imaging study. "Imaging (Only + With Reports)" is inclusive of "Imaging Only" and "Both Viewed"; "Reports (Only + With Imaging)" is inclusive of "Reports Only" and "Both Viewed."
Fig 2. Studies were separated by ordering department to analyze differences in what was viewed (imaging only, reports only, both, or neither). EM = emergency medicine; OB-GYN = obstetrics and gynecology; PM&R = physical medicine and rehabilitation.
In this study, we sought to identify who views both the imaging and the reports and which is viewed more often (ie, value in the radiologist versus the imaging). Overall, providers viewed imaging alone (without reports) on average only 1.2% of the time, strongly
implying the value of the report. In addition, the reports were viewed (85.7%) significantly more than the images (53.2%), further suggesting that the radiologist and his or her interpretation are more valuable than the images themselves. This discrepancy was maximal in obstetrics and gynecology, urology, and emergency medicine. On the other hand, the specialties that had the highest
Fig 3. Studies were separated by provider setting and type of provider to analyze differences in what was viewed (imaging only, reports only, both, or neither). PA/NP = physician assistant or nurse practitioner.
Table 3. Departments by inpatient or outpatient
| Category | Orders | Imaging Only | Imaging (Only + With Reports) (TI) | Reports Only | Reports (Only + With Imaging) (TR) | Both Viewed | TR-TI (%) | Neither Viewed |
| Inpatient |
| Medicine | 1,266 | 5 (0.4%) | 628 (49.6%) | 546 (43.1%) | 1,169 (92.3%) | 623 (49.2%) | 42.7 | 92 (7.3%) |
| EM | 1,398 | 26 (1.9%) | 607 (45.3%) | 589 (42.1%) | 1,196 (85.6%) | 607 (43.4%) | 40.3 | 176 (12.6%) |
| Surgery | 117 | 0 (0%) | 72 (61.5%) | 41 (35.0%) | 113 (96.6%) | 72 (61.5%) | 35.0 | 4 (3.4%) |
| OB-GYN | 49 | 0 (0%) | 22 (44.9%) | 27 (55.1%) | 49 (100%) | 22 (44.9%) | 55.1 | 0 (0%) |
| Pediatrics | 326 | 3 (0.9%) | 216 (66.3%) | 96 (29.5%) | 309 (94.8%) | 213 (65.3%) | 28.5 | 14 (4.3%) |
| Orthopedics | 22 | 1 (4.6%) | 18 (81.8%) | 4 (18.2%) | 21 (95.5%) | 17 (77.3%) | 13.6 | 0 (0%) |
| Neurosurgery | 457 | 2 (0.4%) | 390 (85.3%) | 63 (13.8%) | 451 (98.7%) | 388 (84.9%) | 13.3 | 4 (0.9%) |
| Anesthesiology | 122 | 0 (0%) | 103 (84.4%) | 17 (13.9%) | 120 (98.4%) | 103 (84.4%) | 13.9 | 2 (1.6%) |
| Neurology | 190 | 0 (0%) | 138 (72.6%) | 45 (23.7%) | 183 (96.3%) | 138 (72.6%) | 23.7 | 7 (3.7%) |
| Urology | 0 | 0 (0%) | 0 (0%) | 0 (0%) | 0 (0%) | 0 (0%) | 0 | 0 (0%) |
| Otolaryngology | 67 | 0 (0%) | 55 (82.1%) | 9 (13.4%) | 64 (95.5%) | 55 (82.1%) | 13.4 | 3 (4.5%) |
| PM&R | 20 | 1 (5.0%) | 8 (40.0%) | 10 (50.0%) | 17 (85.0%) | 7 (35.0%) | 45.0 | 2 (10%) |
| Psychiatry | 246 | 0 (0%) | 24 (9.8%) | 21 (45.6%) | 45 (18.3%) | 24 (52.2%) | 8.5 | 1 (2.2%) |
| Total | 4,080 | 38 (0.9%) | 2,307 (56.5%) | 1,468 (36.0%) | 3,737 (91.6%) | 2,269 (55.6%) | 35.0 | 305 (7.5%) |
| Outpatient |
| Medicine | 726 | 5 (0.7%) | 274 (37.7%) | 289 (39.8%) | 558 (76.9%) | 269 (37.1%) | 39.1 | 163 (22.5%) |
| EM | 10 | 0 (0%) | 6 (60.0%) | 4 (40.0%) | 10 (100%) | 6 (60.0%) | 40.0 | 0 (0%) |
| Surgery | 92 | 0 (0%) | 37 (40.2%) | 37 (40.2%) | 74 (80.4%) | 37 (40.2%) | 40.2 | 18 (19.6%) |
| OB-GYN | 5 | 0 (0%) | 2 (40.0%) | 3 (60.0%) | 5 (100%) | 2 (40.0%) | 60.0 | 0 (0%) |
| Pediatrics | 51 | 0 (0%) | 20 (39.2%) | 23 (45.1%) | 43 (84.3%) | 20 (39.2%) | 45.1 | 8 (15.7%) |
| Orthopedics | 230 | 3 (1.3%) | 156 (67.8%) | 42 (18.3%) | 195 (84.8%) | 153 (66.5%) | 17.0 | 32 (13.9%) |
| Neurosurgery | 1,003 | 26 (2.6%) | 665 (66.3%) | 221 (22.0%) | 860 (85.7%) | 639 (63.7%) | 19.4 | 117 (11.7%) |
| Anesthesiology | 18 | 0 (0%) | 6 (33.3%) | 5 (27.8%) | 11 (61.1%) | 6 (33.3%) | 27.8 | 7 (38.9%) |
| Neurology | 887 | 13 (1.5%) | 278 (31.3%) | 334 (37.7%) | 599 (67.5%) | 265 (29.9%) | 36.2 | 275 (31.0%) |
| Urology | 17 | 0 (0%) | 3 (17.6%) | 8 (47.1%) | 11 (64.7%) | 3 (17.7%) | 47.1 | 6 (35.3%) |
| Otolaryngology | 290 | 4 (1.4%) | 187 (64.5%) | 62 (21.4%) | 245 (84.5%) | 183 (63.1%) | 20.0 | 41 (14.1%) |
| PM&R | 25 | 0 (0%) | 11 (44.0%) | 9 (36.0%) | 20 (80.0%) | 11 (44.0%) | 36.0 | 5 (20.0%) |
| Psychiatry | 4 | 0 (0%) | 4 (100%) | 0 (0%) | 4 (100%) | 4 (100%) | 0 | 0 (0%) |
| Total | 3,358 | 51 (1.5%) | 1,649 (49.1%) | 1,037 (30.9%) | 2,635 (78.5%) | 1,598 (35.4%) | 29.4 | 672 (20.0%) |

EM = emergency medicine; OB-GYN = obstetrics and gynecology; PM&R = physical medicine and rehabilitation; TR-TI = total reports minus total imaging, indicating how much more often reports were viewed than images. "Imaging (Only + With Reports)" is inclusive of "Imaging Only" and "Both Viewed"; "Reports (Only + With Imaging)" is inclusive of "Reports Only" and "Both Viewed."
rate of images viewed to reports viewed were anesthesiology, orthopedics, neurosurgery, and otolaryngology. Neurology was low for both viewing images and viewing reports and had a relatively high rate of nonviews of both. How do we explain these patterns? Based on our departmental physician interactions, specialties other than neurology or neurosurgery are likely less comfortable with neuroradiology studies, partly because of the lower volumes they order. Otolaryngology ordering in neuroradiology is dominated by CT sinus imaging, which is more basic to interpret and requires image viewing by otolaryngologists to plan surgical approaches and interventions. We surmise that the high rate of images viewed to reports viewed by anesthesiologists is because of the prevalence of
anesthesiologists who staff our institutional neurocritical care units and the high volume of studies they order. Similarly, anesthesiologists who perform pain management procedures likely look at imaging studies at a higher rate. Neurosurgeons fell within the expected range in the study, having both a very high imaging view rate (72.3%) and a very high report view rate (89.8%). In contrast, it was surprising that neurologists had a high rate of nonviews of both; the cause of this is the subject of a planned follow-up study.

Prior studies have also found variability in viewing reports. In 2011, Bosmans et al [3] conducted an Internet survey of clinical specialists and general practitioners (n = 735) and another of radiologists (n = 138). They found that 83% of clinicians said they read the report as
soon as it is available, with only 8% reporting that they do not read the report at all. The present study looked instead at actual viewing of imaging and reports based on access through the EMR rather than survey data, and found a much higher percentage (14.3%) of providers not reading reports. Part of this discrepancy is likely because of our larger sample size and possibly elements of nonresponse bias and the low response rate (21%) in the Bosmans et al [3] surveys. Self-reporting by clinicians is prone to biases that do not affect our methodology of data retrieval through electronic monitoring.

Similarly, Branco et al [4] surveyed 63 inpatient neurologists, neurosurgeons, and psychiatrists regarding their perception of reports for conventional brain MRI. Most neurologists (90%) and neurosurgeons (93%) read both the report and the images, whereas 39% of psychiatrists and 11% of neurologists read only the report [4]. No neurosurgeons read only the report, which makes sense because neurosurgeons usually view the imaging for resections or biopsies and have more experience viewing imaging than other specialties, such as psychiatrists, who likely rely more heavily on the reports. However, in the bustle of clinic, for follow-up postoperative studies, neurosurgeons may not review imaging for every patient and may instead just view the report to confirm the absence of residual disease or recurrence. Our results support this hypothesis and show not only that outpatient studies ordered by neurosurgeons (in comparison with inpatient studies) are less likely to have images viewed, but also that neurosurgeons are more likely to rely only on the reports. In addition, outpatient neurosurgery studies are more likely to never have either images or reports accessed compared with inpatient studies.

Higher than expected, 13.1% of all ordered and completed studies had neither reports nor images viewed. Given the costs of imaging and radiation exposure, this finding should be investigated further to determine why these studies are performed and why their results are not being reviewed. Our data also showed that community and outpatient providers combined accounted for 69% of all studies that had neither imaging nor reports viewed. Multiple explanations may be responsible for this finding. Do these providers have difficulty accessing the report? Do they rely on our robust critical findings communication program to alert them [13,14]? Do they have another means to retrieve the reports? Are these patients being lost to follow-up? Whatever the cause, this too deserves further investigation.

Multiple limitations are inherent in the present study, some of which may help explain discrepancies in report and imaging views:
- The follow-up period of 8 months after ordering was not based on any specific criteria; nonetheless, we felt confident that this period could account for patient delays in completing the examinations and for providers viewing the reports at typical outpatient follow-up intervals (eg, 3- or 6-month oncologic follow-up).
- The algorithm for obtaining the data was based on a provider clicking on the radiology report in the EMR system. To increase comprehensiveness, we also obtained logs of PACS access of imaging and reports. This was a comprehensive yet imperfect way of assessing provider viewership of reports but complies with HIPAA regulations.
  - For example, some inpatient attending physicians may have viewed the reports on a trainee's computer screen; this would be classified as a view by a trainee, not by the attending looking over the trainee's shoulder. However, whoever did log in or access the imaging or report was assumed to have relayed any critical information to the care team or attending physician.
  - In addition, some outpatient providers may obtain access to reports outside of the EMR and PACS (eg, a fax from an imaging center), which would not be included in our data and would thus overestimate the percentage of reports or imaging not viewed.
  - Regarding in-person reviews of imaging in the reading room, as mentioned in the Methods section, this is extremely rare in our neuroradiology department, representing <0.1% of our data. However, tumor board review of studies may account for some views not recorded.
  - Critical findings discussed over the telephone may cause a provider not to view a report or imaging study. Because these account for a small percentage of all studies performed, we believe this is not a large source of error.
- Care provider ordering and viewing patterns may have influenced the results. For example, if a single provider in a specific specialty orders studies more frequently than his or her colleagues and chooses to view only images or only reports, then that specialty's percentage of viewing only images or only reports may be skewed by that provider. However, given the large size of our academic quaternary care center, we assumed this did not significantly alter the results.
Despite these limitations, this study represents the first comprehensive evaluation of neuroradiology report and study viewership. Future studies may be performed
aimed at strategies by which radiologists provide greater value to the health care system.
CONCLUSION
Utilization of imaging and reports varies by department, provider setting, and provider level of training. Two of the specialties that are primary referrers to neuroradiology, neurosurgery and otolaryngology, rely more heavily on the images relative to the reports than other departments do, whereas neurology does so less. Even so, neurosurgeons view images alone, without viewing reports, in less than 1% of all cases. Overall, the reports (85.7%) are viewed much more commonly than the images (53.2%), suggesting that radiologists' value remains paramount. We were surprised to find that one in eight neuroradiology studies ordered had neither its report nor its imaging viewed and that this occurred in 27.2% of cases ordered by neurologists. These findings are currently being analyzed in a separate study.

TAKE-HOME POINTS

- Viewing of imaging and reports varies among departments, with neurosurgeons and otolaryngologists viewing neuroradiology studies more than other non-neurosciences specialties.
- Overall, radiology reports were viewed significantly more than the images, and images alone were viewed in only 1.2% of all studies, strongly suggesting that the radiologist and his or her interpretation are more valuable than the study's images.
- One in eight neuroradiology studies had neither its report nor its imaging viewed, including 27.2% of cases ordered by neurologists, which raises concerns regarding cost and radiation exposure.
REFERENCES
1. Dako F, Schreyer K, Burshteyn M, Cohen G, Belden C. Expanding radiology's role in a value-based health economy. J Am Coll Radiol 2017;14:622-4.
2. Kabadi SJ, Krishnaraj A. Strategies for improving the value of the radiology report: a retrospective analysis of errors in formally over-read studies. J Am Coll Radiol 2017;14:459-66.
3. Bosmans JML, Weyler JJ, De Schepper AM, Parizel PM. The radiology report as seen by radiologists and referring clinicians: results of the COVER and ROVER surveys. Radiology 2011;259:184-95.
4. Branco P, Ayres-Basto M, Portugal P, Ramos I, Seixas D. Brain magnetic resonance imaging: perception and expectations of neurologists, neurosurgeons and psychiatrists. Neuroradiol J 2014;27:261-7.
5. Tappouni R, Sarwani N, Bruno M. What does the "customer" really want? How do clinicians read the radiology report and what are their preferences regarding communication of unexpected findings: a survey of academic center staff physicians. Paper presented at: RSNA 2009 Annual Meeting; November 29 to December 4, 2009; Chicago, IL. Available at: http://archive.rsna.org/2009/8004098.html. Accessed June 24, 2017.
6. Clinger NJ, Hunter TB, Hillman BJ. Radiology reporting: attitudes of referring physicians. Radiology 1988;169:825-6.
7. Lukaszewicz A, Uricchio J, Gerasymchuk G. The art of the radiology report: practical and stylistic guidelines for perfecting the conveyance of imaging findings. Can Assoc Radiol J 2016;67:318-21.
8. Tublin ME, Deible CR, Shrestha RB. The radiology report version 2.0. J Am Coll Radiol 2015;12:217-9.
9. Wallis A, McCoubrie P. The radiology report—are we getting the message across? Clin Radiol 2011;66:1015-22.
10. IMV 2016 CT market summary report. Des Plaines, IL: IMV Medical Information Division; 2016.
11. Guite KM, Hinshaw JL, Ranallo FN, Lindstrom MJ, Lee FT Jr. Ionizing radiation in abdominal CT: unindicated multiphase scans are an important source of medically unnecessary exposure. J Am Coll Radiol 2011;8:756-61.
12. Johnson PT, Mahesh M, Fishman EK. Image Wisely and Choosing Wisely: importance of adult body CT protocol design for patient safety, exam quality, and diagnostic efficacy. J Am Coll Radiol 2015;12:1185-90.
13. Viertel VG, Trotter SA, Babiarz LS, et al. Reporting of critical findings in neuroradiology. AJR Am J Roentgenol 2013;200:1132-7.
14. Babiarz LS, Lewin JS, Yousem DM. Continuous practice quality improvement initiative for communication of critical findings in neuroradiology. Am J Med Qual 2015;30:447-53.