OPINION
The Radiology Report Version 2.0

Mitchell E. Tublin, MD, Christopher R. Deible, MD, PhD, Rasu B. Shrestha, MD

Radiologists have been appropriately concerned about their relevance in an era in which they no longer physically control imaging and in which imaging-based diagnoses, and management decisions based on those diagnoses, are often made quickly by referring subspecialty colleagues. Community and academic radiology groups have attempted to compensate for this paradigm shift by distributing workload around the clock and by incentivizing quick report turnaround times. Integrated voice recognition software has been used to reassert the preeminent role of radiology in image interpretation: studies are transmitted almost instantaneously, and the radiology report now follows within minutes.

Despite this remarkable shift, the radiology report has until recently remained similar to those typed generations ago. Critics within and outside the radiology community have bemoaned a lack of consistent terminology and have advocated the adoption of structured reporting to improve communication. In our own practice, we have noticed a subtle shift in reporting which, although “structured,” is driven not solely by a desire to communicate effectively but also by the need to adapt to turnaround metrics and voice recognition software. A radiologist may choose between time-consuming self-editing of free-text reports and the templating functions emphasized in current voice recognition programs. Turnaround times trump all other considerations, and most radiologists ultimately choose self-preservation: findings are itemized, a short impression is rendered, the report is signed, and it is on to the next case. In our opinion, this is an ominous trend, especially if left as a “one size fits all” model.
Advocates of template reports cite studies showing that end users prefer the clarity of these highly structured documents over standard free text [1,2]. Nonetheless, the literature comparing reporting styles consists primarily of surveys and retrospective report audits. Few studies have evaluated report “quality”: namely, how well radiology reports convey the results of image interpretation and how effectively end users extract that information. In an exhaustive 2010 literature review of the topic, Pool and Goergen [2] identified two randomized studies that examined report quality per se. In a cohort study comparing structured reporting with conventional dictation, Johnson et al [3] demonstrated decreases in accuracy and completeness with template reporting; this gap has since been attributed to the constraining and distracting point-and-click structured reporting software used in the study. A randomized controlled trial by Sistrom and Honeyman-Buck [4] showed that the accuracy and efficiency of information extraction were equivalent for free-text and itemized reports. Although reviewers expressed a subjective preference for structured reports, the authors tellingly cautioned against fundamental changes in how reports are authored and displayed, citing cognitive psychology studies suggesting that concise but unconstrained free text may be the most effective format for complex but shared workflows, and that overly structured data may cause clinicians to lose cognitive focus during data input and review [5-8]. A follow-up randomized study that assessed the influence of radiology report format on reading time and comprehension similarly found no difference between conventional free text and structured text (organized by either clinical significance or organ system). Nonetheless, physicians with differing levels of expertise and from different specialties had different preferences and reading approaches. The authors cautioned that “although format has little impact on time to read or comprehension of details, a one-size-fits-all or standardized report is unlikely to be very popular as there are significant individual differences and styles of reading the reports” [9].

© 2014 American College of Radiology 1546-1440/14/$36.00 http://dx.doi.org/10.1016/j.jacr.2014.04.014

Structured reporting is now standard in some clinical specialties, such as cardiology [10]. These itemized reports have set attributes that make them enticing for referring clinicians: modular section headings; an orderly, consistent listing of observations, often by anatomic area; and a language pattern that adheres to a standard lexicon. Breast imaging has long embraced structured reporting, allowing mammographers to choose from a standard set of options, such as the 6 BI-RADS categories [11], to express findings such as the likelihood of cancer. Breast imaging, however, is a perfect example of a clinical specialty that deals mostly with a limited anatomic area and a restricted, albeit important, set of possible findings and normal screening results. Structured reporting makes further sense in mammography because its adoption aids adherence to the guidelines mandated by the Mammography Quality Standards Act, which is intended primarily to maintain high-quality mammography. The legislation also requires facilities to provide patients with the results of their mammographic examinations in a “lay report” written in language that is easy to understand [12].
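The appeal of such a constrained lexicon is easy to sketch in code. The following minimal Python fragment is purely illustrative (the names BiRads, WORDING, and impression are our own invention, not part of any vendor's reporting software): by coding the six final assessment categories as an enumerated type with fixed wording, a reporting system simply cannot render an impression outside the sanctioned vocabulary.

```python
from enum import Enum

# The six BI-RADS final assessment categories [11]. Category 0
# ("incomplete: need additional imaging evaluation") is reserved for
# screening examinations that require workup and is omitted here.
# All names below are hypothetical, chosen only for this sketch.
class BiRads(Enum):
    NEGATIVE = 1
    BENIGN = 2
    PROBABLY_BENIGN = 3
    SUSPICIOUS = 4
    HIGHLY_SUGGESTIVE = 5
    KNOWN_MALIGNANCY = 6

# Standardized wording bound to each category, so a given assessment
# can be phrased in exactly one way.
WORDING = {
    BiRads.NEGATIVE: "Negative",
    BiRads.BENIGN: "Benign finding",
    BiRads.PROBABLY_BENIGN: "Probably benign finding",
    BiRads.SUSPICIOUS: "Suspicious abnormality",
    BiRads.HIGHLY_SUGGESTIVE: "Highly suggestive of malignancy",
    BiRads.KNOWN_MALIGNANCY: "Known biopsy-proven malignancy",
}

def impression(category: BiRads) -> str:
    """Render a final assessment in the constrained lexicon."""
    return f"BI-RADS {category.value}: {WORDING[category]}"
```

For instance, impression(BiRads.SUSPICIOUS) returns "BI-RADS 4: Suspicious abnormality". The point is not the code but the constraint: the type system enforces the lexicon, which is exactly the guarantee a narrow, well-bounded domain such as screening mammography can afford.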
However, a one-size-fits-all approach does not work for radiology reporting; in particular, interpreting cross-sectional studies such as CT and MRI calls for a freer narrative flow in reporting what are often complex cases with a mix of anatomic and clinical correlations. In this era of value-based imaging supplemented by clinical pathways, these reports should be powerful, clinically relevant, actionable narratives that highlight the interpretive skills of imaging consultants. There is a better “middle ground,” one aided by technology whose time has come: a subspecialty consultative reporting system that allows free-flowing narratives, aided by speech recognition with real-time natural language processing (NLP), to bring out the best of what structured reporting has been trying to achieve.

NLP is an area of human-computer interaction concerned with extracting and structuring natural language into formal computer representations. NLP allows computational linguistics to influence the most valuable form of clinical documentation: the clinical narrative. Significant progress has been made in NLP, and we are not far from being able to envision a more flexible and intelligent radiology report aided by it. Combining ongoing advances in NLP with medical artificial intelligence and speech recognition technology will let radiologists leverage the best of both worlds: comprehensive reporting through narrative dictation, with the capture of key clinical data in a more structured form automated in the background. The ultimate goal is a conversational reporting paradigm that is more natural in flow and more intelligent in format. This better middle ground could result in a report that captures the spirit of the intended communication
while leveraging semantic content management and lexicons, yielding a more valuable report with pertinent images hyperlinked to key findings. NLP components of the reporting system would ensure that reporting standards are met, delivering the advantages of structured reporting without altering an individual radiologist’s approach to interpreting an examination. Certainly, for the right clinical specialties, relevant and contextualized templates would still be used as freely as needed.

The hybrid free-dictation approach also affords radiologists the freedom to keep their focus on the images and related clinical data, rather than being distracted by fixed areas of a reporting template. It also allows a more humanized report, one that permits more articulate communication of key findings and may thus be more clinically relevant. NLP-enabled reports may ultimately be customized for stakeholders (consultant subspecialists, primary care providers, patients, etc) with differing skill sets and priorities. Automated incorporation of relevant data, such as technique, relevant history, and references appropriate for different consumers, would truly bring out the best in NLP algorithms and further aid clinical communication. Structured components could complement the report in more standard tasks, such as tumor reporting, staging, and classification.

NLP thus aids our quest for a more logical hybrid approach to reporting, helping achieve the key goals of clearer communication, adherence to quality reporting standards, and a visually rich, contextual presentation of information. Technologically, we are at the point at which speech recognition need not merely support an often distracting “typing with your tongue” workflow but can instead set the framework of an “intelligent
conversational reporting” paradigm that allows radiologists to actually focus on the images and clinical findings. The hybrid approach still allows relevant, contextualized templates where warranted. The goals should be to improve consistency (by incorporating accepted lexicons), clarity (by prioritizing relevant findings on the basis of the clinical question raised and the scenarios being explored), and completeness (by flagging reports in which clinically relevant or billable items are incomplete or lack specificity).

In an insightful paper that laid out the realities of the past, present, and future of the radiology report, Reiner et al [13] concluded that “the time seems ripe for a new reporting solution.” We agree and argue that there is a middle-ground approach, one that allows us to speak consistently, accurately, and completely as clinical consultants, within the clinical pathways that define best practices for individual clinical subspecialties. The new normal for the radiology report, version 2.0, should be an inherently flexible, NLP-based framework that allows automation where it is appropriate and efficient, provides intelligent reporting that caters to end users, and highlights the human aspects of the conversation we as radiologists need to have with ordering physicians and our patients.

REFERENCES

1. Schwartz LH, Panicek DM, Berk AR, Li Y, Hricak H. Improving communication of diagnostic radiology findings through structured reporting. Radiology 2011;260:174-81.

2. Pool F, Goergen S. Quality of the written radiology report: a review of the literature. J Am Coll Radiol 2010;7:634-43.

3. Johnson AJ, Chen MY, Swan JS, Applegate KE, Littenberg B. Cohort study of structured reporting compared with conventional dictation. Radiology 2009;251:74-80.

4. Sistrom CL, Honeyman-Buck J. Free text versus structured format: information
transfer efficiency of radiology reports. AJR Am J Roentgenol 2005;185:804-12.

5. Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc 2004;11:104-12.

6. Garrod S. How groups co-ordinate their concepts and terminology: implications for medical information. Methods Inf Med 1998;37:471-6.

7. Patel VL, Kaufman DR. Medical informatics and the science of cognition. J Am Med Inform Assoc 1998;5:493-502.

8. Patel VL, Kushniruk AW. Understanding, navigating and communicating knowledge: issues and challenges. Methods Inf Med 1998;37:460-70.

9. Krupinski EA, Hall ET, Jaw S, Reiner B, Siegel E. Influence of radiology report format on reading time and comprehension. J Digit Imaging 2012;25:63-9.

10. Douglas PS, Hendel RC, Cummings JE, et al. ACCF/ACR/AHA/ASE/ASNC/HRS/NASCI/RSNA/SAIP/SCAI/SCCT/SCMR 2008 health policy statement on structured reporting in cardiovascular imaging. Circulation 2009;119:187-200.

11. American College of Radiology. Breast Imaging Reporting and Data System (BI-RADS). Reston, VA: American College of Radiology; 2006.

12. US Food and Drug Administration. Mammography reports—are you doing right by your patients? Available at: http://www.fda.gov/Radiation-EmittingProducts/MammographyQualityStandardsActandProgram/FacilityScorecard/ucm113812.htm. Accessed August 17, 2013.

13. Reiner BI, Knight N, Siegel EL. Radiology reporting, past, present, and future: the radiologist’s perspective. J Am Coll Radiol 2007;4:313-9.
Mitchell E. Tublin, MD, Christopher R. Deible, MD, PhD, and Rasu B. Shrestha, MD, are from the Department of Radiology, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania. Address correspondence to Mitchell E. Tublin, MD, University of Pittsburgh School of Medicine, Department of Radiology, 200 Lothrop Street, Pittsburgh, PA 15213-2582; e-mail: [email protected].