DERMATOPATHOLOGY
Validation of diagnostic accuracy with whole-slide imaging compared with glass slide review in dermatopathology

Kabeer K. Shah, DO,a,b Julia S. Lehman, MD,a,c Lawrence E. Gibson, MD,a,c Christine M. Lohse, MS,d Nneka I. Comfere, MD,a,c and Carilyn N. Wieland, MDa,c
Rochester, Minnesota

Background: Teledermatopathology has evolved from static images to whole slide imaging (WSI), which allows for remote viewing and manipulation of tissue sections. Previous studies of WSI in teledermatopathology predated College of American Pathologists (CAP) telepathology validation guidelines.

Objective: We conducted a comprehensive retrospective WSI validation study of routine dermatopathology cases, adhering to CAP guidelines.

Methods: In all, 181 consecutive cases arranged into 3 categories (inflammatory, melanocytic, nonmelanocytic proliferations) were reviewed by 3 board-certified dermatopathologists via traditional microscopy (TM) and WSI. Intraobserver (TM vs WSI), TM intraobserver and interobserver (TM vs TM), and WSI interobserver (WSI vs WSI) concordance was interpreted using a 3-tier system.

Results: TM versus WSI intraobserver concordance (86.9%; 95% confidence interval [CI] 83.7-89.6) did not differ from TM versus TM intraobserver concordance (90.3%; 95% CI 86.7-93.1) or interobserver concordance (WSI: 89.9%; 95% CI 87.0-92.2, and TM: 89.5%; 95% CI 86.5-91.9). Melanocytic proliferations had the lowest TM versus WSI intraobserver concordance (75.6%; 95% CI 68.5-81.5), whereas inflammatory lesions had the highest TM versus WSI intraobserver concordance (96.1%; 95% CI 91.8-98.3). Nonmelanocytic proliferations had an intraobserver concordance of 89.1% (95% CI 83.4-93.0).

Limitations: Efficiency and other logistical WSI parameters were not evaluated.

Conclusion: Intraobserver and interobserver diagnostic concordance between WSI and TM was equivalent. Therefore, WSI appears to be a reliable diagnostic modality for dermatopathology. (J Am Acad Dermatol. http://dx.doi.org/10.1016/j.jaad.2016.08.024.)

Key words: concordance; dermatopathology; inflammatory; interobserver; intraobserver variability; melanocytic; nonmelanocytic; validation; variability; whole slide imaging.
From the Department of Laboratory Medicine and Pathology,a Mayo School of Graduate Medical Education,b Division of Dermatopathology and Cutaneous Immunopathology, Department of Dermatology,c and Division of Biomedical Statistics and Informatics,d Mayo Clinic.
Funding sources: None.
Conflicts of interest: None declared.
Presented at the 19th Joint Meeting of the International Society of Dermatopathology in Washington, DC, on March 2-3, 2016.
Accepted for publication August 11, 2016.
Reprint requests: Carilyn N. Wieland, MD, Department of Dermatology, Mayo Clinic, 200 First St SW, Rochester, MN 55905. E-mail: [email protected].
Published online October 11, 2016.
0190-9622/$36.00
© 2016 by the American Academy of Dermatology, Inc.
http://dx.doi.org/10.1016/j.jaad.2016.08.024

Abbreviations used:
CAP: College of American Pathologists
CI: confidence interval
IHC: immunohistochemical
TM: traditional microscopy
WSI: whole slide imaging

Technology has catalyzed the evolution of telepathology from the use of static (store-and-forward) images to real-time video streaming and, more recently, whole slide imaging (WSI).1,2 Unlike static photographs and real-time imaging, WSI scans entire slides at various
magnifications, allowing the observer to select and zoom to areas of interest via a digital interface. The field of dermatopathology has a novel opportunity to use telepathology, so-called "teledermatopathology," because of its frequent need for subspecialist expert consultation, small specimen sizes, low slide counts, and overall high case volume.3 WSI has been evaluated in dermatopathology as a tool for routine interpretation, remote consultation, research, education, and collaboration.3-19 Studies that used WSI reported a 75% to 93.2% concordance with traditional microscopy (TM).3,10-12,14-16,20 Leinweber and colleagues,10 evaluating 560 melanocytic lesions by WSI and TM, found a 93.2% concordance when using a binary benign or malignant scoring system.10 In a smaller but diverse case series by Al-Janabi et al,16 the authors found a lower overall concordance, 73% to 96% per reviewer. These early studies in the field of teledermatopathology illustrate proof of concept, but may not reflect a clinically relevant method of practice that also accounts for interobserver and intraobserver discordance with TM alone, making interpretation difficult.

The College of American Pathologists (CAP) recently published standardized guidelines for validating telepathology systems, with the goals of reducing recall bias, reducing diagnostic errors, and creating awareness among end users of these systems in clinical practice.2 Of the multiple studies on teledermatopathology published to date,3-29 all predated the CAP guidelines for telepathology validation. The primary objective of this study is to validate the use of WSI in the primary diagnosis of routine dermatoses encountered in daily dermatopathology practice, using the recently published CAP consensus guidelines.

CAPSULE SUMMARY
Whole slide imaging is a digital imaging technology that allows for remote viewing and interpretation of slides.
This study demonstrated high intraobserver and interobserver concordance in the interpretation of whole slide imaging and traditional microscopy for dermatopathology cases.
Although larger, confirmatory studies are needed, whole slide imaging may offer diagnostic equivalence to traditional microscopy in dermatopathology.

METHODS
Study design
The study was approved by the institutional review board as a minimal risk protocol (IRB12-008844) and followed the recommendations outlined by CAP (Table I).2 Three board-certified practicing dermatopathologists participated as reviewers for this study (L. E. G., J. S. L., C. N. W.) and retrospectively reviewed cases using TM and WSI modalities. Reviewers were blinded to the original diagnosis. The primary outcome measure was the degree of diagnostic concordance between modalities (TM vs WSI). Secondary outcome measures included intraobserver and interobserver concordance of TM versus TM, and interobserver concordance of WSI versus WSI. Fig 1 illustrates the study design. The rationale for intraobserver agreement (TM vs TM) was to document the variability inherent in our current practice and to serve as a comparison with TM versus WSI. Interobserver agreement (TM vs TM and WSI vs WSI) served to highlight any potential error or bias introduced by the diagnostic modality. All participants underwent training before proceeding with digital interpretation. An 8-week washout period was observed between modalities. The washout was increased from the CAP minimum recommendation of 2 weeks to reduce recall bias.2 Each participant independently reviewed cases and recorded his or her diagnoses. A secure digital database (Access, Microsoft, Redmond, WA) was designed to mimic a laboratory information system. Each case reproduced the patient demographics and clinical information that would be present on the pathology requisition form; however, no clinical images were made available because of variable availability. Intraobserver (TM vs WSI and TM vs TM) and interobserver (TM vs TM and WSI vs WSI) diagnoses were compared, and the consensus diagnosis was derived by the majority TM diagnostic opinion.

Case selection
A total of 181 cases were included for this study. To mirror clinical practice, a consecutive series of completed dermatopathology cases were examined and sorted evenly into 3 categories: inflammatory, melanocytic, and nonmelanocytic proliferations. The inflammatory category included examples such as psoriasiform and interface dermatitis diagnoses. The melanocytic category included various types of nevi, atypical nevi, melanoma in situ, and melanoma. Nonmelanocytic cases included benign and malignant keratinocytic lesions, such as seborrheic keratosis and squamous cell carcinoma.
Table I. College of American Pathologists whole slide imaging validation guideline statements for diagnostic purposes in pathology

Guideline statement — Grade of evidence
1. All pathology laboratories implementing WSI technology for clinical diagnostic purposes should carry out their own validation studies. — Expert consensus opinion
2. Validation should be appropriate for and applicable to the intended clinical use and clinical setting of the application in which WSI will be employed. Validation of WSI systems should involve specimen preparation types relevant to intended use (eg, formalin-fixed paraffin-embedded tissue, frozen tissue, immunohistochemical stains, cytology slides, hematology blood smears). Note: If a new intended use for WSI is contemplated, and this new use differs materially from the previously validated use, a separate validation for the new use should be performed. — Recommendation Grade A
3. The validation study should closely emulate the real-world clinical environment in which the technology will be used. — Recommendation Grade A
4. The validation study should encompass the entire WSI system. Note: It is not necessary to validate separately each individual component (eg, computer hardware, monitor, network, scanner) of the system nor the individual steps of the digital imaging process. — Recommendation Grade B
5. Revalidation is required whenever a significant change is made to any component of the WSI system. — Expert consensus opinion
6. A pathologist(s) adequately trained to use the WSI system must be involved in the validation process. — Recommendation Grade B
7. The validation process should include a sample set of at least 60 cases for 1 application (eg, H&E-stained sections of fixed tissue, frozen sections, cytology, hematology) that reflects the spectrum and complexity of specimen types and diagnoses likely to be encountered during routine practice. Note: The validation process should include another 20 cases for each additional application (eg, immunohistochemistry, special stains). — Recommendation Grade A
8. The validation study should establish diagnostic concordance between digital and glass slides for the same observer (ie, intraobserver variability). — Suggestion Grade A
9. Digital and glass slides can be evaluated in random or nonrandom order (as to which is examined first and second) during the validation process. — Recommendation Grade A
10. A washout period of at least 2 weeks should occur between viewing digital and glass slides. — Recommendation Grade B
11. The validation process should confirm that all of the material present on a glass slide to be scanned is included in the digital image. — Expert consensus opinion
12. Documentation should be maintained recording the method, measurements, and final approval of validation for the WSI system to be used in the clinical laboratory. — Expert consensus opinion

Guideline statements for validating WSI systems for diagnostic purposes with recommendation grades, as recommended by the College of American Pathologists consensus publication. H&E, Hematoxylin and eosin; WSI, whole slide imaging. Reprinted from Pantanowitz et al2 with permission from Archives of Pathology & Laboratory Medicine. Copyright 2013 College of American Pathologists.
Cases were excluded if they were received for consultation, were re-excisions or duplicate diagnoses from a single patient, or did not demonstrate diagnostic histopathologic features. All cases were routinely processed and stained with hematoxylin-eosin. For each of the 3 categories, the first 60 consecutive cases were selected with the following caveats. The melanocytic category required at least 10 melanoma diagnoses. Nonmelanocytic cases were limited to 20 diagnoses each of squamous cell carcinoma and basal cell carcinoma. In addition, 1 histologic case without diagnostic abnormality was included in the nonmelanocytic category. Previously ordered immunohistochemical (IHC) stains were excluded, and a single representative diagnostic slide was selected by an anatomic pathology resident physician (K. K. S.). Slides were de-identified and randomized for independent digital and glass interpretation. IHC has been included in previous studies; however, this was done either prospectively or algorithmically when cases were retrospectively reviewed.30 In addition, per the CAP guidelines, IHC evaluation required additional, independent validation, which the authors believed could be accomplished in a separate study.

Whole slide imaging
All cases were scanned at ×40 magnification with a slide scanner (Aperio XT ImageScope, Leica Biosystems, Buffalo Grove, IL) and loaded onto the accompanying proprietary server (Spectrum E-Slide Manager, v.11.2.0.780, Leica Biosystems).
Images were accessed through a local server via an individual login. Reviewers were asked to give qualitative feedback regarding slide quality, network access, and review time throughout the WSI evaluation.

Glass interpretation
The selected cases were randomized and distributed to the study dermatopathologists, and diagnoses were recorded digitally to simulate routine interpretation.

Case agreement
TM and WSI case diagnoses were scored by a single blinded dermatopathologist (C. N. W.) in a 3-tier system as compared with the case consensus diagnosis. A score of "0" corresponded to complete agreement, "1" to minor disagreement, and "2" to major disagreement. Consensus was determined by majority agreement among all reviewers. Diagnostic agreement in inflammatory cases included pathologic differential diagnoses that overlapped with the clinical presentation and for which the correct diagnosis could be reached by the requesting clinicians. Minor disagreement was recorded in scenarios without clinical impact. Conversely, major disagreement was recorded when clinical impact may have occurred. Melanocytic and nonmelanocytic case disagreement was characterized by degrees of diagnostic separation (ie, minor disagreement: actinic keratosis vs squamous cell carcinoma in situ), whereas nevus with moderate atypia versus melanoma was considered a major disagreement. It is important to note that it is the practice at our institution to grade atypical nevi (ie, moderate, severe). Importantly, no diagnoses were altered as a result of consensus review performed for this study.

Statistical analysis
Concordance was calculated for each case subtype and methodology as the number of cases in agreement divided by the total number of cases evaluated. To compare interpretation methodologies, 95% confidence intervals (CI) were derived using the score method incorporating continuity correction.31
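For readers who wish to reproduce the interval estimates, the calculation reduces to a simple proportion (agreements divided by interpretations; 181 cases read by 3 observers gives 543 interpretations overall) plus the score interval with continuity correction described by Newcombe.31 The sketch below is purely illustrative; Python and the helper name wilson_ci_cc are editorial assumptions rather than part of the study methods. Applied to the overall counts in Table II (472 of 543), it reproduces the reported 86.9% (95% CI 83.7-89.6).

    import math

    def wilson_ci_cc(successes, n, z=1.96):
        """Score (Wilson) 95% CI for a single proportion with continuity
        correction (Newcombe 1998, method 4; reference 31)."""
        p = successes / n
        denom = 2 * (n + z ** 2)
        lower = (2 * n * p + z ** 2 - 1
                 - z * math.sqrt(z ** 2 - 2 - 1 / n + 4 * p * (n * (1 - p) + 1))) / denom
        upper = (2 * n * p + z ** 2 + 1
                 + z * math.sqrt(z ** 2 + 2 - 1 / n + 4 * p * (n * (1 - p) - 1))) / denom
        return max(0.0, lower), min(1.0, upper)

    # Overall TM vs WSI intraobserver agreement (Table II):
    # 472 concordant interpretations of 543 (181 cases x 3 observers).
    agreements, interpretations = 472, 543
    concordance = agreements / interpretations             # 0.869 -> 86.9%
    low, high = wilson_ci_cc(agreements, interpretations)  # ~0.837, ~0.896
    print(f"{concordance:.1%} (95% CI {low:.1%}-{high:.1%})")

The same helper applied to each subtype's counts in Table II yields the per-category intervals reported in the Results.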
RESULTS
Fig 1. Relationship map of study design between observers (left) and imaging modality (right). Arrows represent interpretations (inset). TM, Traditional microscopy; WSI, whole slide imaging.

A total of 181 cases were reviewed by 3 dermatopathologists independently via TM and WSI. The overall intraobserver concordance between TM and WSI for all cases was 86.9% (95% CI 83.7-89.6) among the 3 observers (Table II). Inflammatory diagnoses (N = 60) had the highest overall concordance at 96.1% (95% CI 91.8-98.3), with an overall minor disagreement rate of 3.3% (N = 6) and a major disagreement rate of 0.6% (N = 1). The single major disagreement was an inflamed psoriasiform dermatitis that was digitally identified as verrucal keratosis. Melanocytic diagnoses (N = 60) yielded the lowest overall concordance at 75.6% (95% CI 68.5-81.5), with 21.6% (N = 39) minor disagreement and 2.8% (N = 5) major disagreement. Nonmelanocytic diagnoses (N = 61) had an overall concordance of 89.1% (95% CI 83.4-93.0), with 6.6% (N = 12) minor disagreement and 4.3% (N = 8) major disagreement. All of the nonmelanocytic major disagreements were related to degree of atypia in keratinocytic lesions. The major disagreements recorded within each category are shown in Table III.

The intraobserver concordance of TM versus TM was 90.3% (95% CI 86.7-93.1), which is statistically similar to the intraobserver concordance of TM versus WSI for all disease categories, as shown in Fig 2. Similar to TM versus WSI, inflammatory intraobserver TM cases had the highest average concordance at 98.3% (95% CI 93.5-99.7), with 1.7% (N = 2) minor disagreement and no major disagreements. Melanocytic cases had the lowest concordance of 80.0% (95% CI 71.5-86.5), with 18.3% (N = 22) minor disagreement and 1.7% (N = 2) major disagreement. Nonmelanocytic cases had a concordance of 92.6% (95% CI 86.1-96.4), with 5.0% (N = 6) minor disagreement and 2.4% (N = 3) major disagreement.
Table II. Traditional microscopy vs whole slide imaging intraobserver concordance by case subtype

Case subtype        Total agreements   Total interpretations   Concordance %   95% Confidence interval
Inflammatory        173                180                     96.1            91.8-98.3
Melanocytic         136                180                     75.6            68.5-81.5
Nonmelanocytic      163                183                     89.1            83.4-93.0
Overall TM vs WSI   472                543                     86.9            83.7-89.6

The cases per subtype are listed with the total agreements, total interpretations, concordance, and 95% confidence intervals rendered by the 3 study pathologists. TM, Traditional microscopy; WSI, whole slide imaging.
Table III. Major disagreement cases and diagnoses: traditional microscopy versus whole slide imaging intraobserver agreement

Case no.: TM diagnosis | WSI diagnosis | Consensus diagnosis

Inflammatory
154: Psoriasiform dermatitis and subcorneal pustule | Inflamed VK* | Psoriasiform dermatitis

Melanocytic
12: Melanoma | CN with moderate atypia* | Melanoma
14: Severely atypical dermal MP with LCN* | CN | CN
88: Melanoma* | LCN with moderate atypia | CN with moderate atypia
118: LCN with moderate atypia | MMIS with nevus* | CN with moderate atypia
162: CN | CN with severe atypia* | CN

Nonmelanocytic
35: Pigmented AK* | SCC (focally superficially invasive) arising in AK | SCC
46: Extensive scattered dyskeratotic keratinocytes with epidermal necrosis* | SCC | SCC
75: SCCIS with follicular extension* | ISK | ISK
81: SCC | ISK* | SCC
99: Superficially invasive SCC with lichenoid inflammation | ISK with squamous atypia, favor reactive* | SCC

AK, Actinic keratosis; CN, compound nevus; ISK, inflamed seborrheic keratosis; LCN, lentiginous compound nevus; MMIS, malignant melanoma in situ; MP, melanocytic proliferation; SCC, squamous cell carcinoma; SCCIS, squamous cell carcinoma in situ; TM, traditional microscopy; VK, verrucal keratosis; WSI, whole slide imaging.
*Major disagreement in each case.
Interobserver concordance was also evaluated and found to be similar between TM and WSI methodologies (TM vs TM: 89.5%; 95% CI 86.5-91.9, and WSI vs WSI: 89.9%; 95% CI 87.0-92.2); Fig 3 illustrates the comparison. Each case subtype was similarly correlated between TM and WSI: inflammatory cases were 95.5% and 96.1% concordant, melanocytic cases 82.8% and 80.0%, and nonmelanocytic cases 90.2% and 93.4%, respectively. Table IV compares the concordance and respective 95% CI. Qualitative feedback was received regarding longer diagnostic interpretation time for WSI; however, this was not quantified. No other feedback was received regarding slide quality or networking difficulties.
DISCUSSION
WSI offers many opportunities for routine dermatopathology interpretation, consultation, frozen section interpretation, and education. Limitations of WSI include implementation cost, scanning and evaluation time, image resolution, and data management.32 However, recent advancements in technology and a responsibility to reduce health care costs have propelled interest in the potential value and accessibility of telepathology.

Our intraobserver concordance (TM vs WSI) of 86.9% is supported by previously published series. Mooney et al15 reviewed 20 cases among 10 dermatopathologists and found 85% agreement, with no significant difference between methodologies. Al-Janabi et al16 found a 94% concordance among 6 dermatopathologists, with no major discordance in a series of 100 mixed cases. Unlike our series, however, no melanomas were evaluated, and the case distribution was predominantly nonmelanocytic (58% of cases). In addition, no TM intraobserver or interobserver concordance was recorded.
Fig 2. Comparison of intraobserver concordance; traditional microscopy (TM) versus whole slide imaging (WSI) and TM versus TM. Forest plot shows TM versus WSI (blue) and TM versus TM (red) compared across each of the case subtypes. The center marker corresponds with the calculated concordance with respective high and low 95% confidence intervals (CI). Overlapping CI show no statistical difference between modalities.

Fig 3. Comparison of interobserver concordance; traditional microscopy (TM) versus TM and whole slide imaging (WSI) versus WSI. Forest plot shows TM versus TM (blue) and WSI versus WSI (red) compared across each of the case subtypes. The center marker corresponds with the calculated concordance with respective high and low 95% confidence intervals (CI). Overlapping CI show no statistical difference between modalities.
A few published series have specifically evaluated multiple case subcategories. In our series, inflammatory cases had the highest intraobserver agreement, 96.1%, with only 1 major discordance. This discordance was seen with a psoriasiform dermatitis diagnosed as inflamed verrucal keratosis by 1 observer. With high-quality clinical information, photographs, and direct communication we believe the overall concordance can be further improved. A study by Massone et al11 evaluated inflammatory lesions and reported a 75% concordance with clinical information and 65% without. Leinweber et al10 evaluated melanocytic lesions finding a 93.2%
concordance, with 92% specificity when evaluating melanoma. That study used a binary (benign or malignant) reporting system, an approach that is impractical for routine case interpretation. We found 75.6% concordance, with 2.8% (N = 5) major disagreements per reviewer. Using a binary system and comparing only major disagreements, our overall intraobserver concordance between TM and WSI was 97.4% (95% CI 95.6-98.5; 529 of 543 interpretations), with a corresponding TM versus TM intraobserver concordance of 98.6% (95% CI 96.6-99.5).
Table IV. Comparison of interobserver concordance by case subtype; traditional microscopy versus traditional microscopy and whole slide imaging versus whole slide imaging

Case subtype     Concordance TM, % (95% CI)   Concordance WSI, % (95% CI)   Total interpretations
Inflammatory     95.5 (91.1-97.9)             96.1 (91.8-98.3)              180
Melanocytic      82.8 (76.3-87.8)             80.0 (73.3-85.4)              180
Nonmelanocytic   90.2 (84.7-93.9)             93.4 (88.6-96.4)              183
Overall          89.5 (86.5-91.9)             89.9 (87.0-92.2)              543

The cases per subtype are listed with the concordance, 95% CI, and total interpretations rendered by the 3 study pathologists. CI, Confidence intervals; TM, traditional microscopy; WSI, whole slide imaging.
Fig 4. Nonmelanocytic discordant case 35. Squamous cell carcinoma, a highly discordant case in both traditional microscopy and whole slide imaging (WSI) modalities. A, WSI inset that allows users to select regions to evaluate. B, Actinic keratosis morphology. C, Inset; different piece of tissue, consensus diagnosis squamous cell carcinoma. (A-C, Hematoxylin-eosin stain; original magnifications: A, ×1; B, ×100; C, ×200.)
Of the melanocytic and nonmelanocytic major disagreements recorded, 60% (6 of 10) of the WSI diagnoses were comparatively less malignant than the corresponding TM diagnoses. For example, 1 reviewer's glass diagnosis was invasive melanoma whereas the digital diagnosis was nevus with moderate atypia. This may be a result of difficulty evaluating the cytology of melanocytes with WSI.

Nielsen et al14 evaluated 90 melanocytic and nonmelanocytic lesions, with concordance in 89.2% of cases. Interestingly, actinic keratoses were a common cause of discordance. Similarly, our nonmelanocytic intraobserver concordance was 89.1%, and of the 6.6% (N = 12) minor and 4.3% (N = 8) major disagreements, actinic keratoses were responsible for 75% (N = 8) and 25% (N = 2) of the minor and major disagreements, respectively. Cases 35 (Fig 4) and 75 caused discordance, as multiple reviewers were unable to reproduce their TM diagnosis, likely because of variable atypia within the specimens. Interestingly, there was also high TM interobserver variability within these 2 cases. Rereview of both cases confirmed the study consensus diagnoses of squamous cell carcinoma (case 35) and inflamed seborrheic keratosis (case 75).
In addition, we found that TM intraobserver concordance (TM vs TM) was 90.3%, consistent with previously reported studies.8,10,14,15 Melanocytic cases had 2 major disagreements, compared with 3 for nonmelanocytic cases. Inflammatory cases had no major disagreements with TM. Finally, TM interobserver (89.5%, 95% CI 86.5-91.9) and WSI interobserver (89.9%, 95% CI 87.0-92.2) concordance were statistically similar, suggesting the methodology does not impact the observers' interpretation of a case.

Recently, consensus statements have been issued to standardize validation methodologies. The consensus opinion suggests that all laboratories using WSI for clinical purposes should perform their own validation study with entirely scanned slides, while accurately documenting the process and approval of validation (Table I). WSI studies using the newly released guidelines have been published in other subspecialties, revealing statistical concordance ranging from 86% to 96%.2,30 To our knowledge, we present the first study adherent to the CAP guidelines.
Limitations of this study and its application to clinical practice include the exclusion of IHC stains and additional tissue levels, which could have altered agreement. Per the CAP guidelines, we sought to validate only hematoxylin-eosin staining and to use a second validation for IHC. In addition, the lack of clinical photographs or communication with the referring physician inhibits complete interpretation of the case. A more robust electronic health record would typically be available if using WSI for clinical use. Efficiencies related to scanning and interpretation between methods were not evaluated. The cases were scanned at ×40, which takes longer than scanning at lower magnification (ie, ×20). Scanning at lower magnifications would potentially increase efficiency but would require revalidation. Qualitatively, some reviewers found the digital review to be delayed and cumbersome with the use of a traditional mouse. Velez et al12 evaluated 15 cases with a primary aim of reducing time to diagnosis via various digital interfaces. That study found that TM evaluation time until diagnosis was 34 seconds, whereas evaluation with their developed WSI interface was 37 seconds per slide. A possible solution could be the use of a touch pad or touch screen for a more natural, gesture-based evaluation of digital images.20 Consultation cases may be an avenue of further investigation, although many legal and licensing issues remain a problem in most states for primary diagnosis.33 The Food and Drug Administration released technical guidelines for WSI devices, recognizing the devices as class II (moderate risk), a significant change from the initially proposed class III (high risk).34 Despite these initial steps, no professional reimbursement currently exists for WSI diagnoses.35

The strengths of this study include being, to our knowledge, the first to use a standardized validation methodology in teledermatopathology, a diverse case mix compared with previously published series, and complete evaluation of intraobserver and interobserver variability.

Conclusion
WSI has strong overall diagnostic concordance across the spectrum of common entities in dermatology. With an informative clinical impression, concordance is especially high for inflammatory processes. Melanocytic and nonmelanocytic cases require care when evaluating melanoma and precursor keratinocytic lesions, but the diagnostic variability with WSI is similar to intraobserver variability with traditional glass slides. Interobserver evaluation of TM and WSI reveals no bias introduced by WSI. WSI can be reliably used as a diagnostic
modality in teledermatopathology. Future studies could further investigate user-technology and feasibility limitations.

REFERENCES
1. Weinstein RS, Bhattacharyya AK, Graham AR, Davis JR. Telepathology: a ten-year progress report. Hum Pathol. 1997;28:1-7.
2. Pantanowitz L, Sinard JH, Henricks WH, et al. Validating whole slide imaging for diagnostic purposes in pathology: guideline from the College of American Pathologists Pathology and Laboratory Quality Center. Arch Pathol Lab Med. 2013;137:1710-1722.
3. Al Habeeb A, Evans A, Ghazarian D. Virtual microscopy using whole-slide imaging as an enabler for teledermatopathology: a paired consultant validation study. J Pathol Inform. 2012;3:2.
4. Berman B, Elgart GW, Burdick AE. Dermatopathology via a still-image telemedicine system: diagnostic concordance with direct microscopy. Telemed J. 1997;3:27-32.
5. Della Mea V, Puglisi F, Forti S, et al. Expert pathology consultation through the Internet: melanoma versus benign melanocytic tumors. J Telemed Telecare. 1997;3:17-19.
6. Weinstein LJ, Epstein JI, Edlow D, Westra WH. Static image analysis of skin specimens: the application of telepathology to frozen section evaluation. Hum Pathol. 1997;28:30-35.
7. Okada DH, Binder SW, Felten CL, Strauss JS, Marchevsky AM. "Virtual microscopy" and the Internet as telepathology consultation tools: diagnostic accuracy in evaluating melanocytic skin lesions. Am J Dermatopathol. 1999;21:525-531.
8. Piccolo D, Soyer HP, Burgdorf W, et al. Concordance between telepathologic diagnosis and conventional histopathologic diagnosis: a multiobserver store-and-forward study on 20 skin specimens. Arch Dermatol. 2002;138:53-58.
9. Morgan MB, Tannenbaum M, Smoller BR. Telepathology in the diagnosis of routine dermatopathologic entities. Arch Dermatol. 2003;139:637-640.
10. Leinweber B, Massone C, Kodama K, et al. Teledermatopathology: a controlled study about diagnostic validity and technical requirements for digital transmission. Am J Dermatopathol. 2006;28:413-416.
11. Massone C, Soyer HP, Lozzi GP, et al. Feasibility and diagnostic agreement in teledermatopathology using a virtual slide system. Hum Pathol. 2007;38:546-554.
12. Velez N, Jukic D, Ho J. Evaluation of 2 whole-slide imaging applications in dermatopathology. Hum Pathol. 2008;39:1341-1349.
13. Koch LH, Lampros JN, Delong LK, Chen SC, Woosley JT, Hood AF. Randomized comparison of virtual microscopy and traditional glass microscopy in diagnostic accuracy among dermatology and pathology residents. Hum Pathol. 2009;40:662-667.
14. Nielsen PS, Lindebjerg J, Rasmussen J, Starklint H, Waldstrom M, Nielsen B. Virtual microscopy: an evaluation of its validity and diagnostic performance in routine histologic diagnosis of skin tumors. Hum Pathol. 2010;41:1770-1776.
15. Mooney E, Hood AF, Lampros J, Kempf W, Jemec GB. Comparative diagnostic accuracy in virtual dermatopathology. Skin Res Technol. 2011;17:251-255.
16. Al-Janabi S, Huisman A, Vink A, et al. Whole slide images for primary diagnostics in dermatopathology: a feasibility study. J Clin Pathol. 2012;65:152-158.
17. Gimbel DC, Sohani AR, Prasad Busarla SV, et al. A static-image telepathology system for dermatopathology consultation in East Africa: the Massachusetts General Hospital experience. J Am Acad Dermatol. 2012;67:997-1007.
18. Mooney E, Kempf W, Jemec GB, Koch L, Hood A. Diagnostic accuracy in virtual dermatopathology. J Cutan Pathol. 2012;39:758-761.
19. Speiser JJ, Hughes I, Mehta V, Wojcik EM, Hutchens KA. Mobile teledermatopathology: using a tablet PC as a novel and cost-efficient method to remotely diagnose dermatopathology cases. Am J Dermatopathol. 2014;36:54-57.
20. Lehman JS, Gibson LE. Smart teledermatopathology: a feasibility study of novel, high-value, portable, widely accessible and intuitive telepathology methods using handheld electronic devices. J Cutan Pathol. 2013;40:513-518.
21. Dunn BE, Almagro UA, Choi H, et al. Dynamic-robotic telepathology: Department of Veterans Affairs feasibility study. Hum Pathol. 1997;28:8-12.
22. Halliday BE, Bhattacharyya AK, Graham AR, et al. Diagnostic accuracy of an international static-imaging telepathology consultation service. Hum Pathol. 1997;28:17-21.
23. Gilbertson JR, Ho J, Anthony L, Jukic DM, Yagi Y, Parwani AV. Primary histologic diagnosis using automated whole slide imaging: a validation study. BMC Clin Pathol. 2006;6:4.
24. Wilbur DC, Madi K, Colvin RB, et al. Whole-slide imaging digital pathology as a platform for teleconsultation: a pilot study using paired subspecialist correlations. Arch Pathol Lab Med. 2009;133:1949-1953.
25. Zembowicz A, Ahmad A, Lyle SR. A comprehensive analysis of a web-based dermatopathology second opinion consultation practice. Arch Pathol Lab Med. 2011;135:379-383.
26. Campbell WS, Lele SM, West WW, Lazenby AJ, Smith LM, Hinrichs SH. Concordance between whole-slide imaging and light microscopy for routine surgical pathology. Hum Pathol. 2012;43:1739-1744.
27. Riedl E, Asgari M, Alvarez D, Margaritescu I, Gottlieb GJ. A study assessing the feasibility and diagnostic accuracy of real-time teledermatopathology. Dermatol Pract Concept. 2012;2:202a02.
28. Brick KE, Sluzevich JC, Cappel MA, DiCaudo DJ, Comfere NI, Wieland CN. Comparison of virtual microscopy and glass slide microscopy among dermatology residents during a simulated in-training examination. J Cutan Pathol. 2013;40:807-811.
29. Bauer TW, Slaw RJ. Validating whole-slide imaging for consultation diagnoses in surgical pathology. Arch Pathol Lab Med. 2014;138:1459-1465.
30. Bauer TW, Schoenfield L, Slaw RJ, Yerian L, Sun Z, Henricks WH. Validation of whole slide imaging for primary diagnosis in surgical pathology. Arch Pathol Lab Med. 2013;137:518-524.
31. Newcombe RG. Two-sided confidence intervals for the single proportion: comparison of seven methods. Stat Med. 1998;17:857-872.
32. Massone C, Brunasso AM, Campbell TM, Soyer HP. State of the art of teledermatopathology. Am J Dermatopathol. 2008;30:446-450.
33. Giambrone D, Rao BK, Esfahani A, Rao S. Obstacles hindering the mainstream practice of teledermatopathology. J Am Acad Dermatol. 2014;71:772-780.
34. Today C. FDA open to whole-slide imaging as class II device. Available at: http://www.captodayonline.com/fda-open-wholeslide-imaging-class-ii-device/. Accessed October 3, 2016.
35. Montalto MC. An industry perspective: an update on the adoption of whole slide imaging. J Pathol Inform. 2016;7:18.