The role for AI in evaluation of upper GI cancer

Journal Pre-proof

PII: S1096-2883(19)30072-5
DOI: https://doi.org/10.1016/j.tgie.2019.150633
Reference: YTGIE 150633

To appear in: Techniques in Gastrointestinal Endoscopy

Received date: 2 June 2019
Revised date: 14 July 2019
Accepted date: 2 August 2019

Please cite this article as: Tomohiro Tada, Toshiaki Hirasawa, Toshiyuki Yoshio, The role for AI in evaluation of upper GI cancer, Techniques in Gastrointestinal Endoscopy (2019), doi: https://doi.org/10.1016/j.tgie.2019.150633

This is a PDF file of an article that has undergone enhancements after acceptance, such as the addition of a cover page and metadata, and formatting for readability, but it is not yet the definitive version of record. This version will undergo additional copyediting, typesetting and review before it is published in its final form, but we are providing this version to give early visibility of the article. Please note that, during the production process, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain. © 2019 Published by Elsevier Inc.

The role for AI in evaluation of upper GI cancer

Authors:

Tomohiro Tada – Institute of Gastroenterology & Proctology, Saitama, Japan*
Toshiaki Hirasawa – Cancer Institute Hospital
Toshiyuki Yoshio – Cancer Institute Hospital

*Corresponding author: [email protected]; [email protected]

Summary

With the application of deep learning, it has become possible to develop artificial intelligence (AI) systems suitable for clinical use even in upper gastrointestinal endoscopy, a field long regarded as difficult for automated diagnosis. This review summarizes current studies of AI and deep learning applied to the upper gastrointestinal tract. At present, AI research on gastric cancer detection, H. pylori infection diagnosis, and esophageal cancer detection is progressing, and AI may also assist in diagnosing the invasion depth of gastric and esophageal cancers. The studies reviewed suggest that AI for diagnosing cancer in the upper gastrointestinal tract, as in the lower gastrointestinal tract where research is more advanced, will be introduced into clinical practice in a form that contributes to detecting suspected cancerous lesions, determining treatment policies, and improving examination accuracy.

1. Introduction

In recent years, the image diagnostic capability of artificial intelligence (AI) has been found to surpass that of human beings owing to three factors: deep learning (that is, convolutional neural networks, CNNs), high-performance computing hardware (GPUs), and increasingly vast amounts of digitized image data. AI has been introduced into medicine primarily through diagnostic imaging: image classification and image detection AIs have been developed for diagnosing skin cancer, diabetic retinopathy, and colonic polyps [1–3]. This paper discusses the latest research findings on image-diagnostic AI for upper gastrointestinal tract cancer using deep learning, provides an overview of the role of AI in such diagnoses, and discusses future directions.
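As a point of reference for non-specialist readers, the minimal sketch below (pure Python, purely illustrative; it is not any of the clinical systems discussed in this review) shows the CNN's core operation: a small filter slid across an image to produce a feature map, followed by a ReLU non-linearity. In a trained CNN the filter weights are learned from data; here a hand-written vertical-edge filter is used.

```python
def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the core operation of a CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

def relu(feature_map):
    """Non-linearity applied after each convolution."""
    return [[max(0.0, v) for v in row] for row in feature_map]

# A vertical-edge kernel applied to a toy 4x4 "image" with a dark/bright boundary:
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
kernel = [[-1, 1],
          [-1, 1]]
print(relu(conv2d(image, kernel)))
```

The boundary between the dark and bright columns produces a strong response while flat regions produce zero; a clinical CNN stacks many such learned filters over millions of endoscopic-image pixels.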

2. Gastric Cancer Diagnosis

Gastric cancer often arises from atrophic gastritis. Thus, when gastric cancer resembles gastritis, it is sometimes difficult to detect in the early stages. The false-negative rate for detecting gastric cancer with esophagogastroduodenoscopy (EGD) is 4.6–25.8% [4–7]. Furthermore, inexperienced endoscopists tend to overlook early gastric cancer because it often shows only subtle morphological changes that are difficult to distinguish from background mucosa with atrophic gastritis [4,5]. AI is expected to serve as a tool that compensates for such disparities among human observers. There are few reports of AI using deep learning for the endoscopic diagnosis of gastric cancer, because such diagnosis is more difficult than for cancers of other organs, such as esophageal and colorectal cancer.

Hirasawa et al. reported the world’s first AI system for detecting gastric cancer using deep learning [8] (Fig. 1). The training dataset comprised 13,584 high-quality endoscopic images of the stomach collected from 2,639 histologically proven gastric cancer patients. All images were marked manually by an expert on gastric cancer and linked to clinical data. The AI system was verified using 2,296 endoscopic images from 69 consecutive cases (77 lesions) of gastric cancer. The AI system detected 71 of the 77 gastric cancer lesions, a sensitivity of 92.2%. Of the 6 lesions that the system could not detect, 5 were minute lesions of diameter 5 mm or less. When limited to gastric cancers of diameter 6 mm or more, the AI detected 70 of 71 lesions (98.6% sensitivity). The time required to analyze the 2,296 images was 47 seconds (0.02 seconds per image), an analysis speed far beyond that of humans. The positive predictive value (PPV) was 30.6%; in other words, 69.4% of the lesions diagnosed as gastric cancer by the AI system were benign. The most common reasons for misdiagnosis were gastritis with atrophy and intestinal metaplasia, findings that are sometimes difficult even for experienced endoscopists to distinguish from gastric cancer.
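The headline figures above follow directly from the reported counts; a trivial sketch (the function names are ours, not from [8]) makes the arithmetic explicit. Note that the 30.6% PPV cannot be recomputed from the sensitivity figures alone, since it also depends on how many benign findings the system flagged:

```python
def sensitivity(detected, total_lesions):
    """Per-lesion sensitivity: fraction of true lesions the system flagged."""
    return detected / total_lesions

def ppv(true_positives, total_flagged):
    """Positive predictive value: fraction of flagged findings that are truly cancer."""
    return true_positives / total_flagged

# Counts reported for the Hirasawa et al. system [8]:
print(f"overall sensitivity:  {sensitivity(71, 77):.1%}")   # 92.2%
print(f"sensitivity (>=6 mm): {sensitivity(70, 71):.1%}")   # 98.6%
```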

Ishioka et al. extended this work from still images to video, evaluating the AI on videos of 68 cases of early gastric cancer [9]. The AI system detected 64 of the 68 early gastric cancers (94.1%) from the videos, a level comparable to that reported for still images. Moreover, the AI took only 1 second (median) to recognize a lesion as cancerous after it appeared on the screen.

Recently, another AI system for detecting gastric cancer was reported by Wu et al. [10]. The researchers constructed the system using 3,170 still images of gastric cancer and 5,981 benign still images, with 200 independent images for verification. Wu et al.’s AI system achieved 94.0% sensitivity, 91.0% specificity, 92.5% accuracy, a 91.3% PPV, and a 93.8% negative predictive value (NPV) for gastric cancer detection, outperforming 21 endoscopists. These were still-image studies; however, they suggest that a new CNN-based AI algorithm may improve the PPV.
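The five reported metrics are mutually consistent with a verification set of 100 cancer and 100 benign images; that split is our inference from the published figures, not a number stated in [10]. Under that assumption, a short sketch reproduces all five values from the implied 2×2 confusion matrix:

```python
def classification_metrics(tp, fn, tn, fp):
    """Standard diagnostic metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy":    (tp + tn) / (tp + fn + tn + fp),
        "ppv":         tp / (tp + fp),
        "npv":         tn / (tn + fn),
    }

# Assumed 100-cancer / 100-benign split, which reproduces every figure in [10]:
m = classification_metrics(tp=94, fn=6, tn=91, fp=9)
for name, value in m.items():
    print(f"{name}: {value:.1%}")
```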

Zhu et al. [11] recently reported AI diagnosis of the invasion depth of gastric cancer. The researchers developed an AI that discriminated gastric cancers with an invasion depth of sm1 or shallower from those invading sm2 or deeper, using 790 gastric cancer images. This AI was 89.1% accurate, exceeding the endoscopists’ average of 77.5%. Because the treatment policy of endoscopic resection versus surgery changes depending on whether the invasion depth exceeds sm1, AI may be used to support endoscopists not only in cancer detection but also in deciding on a treatment plan.

3. AI’s Role in H. pylori Infection

An AI system has also been reported that diagnoses the presence or absence of H. pylori infection from endoscopic images. Shichijo et al. constructed an AI system by employing a deep learning method with 32,205 training images from 735 H. pylori-positive cases and 1,015 H. pylori-negative cases [12]. Verification used 11,481 images from 397 independent cases. The AI system showed 88.9% sensitivity, 87.4% specificity, and 87.7% accuracy for H. pylori infection, whereas 23 endoscopists averaged 79.0% sensitivity, 83.2% specificity, and 82.4% accuracy. The AI system therefore performed at a level at least equivalent to that of the endoscopists. The total time required to diagnose the 397 cases was 198 seconds for the AI system, considerably shorter than the 230 minutes taken by the human endoscopists.

Shichijo et al. also constructed an AI system to differentiate not only H. pylori-negative and -positive cases but also cases before and after H. pylori eradication [13]. The researchers trained the AI using 98,564 training images from 742 H. pylori-positive cases, 3,649 H. pylori-negative cases, and 845 H. pylori-eradicated cases. Verification used 23,699 images from 847 independent cases. The AI system achieved 80% accuracy for negative diagnoses, 84% for eradicated cases, and 48% for positive diagnoses.

4. Gastric Cancer Screening

AI is not fully helpful in cancer detection unless all parts of the stomach are clearly imaged. If an AI can recognize the anatomical site shown in each image, it becomes possible to check whether the stomach has been observed completely. Takiyama et al. constructed an AI to classify the anatomical location of upper digestive tract images [14]. A total of 27,335 images, classified as portraying the pharynx, esophagus, upper stomach, middle stomach, lower stomach, or duodenum, were learned by the AI. ROC-AUC values were as good as 1.0 for the pharynx and esophagus and 0.99 for the stomach and duodenum.
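For readers unfamiliar with the metric, ROC-AUC equals the probability that a randomly chosen positive image receives a higher classifier score than a randomly chosen negative one (1.0 means perfect separation), and it can be computed directly from score ranks without plotting a curve. The sketch below is our own minimal implementation, not code from [14]:

```python
def roc_auc(labels, scores):
    """ROC-AUC via the rank-sum (Mann-Whitney U) formulation; ties get average rank."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        # Group tied scores together and assign them their average 1-based rank.
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    rank_sum = sum(r for r, y in zip(ranks, labels) if y == 1)
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

Perfectly separated scores (every positive scored above every negative, as for the pharynx and esophagus classes above) yield exactly 1.0.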

Wu et al. developed an AI, WISENSE (since renamed ENDOANGEL), to perform real-time checks of stomach sites [15]. They collected images of 26 typical sites inside the stomach and trained the AI on them. If the AI recognized one of the 26 sites during an examination, that site was judged to have been observed. In a clinical trial of 324 patients, use of the AI during gastroscopic examination reduced the rate of missed sites by 15%. AI is thus expected to assist in ensuring comprehensive observation (checking all sites of the stomach) during gastric examination.

5. Esophageal Cancer Diagnosis

Esophageal cancer is the eighth most common cancer worldwide and the sixth most common cause of cancer-related mortality [16]. Esophageal squamous cell carcinoma (ESCC) is the most common histological type in Asia (particularly Japan), the Middle East, Africa, and South America, while the incidence of esophageal adenocarcinoma (EAC) is increasing in the United States and Europe [17].

When esophageal cancer is diagnosed at an advanced stage, it requires highly invasive treatment, and its prognosis is poor. Early detection is therefore of great importance. However, it is difficult to diagnose esophageal cancer in its early stages by conventional endoscopy using white-light imaging (WLI). Iodine staining has been used to detect ESCC in high-risk patients; however, it is associated with problems such as chest pain or discomfort and increased procedural time [18].

Narrow-band imaging (NBI) is a revolutionary image-enhanced endoscopy technology that has facilitated more frequent detection of superficial ESCC without iodine staining [19–22]. NBI is superior to iodine staining for screening endoscopy because of its ease of use (a single button press) and the lack of discomfort for patients. However, NBI has demonstrated an insufficient sensitivity of 53% for detecting ESCC when used by inexperienced endoscopists [23], indicating that training and experience are required to use NBI effectively.

Horie et al. reported a convolutional neural network (CNN)-based AI diagnosis system for detecting esophageal cancer, including both ESCC and EAC [24]. The CNN was trained on 8,428 endoscopic images of esophageal cancer from 397 lesions, including 365 ESCC and 32 EAC lesions. They then validated the system with an independent data set of 1,118 images from 97 cases, comprising 47 cases with esophageal cancer and 50 cases without. The AI system analyzed the 1,118 images in 27 seconds and detected 98% (46/47) of the esophageal cancer cases. Notably, it detected all 7 small lesions of less than 10 mm. The per-image sensitivity of the system was 77%, with a specificity of 79%, PPV of 39%, and NPV of 95%. Moreover, the system classified cancers as superficial or advanced with 98% accuracy. Although the PPV was quite low, deeper learning may overcome this limitation. Because the system’s analysis speed is fast enough, it should also work on medical videos, which would make it possible to use the AI during screening endoscopy and help endoscopists avoid missing ESCCs.
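The claim that the analysis speed suffices for video use can be checked from the reported throughput. Assuming (our assumption, not a figure from [24]) that endoscopy video runs at roughly 25–30 frames per second:

```python
images, seconds = 1118, 27          # figures reported by Horie et al. [24]
per_image = seconds / images        # analysis time per image
throughput = images / seconds       # images analyzed per second
print(f"{per_image:.3f} s/image, {throughput:.0f} images/s")
# At ~41 images/s, the system comfortably exceeds typical video frame rates (~25-30 fps).
```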

With the same strategy, it should be possible to develop a CNN to detect superficial pharyngeal cancer, which is increasingly detected in upper endoscopy screening, because the pharynx is covered with the same squamous epithelium and develops squamous cell carcinoma similar to that of the esophagus. Recent advances in NBI and increased awareness among endoscopists have led to increased detection of superficial pharyngeal cancer [19,25]. If detected at an early stage, pharyngeal cancer can be treated by endoscopic resection, which provides safe, minimally invasive treatment with good outcomes [26–28]. However, these lesions remain difficult to detect, and AI support systems should prove useful in detecting pharyngeal cancers.

Intra-papillary capillary loops (IPCLs) are microvessels in ESCC that were first characterized using magnifying endoscopy [29,30]. They are now established markers for diagnosing ESCC, and changes in their morphology correlate with the invasion depth of ESCC [29–31]. Two classifications were reported, by Inoue et al. [29,30] and by Arima et al. [31]. More recently, the Japan Esophageal Society (JES) IPCL classification was reported as a comparatively simplified system, allowing easy recognition of ESCC and its invasion depth [32,33]. The JES classification has become widely accepted, with high accuracy against the histological diagnosis and invasion depth of ESCC. Everson et al. reported AI classification of IPCL patterns for ESCC [34]. The researchers used 7,046 magnifying endoscopy with NBI (ME-NBI) images from 17 patients (10 ESCC, 7 normal) to train their CNN. The CNN differentiated abnormal from normal IPCL patterns with 93.7% accuracy; the sensitivity and specificity for classifying abnormal IPCL patterns were 89.3% and 98%, respectively.

Nakagawa et al. also reported AI diagnosis of the invasion depth of ESCC, in which the CNN was trained on 8,660 non-magnified and 5,678 magnified images [35]. The CNN differentiated pathologically mucosal and submucosal microinvasive (SM1) cancers from submucosal deep invasive (SM2/3) cancers on 914 validation images from 155 patients with 91.0% accuracy. Using the same validation images, 16 experienced endoscopists diagnosed the invasion depth with 89.6% accuracy. The AI system thus showed favorable performance in diagnosing invasion depth, comparable to that of experienced endoscopists.

The endocytoscopic system (ECS) is a magnifying endoscopic examination that enables observation of the surface epithelial cells in real time in vivo via vital staining with compounds such as methylene blue [36–40]. The optical magnification of the newest ECS is ×500 [37], which can be increased to ×900 using the digital magnification incorporated in the video processor. Kumagai et al. proposed a type classification for the squamous epithelium [38,39] and reported a diagnostic accuracy of approximately 95.0% in distinguishing benign from malignant lesions using ECS [40]. The researchers then used a CNN-based AI to diagnose ESCC from ECS images [41]. They trained it on 4,715 ECS images of the esophagus (1,141 malignant and 3,574 non-malignant) and evaluated diagnostic accuracy on 55 cases (27 ESCC and 28 benign esophageal lesions). The AI correctly diagnosed 92.6% (25/27) of ESCC cases, with a specificity of 89.3% and an overall accuracy of 90.9%. They concluded that AI is expected to support endoscopists in diagnosing ESCC from ECS images without biopsy-based histological references.

6. Conclusion

Upper endoscopic diagnosis will vary greatly depending on the AI used. If endoscopists use AI to detect gastric cancer during upper gastrointestinal examination, they may be less likely to miss a cancer. Examinations performed by non-expert endoscopists together with specialist-level AI are also expected to shorten the training time such endoscopists need to acquire diagnostic technique.

In the upper gastrointestinal tract, AI has made great progress in the diagnosis of gastric cancer, Helicobacter pylori infection, and esophageal cancer. Not only does AI support the detection of cancer and the determination of invasion depth; it may also be used to check for blind spots (unobserved areas) in the stomach. In the near future, prospective trials should clarify how useful AI is for upper GI cancer detection.

Conflict of Interest: Tomohiro Tada is a shareholder of AI Medical Service Inc. Toshiaki Hirasawa and Toshiyuki Yoshio have no conflicts of interest to declare.

Acknowledgements: We thank AI Medical Service Inc. for providing figures.

References

1. Esteva A, Kuprel B, Novoa RA, et al. Dermatologist-level classification of skin cancer with deep neural networks. Nature. 2017;542:115–118.

2. Gulshan V, Peng L, Coram M, et al. Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. JAMA. 2016;316:2402–2410.

3. Byrne MF, Chapados N, Soudan F, et al. Real-time differentiation of adenomatous and hyperplastic diminutive colorectal polyps during analysis of unaltered videos of standard colonoscopy using a deep learning model. Gut. 2019;68:94–100.

4. Hosokawa O, Hattori M, Douden K, et al. Difference in accuracy between gastroscopy and colonoscopy for detection of cancer. Hepatogastroenterology. 2007;54:442–4.

5. Hosokawa O, Tsuda S, Kidani E, et al. Diagnosis of gastric cancer up to three years after negative upper gastrointestinal endoscopy. Endoscopy. 1998;30:669–74.

6. Amin A, Gilmour H, Graham L, et al. Gastric adenocarcinoma missed at endoscopy. J R Coll Surg Edinb. 2002;47:681–4.

7. Menon S, Trudgill N. How commonly is upper gastrointestinal cancer missed at endoscopy? A meta-analysis. Endosc Int Open. 2014;2:E46–50.

8. Hirasawa T, Aoyama K, Tanimoto T, et al. Application of artificial intelligence using a convolutional neural network for detecting gastric cancer in endoscopic images. Gastric Cancer. 2018;21:653–660.

9. Ishioka M, Hirasawa T, Tada T. Detecting gastric cancer from video images using convolutional neural networks. Dig Endosc. 2019;31:e34–e35.

10. Wu L, Zhou W, Wan X, et al. A deep neural network improves endoscopic detection of early gastric cancer without blind spots. Endoscopy. 2019 Mar 12. doi: 10.1055/a-0855-3532. [Epub ahead of print]

11. Zhu Y, Wang QC, Xu MD, et al. Application of convolutional neural network in the diagnosis of the invasion depth of gastric cancer based on conventional endoscopy. Gastrointest Endosc. 2019;89:806–815.

12. Shichijo S, Nomura S, Aoyama K, et al. Application of convolutional neural networks in the diagnosis of Helicobacter pylori infection based on endoscopic images. EBioMedicine. 2017;25:106–11.

13. Shichijo S, Endo Y, Aoyama K, et al. Application of convolutional neural networks for evaluating Helicobacter pylori infection status on the basis of endoscopic images. Scand J Gastroenterol. 2019;54:158–163.

14. Takiyama H, Ozawa T, Ishihara S, et al. Automatic anatomical classification of esophagogastroduodenoscopy images using deep convolutional neural networks. Sci Rep. 2018;8:7497.

15. Wu L, Zhang J, Zhou W, et al. Randomised controlled trial of WISENSE, a real-time quality improving system for monitoring blind spots during esophagogastroduodenoscopy. Gut. 2019 Mar 11. doi: 10.1136/gutjnl-2018-317366. [Epub ahead of print]

16. GLOBOCAN. Estimated cancer incidence, mortality and prevalence worldwide in 2012. International Agency for Research on Cancer / World Health Organization. 2012. Available at: http://globocan.jarcfr/Pages/fact_sheets cancer.aspx. Accessed March 13, 2018.

17. Enzinger PC, Mayer RJ. Esophageal cancer. N Engl J Med. 2003;349:2241–52.

18. Shimizu Y, Omori T, Yokoyama A, et al. Endoscopic diagnosis of early squamous neoplasia of the esophagus with iodine staining: high-grade intra-epithelial neoplasia turns pink within a few minutes. J Gastroenterol Hepatol. 2008;23:546–50.

19. Muto M, Minashi K, Yano T, et al. Early detection of superficial squamous cell carcinoma in the head and neck region and esophagus by narrow band imaging: a multicenter randomized controlled trial. J Clin Oncol. 2010;28:1566–72.

20. Nagami Y, Tominaga K, Machida H, et al. Usefulness of non-magnifying narrow-band imaging in screening of early esophageal squamous cell carcinoma: a prospective comparative study using propensity score matching. Am J Gastroenterol. 2014;109:845–54.

21. Lee YC, Wang CP, Chen CC, et al. Transnasal endoscopy with narrow-band imaging and Lugol staining to screen patients with head and neck cancer whose condition limits oral intubation with standard endoscope (with video). Gastrointest Endosc. 2009;69:408–17.

22. Kuraoka K, Hoshino E, Tsuchida T, et al. Early esophageal cancer can be detected by screening endoscopy assisted with narrow-band imaging (NBI). Hepatogastroenterology. 2009;56:63–6.

23. Ishihara R, Takeuchi Y, Chatani R, et al. Prospective evaluation of narrow-band imaging endoscopy for screening of esophageal squamous mucosal high-grade neoplasia in experienced and less experienced endoscopists. Dis Esophagus. 2010;23:480–6.

24. Horie Y, Yoshio T, Aoyama K, et al. The diagnostic outcomes of esophageal cancer by artificial intelligence using convolutional neural networks. Gastrointest Endosc. 2019;89:25–32.

25. Nonaka S, Saito Y. Endoscopic diagnosis of pharyngeal carcinoma by NBI. Endoscopy. 2008;40:347–51.

26. Shimizu Y, Yamamoto J, Kato M, et al. Endoscopic submucosal dissection for treatment of early stage hypopharyngeal carcinoma. Gastrointest Endosc. 2006;64:255–9; discussion 260–2.

27. Suzuki H, Saito Y. A case of superficial hypopharyngeal cancer treated by EMR. Jpn J Clin Oncol. 2007;37:892.

28. Yoshio T, Tsuchida T, Ishiyama A, et al. Efficacy of double-scope endoscopic submucosal dissection and long-term outcomes of endoscopic resection for superficial pharyngeal cancer. Dig Endosc. 2017;29:152–159.

29. Inoue H, Honda T, Yoshida T, et al. Ultra-high magnification endoscopy of the normal esophageal mucosa. Dig Endosc. 1996;8:134–138.

30. Inoue H, Honda T, Nagai K, et al. Ultra-high magnification endoscopic observation of carcinoma in situ of the esophagus. Dig Endosc. 1997;9:16–18.

31. Arima M, Tada M, Arima H. Evaluation of microvascular patterns of superficial esophageal cancers by magnifying endoscopy. Esophagus. 2005;2:191–197.

32. Oyama T, Inoue H, Arima M, et al. Prediction of the invasion depth of superficial squamous cell carcinoma based on microvessel morphology: magnifying endoscopic classification of the Japan Esophageal Society. Esophagus. 2017;14:105–112.

33. Oyama T, Momma K. A new classification of magnified endoscopy for superficial esophageal squamous cell carcinoma. Esophagus. 2011;8:247–251.

34. Everson M, Herrera LCGP, Li W, et al. Artificial intelligence for the real-time classification of intrapapillary capillary loop patterns in the endoscopic diagnosis of early oesophageal squamous cell carcinoma: a proof-of-concept study. United European Gastroenterol J. 2019;7:297–306.

35. Nakagawa K, Ishihara R, Aoyama K, et al. Classification for invasion depth of esophageal squamous cell carcinoma using a deep neural network compared with experienced endoscopists. Gastrointest Endosc. 2019; in press.

36. Kumagai Y, Monma K, Kawada K. Magnifying chromoendoscopy of the esophagus: in-vivo pathological diagnosis using an endocytoscopy system. Endoscopy. 2004;36:590–4.

37. Kumagai Y, Takubo K, Kawada K, et al. A newly developed continuous zoom-focus endocytoscope. Endoscopy. 2017;49:176–80.

38. Kumagai Y, Kawada K, Yamazaki S, et al. Endocytoscopic observation for esophageal squamous cell carcinoma: can biopsy histology be omitted? Dis Esophagus. 2009;22:505–12.

39. Kumagai Y, Kawada K, Yamazaki S, et al. Endocytoscopic observation of esophageal squamous cell carcinoma. Dig Endosc. 2010;22:10–6.

40. Kumagai Y, Kawada K, Higashi M, et al. Endocytoscopic observation of various esophageal lesions at ×600: can nuclear abnormality be recognized? Dis Esophagus. 2015;28:269–75.

41. Kumagai Y, Takubo K, Kawada K, et al. Diagnosis using deep-learning artificial intelligence based on the endocytoscopic observation of the esophagus. Esophagus. 2019;16:180–187.

Figure legends

1a: A slightly yellowish and flat lesion of gastric cancer appears on the lesser curvature of the antrum.

1b: The yellow rectangular frame was drawn by the CAD system to indicate the extent of a suspected gastric cancer lesion. An endoscopist manually marked the location of the cancer with a green rectangular frame. [0–IIb, 5 mm, tub1, T1a(M)]

The Supplementary Video is located at https://onlinelibrary.wiley.com/page/journal/14431661/den13306-sup-0001-vids1.htm

2a. The reddish irregular area of ESCC appears on the right wall. 2b. An endoscopist had previously marked the lesion with a green square; the AI diagnosing system surrounded the lesion with a white square, diagnosing it as esophageal cancer. Because the two squares matched perfectly, the AI system detected the ESCC correctly.

3a. The reddish irregular area of ESCC appears on the posterior wall. 3b. The AI system not only detected the ESCC but also suggested that this tumor is a submucosal deep invasive (SM2/3) cancer, with a probability of 94%.