Virtual reality and augmented reality in the management of intracranial tumors: A review

Journal of Clinical Neuroscience xxx (xxxx) xxx


Review article

Chester Lee, George Kwok Chu Wong *
Division of Neurosurgery, Department of Surgery, The Chinese University of Hong Kong, Hong Kong Special Administrative Region

Article history: Received 28 November 2018; Accepted 22 December 2018.

Keywords: Virtual reality; Mixed reality; Augmented reality; Brain tumor; Intracranial tumor; Neurosurgery

Abstract

Neurosurgeons are faced with the challenge of planning, performing, and learning complex surgical procedures. With improvements in computational power and advances in visual and haptic display technologies, augmented and virtual surgical environments can offer potential benefits for testing in a safe, simulated setting, as well as improve the management of real-life procedures. This systematic literature review was conducted to investigate the roles of such advanced computing technology in the neurosurgical subspecialty of intracranial tumor removal. The review focuses on an in-depth discussion of the role of virtual reality and augmented reality in the management of intracranial tumors: the current status, foreseeable challenges, and future developments.

© 2018 Elsevier Ltd. All rights reserved.

1. Introduction

Visualizing the anatomical structures beneath the human skin has always been a problem for surgeons, especially when trying to avoid a skin incision. When it comes to brain tumors, the skull is particularly difficult to penetrate, which makes their visualization and subsequent excision a great challenge. The first solution came with the discovery of X-rays, and imaging has continually improved since. Thanks to technological progress in neurosurgery and beyond, augmented and virtual reality appeared: the first augmented monoscopic operating microscope in 1985 [1], the first augmented external display for cranial surgery in 1994 [2,3], the first augmented stereo-operating microscope in 1995 [4], the first augmented fluoroscopy display for neurosurgical endovascular surgery in 1998 [5], and the first augmented neurosurgical endoscope for endonasal transsphenoidal surgery in 2002 [6,7]. It is important to distinguish between augmented reality (AR) and virtual reality (VR). The idea behind AR is overlaying real-life structures with artificial elements. The double-layered image can be displayed on many surfaces: monitors, optics such as microscopes or head-mounted oculars, semi-transparent surfaces, or directly on the patient. The augmented view can be

* Corresponding author at: 4/F Lui Che Wo Clinical Sciences Building, Department of Surgery, Prince of Wales Hospital, 30-32 Ngan Shing Street, Shatin, Hong Kong Special Administrative Region. E-mail address: [email protected] (G.K.C. Wong).

filled with neuroanatomical models as well as with critical clinical data [7,8]. VR, on the other hand, is a full immersion of the person in a crafted, virtual world. A VR set is usually composed of a virtual patient on a physical model, a model of the pathology, surgical instruments, and a connector for the whole VR interface [7,8]. AR and VR visualization each have their advantages and disadvantages, and each has a role at different stages of surgery; the operator should be able to switch between them with ease [9]. This man-machine symbiotic relationship can open a new scope of possibilities for neurosurgery of intracranial tumors.

2. Methods

Two reviewers (CL, GW) performed a systematic literature review of 3 databases (PubMed/MEDLINE, Google Scholar, and Cochrane) on August 20, 2018, with the search term ("virtual reality" OR "augmented reality") AND ("intracranial tumor" OR "brain tumor" OR "glioma" OR "meningioma"). After the search, 2315 articles were screened by title and abstract. The remaining 76 articles underwent a detailed full-text review of relevance (Fig. 1). During the research we identified 15 articles on VR/AR in neurosurgery [10–21]. No study has focused on both modalities for intracranial tumor surgery planning, performance, and post-surgical assessment. 15 trials and device protocols were included in the final analysis (Table 1). Our review follows the guidelines set by the PRISMA statement [22].

https://doi.org/10.1016/j.jocn.2018.12.036
0967-5868/© 2018 Elsevier Ltd. All rights reserved.

Please cite this article as: C. Lee and G. K. C. Wong, Virtual reality and augmented reality in the management of intracranial tumors: A review, Journal of Clinical Neuroscience, https://doi.org/10.1016/j.jocn.2018.12.036


Fig. 1. Flow diagram of methodology [22].

3. Discussion/results

3.1. Surgical planning

The role of imaging grows tremendously with the development of new imaging modalities and improvements in scan quality and resolution [8]. Currently, surgeons work in cooperation with radiologists and build tumor models in their minds presurgically. AR/VR offers a unique opportunity to transform 2-D pictures into 3-D models. It provides not only the tumor structure but also its detailed relationship to surrounding anatomical structures. It is most advantageous when the margin of the tumor is blurred and the exact plan of the extent of removal is in preparation [23]. Multiple software platforms for neurosurgical planning have emerged [8]. One of them is NeuroPlanner, a system based on 2D and 3D atlases of the human brain. Presurgically, the surgeon may plan and simulate the procedure, namely stereotactic trajectory planning and targeting, mensuration, segmentation and labeling of anatomy, or insertion of an electrode [24]. NeuroBase is an extension of NeuroPlanner that enables loading of multiple datasets, both anatomical and functional; its usefulness extends well beyond intracranial tumor removal [25]. Moreover, mixed reality offers an opportunity to elaborate a mathematical model that quantifies the extent of tumorous invasion of individual gliomas in three dimensions to a degree beyond the limits of present medical imaging. The idea of mathematical modeling of tumor growth was based on a simple observation of metastatic growth on chest X-rays, which shows a constant rate of doubling [26]. Murray elaborated it into a conservation equation: the rate of change of the tumor cell population = the

diffusion (motility) of tumor cells + the net proliferation of tumor cells [27]. The model proposed by Swanson introduced the complex anatomy of the brain and allowed diffusion to be part of the equation [28]. As tumor cells invade peripherally to the CT- or MRI-defined boundaries of the tumor, mathematical modeling of residual cancerous cells is useful for planning postsurgical therapy regimes [29,30].

3.2. Surgical navigation

Localization of intracranial tumors has always been a great challenge for neurosurgeons. Gleason and colleagues described the first enhanced-reality system for neurosurgical guidance in 1994. Their technique is based on merging 3D pre-operative scans with live intraoperative video images. The biggest step forward was the ability to see the whole tumor and its surroundings, not only a particular point in empty space [2]. In fact, the first microscope with AR for cranial surgery was reported by the Dartmouth group in 1985, but it was unable to track tools in real time during the operation; this remains a persistent problem for the neuronavigation stereotactic systems widely used today [1]. Neurosurgeons are heavily dependent on images during operations. Usually, the pre-operative scans are displayed in 3 separate planes (coronal, axial, sagittal), so during the operation the surgeon has to constantly switch mental representations between the operative field and the radiological images [19]. The first conceptual description of a 3-dimensional stereoscopic AR overlay of the operating field in an operating microscope appeared in 2003, by Aschke et al. [31]. They built the prototype with the source images from pre- and intra-operative MRI or


Table 1. Summary of literature. KPS – Karnofsky Performance Score. Each entry lists the clinical application/surgical domain, the device, and the PICO (population, intervention, comparison, outcome), if applicable.

1. Presurgical planning for skull base tumors – Dextroscope [70]. P: 48 patients with skull base tumors divided into 2 groups; I: surgery with/without a plan provided by the Dextroscope; C: tumor resection rate and preoperative evaluation – duration of operation, total blood loss, postoperative length of stay (LOS), cerebrovascular injury, complications, and postoperative KPS on discharge and at 6-month follow-up; O: duration of operation and postoperative LOS were 5.25 ± 0.64 h and 8.50 ± 1.10 days in the test group versus 7.36 ± 0.87 h and 12.50 ± 1.52 days in the control group (P < 0.05); KPS in the test group improved from discharge to the 6-month follow-up (P < 0.05); there were adverse results in the test group (P < 0.05).

2. VR in the planning of tumor resection in the sellar region – Dextroscope [71]. P: 60 patients with sellar region tumors; I: VR models of the tumor and simulation of different approaches to its removal (transnasal sphenoid sinus, pterional, and others); C: survey questionnaire completed by 11 neurosurgeons; O: models were helpful for individualized planning of surgery in the sellar region.

3. VR in presurgical planning for cerebral gliomas and tractography – Dextroscope [72]. P: 45 patients with suspected glioma; I: MRI (tractography, DTI, 3D) simulation; C: number of effective fibers at affected sides, KPS at 6 months; O: optimization of surgical trajectory, maximization of safe tumor removal.

4. VR simulator as an assessment of performance in brain tumor resection – NeuroTouch [39]. P: 71 participants (10 medical students, 18 junior residents, 44 senior residents); I: internal resection of a simulated convexity meningioma; C: volume of tissue removed, tool path lengths, duration of excessive forces applied, efficient use of the aspirator; O: a significant increase in tumor removed and in efficiency of ultrasonic aspirator use between medical students and residents.

5. VR simulator as an assessment of bimanual performance in brain tumor resection – NeuroTouch [73]. P: 18 participants (6 neurosurgeons, 6 senior residents, 6 junior residents); I: bimanual resection of 8 simulated brain tumors with differing color, stiffness, and border complexity; C: blood loss, tumor percentage resected, total simulated normal brain volume removed, total tip path lengths, maximum and sum of instrument forces, efficiency index, ultrasonic aspirator path length index, coordination index, ultrasonic aspirator bimanual forces ratio; O: metrics differentiate novice from expert neurosurgical performance.

6. VR simulator as an assessment of force application – NeuroVR [54]. P: 16 neurosurgeons, 15 residents, 84 medical students; I: 18 cases on the simulator; C: ultrasonic aspirator force application continually assessed during resection of simulated brain tumors; O: handedness, ergonomics, and visual and haptic tumor characteristics resulted in distinct, well-defined 3D force pyramid patterns.

7. AR/VR for functional neuronavigation – Functional neuronavigation [74]. P: 79 glioma patients and 55 control subjects; I: neuronavigation based on presurgical tractography or anatomical landmarks; C: resection rate, average extent of resection, rate of preservation of neural functions; O: among complete-resection subjects the average resection rate was 95.2% ± 8.5%, and the average extent of resection differed between the 2 groups (P < 0.01).

8. Presurgical planning of feeder resection – Virtual operation field (VOF) by high-spatial-resolution three-dimensional computer graphics (hs-3DCG) [75]. P: 8 cases with cerebellopontine angle meningioma; I: (1) operation, (2) presurgical simulation with VOF; C: the point at which the main feeding artery penetrated the meningioma; O: using VOF, the point at which the main feeder penetrated the tumor was estimated.

9. AR system to assist novice surgeons – AR goggles [76]. Compared to conventional planning environments, the proposed system greatly improved non-clinicians' performance, independent of the sensorimotor task performed (p < 0.01); the time to perform clinically relevant tasks was always reduced (p < 0.05).

10. VR system for training simulation for a robotic surgery system – Robotic operative microscope [67]. Case study assessing robotic, exoscopic, endoscopic, and fluorescence functionality.

11. VR systems as a base for mathematical modeling of glioma growth – Mathematical model [30]. The velocity of expansion is linear with time and varies about 10-fold, from about 4 mm/year for low-grade gliomas to about 3 mm/month for high-grade ones.

12. AR system with back-facing camera – Tablet-AR [77]. Alignment errors were 4.6 pixels (1.6 mm) in the skull specimen study and 6 pixels (2.1 mm) in the clinical experiment.

13. AR system for neuronavigation – AR system based on the open-source software 3D Slicer, Polaris, and web cameras [78]. Images were overlaid in all 3 cases; the AR error was 2–3 mm for meningioma cases.

14. AR system for neuronavigation for pituitary tumors – Endoscopic AR navigation system [6]. The system superimposes 3-D virtual representations of the patient's tumor and local anatomy, indicates the position and direction of the endoscopic beam, and changes the color of wire-frame images with the distance between the endoscope tip and the tumor; a useful addition to transsphenoidal surgery.

15. AR system for image-guided neurosurgery – AR system [79]. Technical note and visualization on a model skull with a mean projection error of 0.3 mm.
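The proliferation–diffusion ("conservation") equation cited above [27,28,30] can be sketched numerically. Below is a minimal 1-D finite-difference toy model, not Swanson's patient-calibrated implementation: the diffusion coefficient `D`, proliferation rate `rho`, and imaging-detection threshold are illustrative assumptions, and the logistic (Fisher–Kolmogorov) form of the net-proliferation term is one common choice. It reproduces the qualitative behavior reported in Table 1: after an initial transient, the detectable tumor radius expands at a roughly constant velocity (asymptotically v = 2·sqrt(D·rho), here about 2.9 mm/year, the low-grade order of magnitude).

```python
import numpy as np

def simulate_glioma_1d(D=0.0013, rho=0.012, L=100.0, nx=501, dt=0.5, days=720):
    """Fisher-Kolmogorov sketch: dc/dt = D * d2c/dx2 + rho * c * (1 - c).
    D in mm^2/day and rho in 1/day are illustrative values; x spans L mm;
    c is the tumor cell density normalized to carrying capacity."""
    dx = L / (nx - 1)
    c = np.zeros(nx)
    c[nx // 2] = 1.0                      # seed a small focus at the domain centre
    for _ in range(int(days / dt)):
        lap = np.zeros_like(c)
        # central-difference Laplacian; boundaries held at zero density
        lap[1:-1] = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
        c = c + dt * (D * lap + rho * c * (1.0 - c))
        c = np.clip(c, 0.0, 1.0)
    return c

def detectable_radius(c, L=100.0, threshold=0.16):
    """Radius (mm) of the region above an assumed imaging-detection threshold."""
    x = np.linspace(-L / 2, L / 2, c.size)
    visible = x[c > threshold]
    return 0.0 if visible.size == 0 else float(visible.max())
```

Because only cells above the detection threshold are visible on CT/MRI, the simulated front also illustrates the point made in the planning section: modeled residual cells extend beyond the imaging-defined boundary.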


ultrasound inserted into the microscopic beam during the operation, which enables the surgeon to focus solely on the operating field [31]. Head-mounted devices such as smart glasses have been well studied and their usage proven in neurosurgery: for craniotomy, lumbar biopsy, ventriculoperitoneal shunting, and endoscopy. They are especially useful for visualization of the ultrasound image during nerve blockade, where the surgeon is no longer forced to move their gaze away from the patient. As of today, there is no assessment of head-mounted devices for the removal of intracranial tumors [14]. The accuracy of presurgical mapping degrades over the course of the operation. The IBIS (Intra-operative Brain Imaging System) platform, created in 2012, is an open-source image-guided neurosurgery research platform that attempts to fill this gap by validating intraoperative ultrasound facilitated through AR. Image-to-patient registration accuracy is on the order of 3.72 ± 1.27 mm and can be improved with ultrasound to a median target registration error of 2.54 mm [32]. An updated patient model can be obtained in less than 20 s by capturing tracked ultrasound images, reconstructing a 3D volume, and using this volume to automatically realign preoperative plans. Its clinical applications range from brain shift assessment, vascular neurosurgery, tumor surgery, and spine surgery to DBS/epilepsy electrode implantation [32].

3.3. Surgical training

The current healthcare environment demands the best patient outcomes with maximal cost efficiency. It heavily restricts the time surgical residents spend under the supervision of senior colleagues and disturbs the ideal Halstedian approach to surgical education [33]. Moreover, it is very dangerous for an inexperienced operator to start immediately with a living patient, as any move could be the last.
The overriding importance of patient safety, the complexity of surgical techniques, and the challenges associated with teaching surgical trainees in the operating room are all factors driving the need for mixed reality as a new, exciting solution to these problems. Simulation provides a unique opportunity for learners to practice clinical skills in a low-stakes setting before operating room (OR) experience [33]. While it is easy to count procedures and complications, it is much harder to validate their quality. More than 75% of neurosurgical errors are deemed to be technical in nature and therefore preventable [34]. Adequate assessment and room for improvement are necessary, and simulation training embraces both tasks. It also enhances learning by implementing mental rehearsal practice and staged motor learning. In general surgery, several randomized controlled trials have shown that virtual reality simulation not only is an efficient use of learning time but can also decrease initial and subsequent errors in the operating room [34]. In urology, simulation is well integrated into the training curriculum. Ten attributes of simulation are highlighted: (1) feedback can be provided during the learning experience; (2) learners can use repetitive practice; (3) simulation can be integrated into the curriculum; (4) the level of difficulty can be varied; (5) simulation can be adapted to different learning strategies; (6) clinical variation can be incorporated; (7) the environment can be controlled and changed; (8) the process can safely permit individual learning; (9) learning outcomes can be defined and measured; and (10) the simulation can approximate clinical practice. In the end, simulation assesses not only technical skills but also non-technical communication skills and the ability to deal with complications [35].
Lately, the concept of simulating not only technical skills but also teamwork was introduced in a Crisis Management Simulation study, in which neurosurgery residents were teamed with an anesthesia resident and had to manage an intraoperative crisis. No significant

difference between learner specialties was observed for situation awareness, decision making, communications and teamwork, or leadership evaluations. Learners reported the simulation to be realistic, beneficial, and highly instructive [36]. Urology is also the field with one of the earliest introductions of robotic surgery, more than 20 years ago with the da Vinci system (Intuitive Surgical, Sunnyvale, US). It enables 3D high-definition visualization, precise hand-eye coordination, physiological tremor filtering, and motion scaling. As of 2016, some 2500 da Vinci systems had been installed and had performed almost half a million procedures (radical prostatectomy, radical cystectomy, partial nephrectomy). Training curricula on virtual reality robotic systems with mentorship are already well established to provide excellence in procedures [35]. Robotics entered general surgery only around 2010, for procedures such as cholecystectomy, bariatric surgery, hernia repair, and others. Virtual reality surgical simulation has also proved to be a validated method for robotic training: it improves basic skillsets, demonstrates good content and construct validity for laparoscopic and robotic training, and translates into improved skills in the operating room [37]. NeuroTouch is a VR simulator for neurosurgical procedure training developed by researchers at McGill University, Montreal, Canada. It summarizes the work of 50 experts from 20 neurosurgical centers across Canada. It is composed of 3 main elements: a high-end computer, a stereovision system (binoculars that provide a 3-D view), and bimanual haptic tool manipulators. The simulator components are mounted on a special frame that allows the height and tilt angle to be adjusted for every participant. For visualization, a Wheatstone stereoscope was designed using binocular pieces, two 17-inch LCD screens (1280 × 1024 px), and surface mirrors instead of lenses.
The size and distance of the screens provide a 30-degree visual field focused at around 40 cm to reduce eye strain. The haptic system is based on linkages connected by joints; in the NeuroTouch simulator it is built on the Phantom Desktop and Freedom 6S systems. The data gathered by the haptic devices describe degrees of freedom (DOF): tool position (up-down, left-right, forward-backward) and orientation (azimuth, elevation, roll). Moreover, a foot pedal, tool handle sensors, dial knobs, and push buttons can be incorporated into the haptic system via an Arduino connection. The simulation software was developed in-house and is based on 3 asynchronous processes updating the multiple tasks (graphics, haptics, and tissue mechanics) [38]. Researchers created the first conceptual framework for standardized training with NeuroTouch: five tasks of increasing difficulty were selected as representative of basic and advanced neurosurgical skill: (1) ventriculostomy, (2) endoscopic nasal navigation, (3) tumor debulking, (4) hemostasis, and (5) microdissection. Cases are under continuous development, but proficiency metrics have already been obtained for the basic procedures [39–41]. It is worth noting that the simulator may not only judge hard technical skills but also provide input into psychomotor ability and cognition [42]. Moreover, the simulator provides the opportunity to upload a particular case and practice it before the real surgery [43]. NeuroTouch is part of a Canadian "rookie camp" for junior residents available in some residency programs across the country, though only for a limited time of 1–5 days per year. It has the potential to compensate for limited exposure to certain complex neurosurgical cases such as aneurysm clipping. Moreover, the simulator could provide objective measurement of certain surgical skills and knowledge. However, its usage nowadays is limited.
According to a nationwide survey in Canada, program directors identify lack of funding as the biggest problem, whereas residents pinpoint lack of free time as the highest obstacle [44]. Summaries of case performance may help studies on the acquisition, development, and maintenance of the psychomotor skills of young and experienced neurosurgeons [45].
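Simulator metrics such as instrument tip path length and duration of excessive force application, of the kind the NeuroTouch studies report [39,45], are simple functions of the sampled tracking data (tool positions from the haptic DOF stream, force magnitudes from the manipulators). The following is a simplified illustration of two such metrics, not NeuroTouch's exact definitions; the function names and sampling assumptions are our own.

```python
import numpy as np

def tip_path_length(positions):
    """Total tool-tip path length from sampled 3-D positions (N x 3 array, mm):
    the sum of Euclidean distances between consecutive samples."""
    steps = np.diff(np.asarray(positions, dtype=float), axis=0)
    return float(np.sum(np.linalg.norm(steps, axis=1)))

def excessive_force_time(forces, dt, threshold):
    """Total time (s) during which the applied force magnitude exceeded
    `threshold`, given per-sample forces and a sampling interval dt (s)."""
    return float(np.count_nonzero(np.asarray(forces, dtype=float) > threshold) * dt)
```

For example, a tip that moves 1 mm along x and then 1 mm along y has a path length of 2 mm; a 100 Hz force stream with two over-threshold samples yields 0.02 s of excessive force. Metrics like these are what allow the simulator to separate novice from expert performance profiles.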


The simulator enables not only simulation of brain tumor surgery via the craniotomy route but also practice of the endoscopic sinus approach [46]. The McGill simulator scored highly in both realism (7.97 ± 0.29 out of 10) and usefulness (8.57 ± 0.69 out of 10) [47]. A study from Australia clearly indicated that participants' opinions differ by level of training: students and residents were more likely than specialists to give positive realism feedback for the simulator [48]. Virtual reality training has been introduced into neurosurgery training programs in the USA with the help of ImmersiveTouch, a firm with more than 20 years of experience in simulation. It started with a flight simulator at the University of Chicago and currently provides a wide range of cases: from basic anatomical concepts to complex procedures and management of rare complications. It also enables uploading of new, patient-specific cases [34]. Surgeons can practice difficult procedures under computer control without putting a patient at risk, and can practice on these simulators at any time, free from case-volume or location limitations [15,49]. Gasco et al. for the first time assessed not only the educational benefit of virtual reality simulation but also its costs. They created a neurosurgery simulation curriculum that included basic and advanced skills: 68 core exercises were distributed in individualized sets of 30 for 6 neurosurgery residents and consisted of 79 simulations with physical models, 57 cadaver dissections, and 44 haptic/computerized sessions. A total of 180 procedures and surveys of self-perceived performance were analyzed. Junior residents (years 1–3) reported proficiency improvements in 82% of simulations performed, whereas senior residents (years 4–6) reported them in 42.5%. The initial cost of simulation system implementation was $341,978.00, with $27,876.36 in annual operational expenses [49].
In 2000 the VR system Dextroscope with VIVIAN (Virtual Intracranial Visualization and Navigation) was developed for simulated presurgical planning of procedures. It co-registers MRI, MRA, and CT images into a fused 3D visualization and is equipped with VR simulation tools. The registration accuracy is 1.0 ± 0.6 mm. It has proved useful in both research and clinical settings, especially in cranial base surgery, where simulation decreased the pre- and intraoperative guesswork and increased the surgeon's confidence in carrying out complex procedures [50]. It has also proved useful in vascular neurosurgery: aneurysm clipping and arteriovenous malformation (AVM) excision. Simulation enabled rehearsal of the microsurgical procedure and selection of the best approach to head positioning, size of craniotomy, and visualization from different angles [51,52]. Dextroscopic virtual reality simulation can provide illustrated preoperative planning and training for excision of cerebral AVMs: the Dextroscope could help to obtain an anatomical understanding of arterial feeders, nidus, and draining veins in relationship to the surrounding cerebral cortex, and allowed one to see the exposure from different angles of visualization, similar to what happens under the operative microscope [51]. To prepare for the complicated procedure of clipping a ruptured intracranial aneurysm, patient-specific imaging data from computed tomographic angiography of the intracranial circulation and cranium were transferred to the workstation (Dextroscope; Volume Interactions Pte. Ltd., Singapore). An aneurysm clip database was loaded into the patient data set, and 3-D volume rendering was followed by data co-registration and fusion. With virtual head positioning, craniotomy, and angle of application, this allows assessment of the degree of obliteration achieved by various approaches [52].
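The millimeter-scale registration accuracies quoted throughout this review (Dextroscope 1.0 ± 0.6 mm, IBIS 3.72 ± 1.27 mm) come from estimating a rigid transform between corresponding points in image space and patient (or second-modality) space and measuring the residual error. The sketch below uses the generic least-squares (Kabsch/Umeyama) method; it is not the implementation of any of the cited systems, and the function names are our own.

```python
import numpy as np

def rigid_register(moving, fixed):
    """Least-squares rigid transform (rotation R, translation t, no scaling)
    mapping `moving` (N x 3 fiducials, e.g. image space) onto `fixed`
    (e.g. patient space), via SVD of the cross-covariance matrix."""
    moving, fixed = np.asarray(moving, float), np.asarray(fixed, float)
    mu_m, mu_f = moving.mean(axis=0), fixed.mean(axis=0)
    H = (moving - mu_m).T @ (fixed - mu_f)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_f - R @ mu_m
    return R, t

def registration_error(moving, fixed, R, t):
    """Root-mean-square residual over the fiducials, in the input units (mm)."""
    mapped = np.asarray(moving, float) @ R.T + t
    return float(np.sqrt(np.mean(np.sum((mapped - np.asarray(fixed)) ** 2, axis=1))))
```

The residual over the fiducials is the fiducial registration error; the clinically quoted target registration error is the analogous distance measured at a point of interest (e.g. the tumor) not used in the fit.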
Many studies have addressed the benefits of VR training systems for surgeons [16,53,54], but only one study on their limitations was found. A study performed by Lee and Lee in 2017, involving 32 participants, showed that pupils who learned on a


VR set under the guidance of a mentor achieved better results than unsupervised colleagues. After the simulation training, self-learners showed substantially lower simulation task scores (82.9 ± 6.0) compared with the mentored group (93.2 ± 4.8). Both groups demonstrated improved performance on physical model tasks in comparison with the actual robot. The mentored group exhibited lower global mental workload/distress, higher engagement, and a better understanding of methods that improve performance [55].

3.4. Further directions

Current challenges in brain tumor surgery are the selection of the surgical approach and the balance between extent of resection and preservation of parts involved in functional networks [56]. The assessment of neural pathways is currently achieved by awake surgery or intraoperative electrophysiological monitoring and mapping [57], and researched with the use of pre-operative fMRI [58] and diffusion tensor imaging [59]; for those methods, brain shift during surgery is a problem [60]. Moreover, assessment of post-surgical brain tumor treatment response by MRI is fraught with pitfalls, such as differentiating progression from treatment-related changes and pseudoprogression [56]. The most formidable challenge for mixed reality in the neurosurgical field is the gap between the theoretical concept in a laboratory and real experience. Kersten-Oertel et al. showed that only 16% of peer-reviewed articles were evaluated in the operating room and only 6% measured the impact on the clinical outcome of the patient [61]. The DVV (data, visualization processing, view) taxonomy has been proposed to support structured discussion in future studies to fill this gap [62]. All the major components of the AR environment are highlighted, creating an understandable communication route between creators of mixed-reality systems and end-users [62]. Moreover, the visualization of the brain's complex relationships to pathology borders and blood supply is hard to implement, though it is constantly improving in parallel with advances in real-life brain imaging (such as diffusion tensor imaging) [15]. Data overload is another contemporary problem [63,64]: it is hard for one person to synthesize, over the long term, all the information available at a particular stage of surgery at once.
Taking into consideration the insufficient supply of adequately trained specialists relative to increasing demands and the value of their time, the ease of implementing mixed-reality devices should be at the center of attention [65]. The main limitation to wide implementation of neurosurgical simulators is that they require physical space, teachers, and large capital, both for the primary investment and for ongoing maintenance [34]. Moreover, mixed reality training offers an opportunity to safely test and validate automated surgery robots, such as NeuroBlate or PUMA [66], that will support surgeons in their work in the future [67]. We can envision systems ranging from semi-automated devices with integrated scanners and surgical arms to a fully operational humanoid machine [68]. Additionally, emerging serious medical gaming can become a base for future training programs [69].

4. Conclusions

The technological revolution is a sign of our times. Solutions provided by technology are embraced by medicine and gradually introduced into everyday clinical practice. This article serves as a contemporary summary of virtual and augmented environments in the field of brain tumor surgery.


References
[1] Roberts DW, Strohbehn JW, Hatch JF, Murray W, Kettenberger H. A frameless stereotaxic integration of computerized tomographic imaging and the operating microscope. J Neurosurg 1986;65:545–9.
[2] Gleason P, Kikinis R, Wells W, Lorensen W, Cline H, Ettinger G, et al. Enhanced reality for neurosurgical guidance. AAAI Tech Rep 1994:239–42.
[3] Gildenberg PL, Ledoux R, Cosman E, Labuz J. The Exoscope – a frame-based video/graphics system for intraoperative guidance of surgical resection. Stereotact Funct Neurosurg 1994;63:23–5.
[4] Edwards PJ, Hawkes DJ, Hill DLG, Jewell D, Spink R, Strong A, et al. Augmentation of reality using an operating microscope for otolaryngology and neurosurgical guidance. J Image Guid Surg 1995;1:172–8.
[5] Masutani Y, Dohi T, Yamane F, Iseki H, Takakura K. Augmented reality visualization system for intravascular neurosurgery. Comput Aided Surg 1998;3:239–47.
[6] Kawamata T, Iseki H, Shibasaki T, Hori T. Endoscopic augmented reality navigation system for endonasal transsphenoidal surgery to treat pituitary tumors: technical note. Neurosurgery 2002;50:1393–7.
[7] Guha D, Alotaibi NM, Nguyen N, Gupta S, McFaul C, Yang VXD. Augmented reality in neurosurgery: a review of current concepts and emerging applications. Can J Neurol Sci 2017;44:235–45.
[8] Nowinski W. Virtual reality in brain intervention. Int J Artif Intell Tools 2006;15:741–52.
[9] Pandya A, Auner G. Simultaneous augmented and virtual reality for surgical navigation. Annu Conf North Am Fuzzy Inf Process Soc – NAFIPS 2005;2005:429–35.
[10] Pelargos PE, Nagasawa DT, Lagman C, Tenn S, Demos JV, Lee SJ, et al. Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery. J Clin Neurosci 2017;35:1–4.
[11] Chan S, Conti F, Salisbury K, Blevins NH. Virtual reality simulation in neurosurgery: technologies and evolution. Neurosurgery 2013;72:154–64.
[12] Spicer MA, Apuzzo MLJ, Kelly PJ, Benzel EC, Adler JR. Virtual reality surgery: neurosurgery and the contemporary landscape. Neurosurgery 2003;52:489–97.
[13] Spicer MA, Van Velsen M, Caffrey JP, Apuzzo MLJ, Kelly PJ, Black PML, et al. Virtual reality neurosurgery: a simulator blueprint. Neurosurgery 2004;54:783–98.
[14] Chen RE, Kim EJ, Akinduro OO, Yoon JW, Kerezoudis P, Han PK, et al. Augmented reality for the surgeon: systematic review. Int J Med Robot Comput Assist Surg 2018;14:1–13.
[15] Alaraj A, Lemole MG, Finkle JH, Yudkowsky R, Wallace A, Luciano C, et al. Virtual reality training in neurosurgery: review of current status and future applications. Surg Neurol Int 2011;2:52.
[16] Bernardo A. Virtual reality and simulation in neurosurgical training. World Neurosurg 2017;106:1015–29.
[17] Kin T, Nakatomi H, Shono N, Nomura S, Saito T, Oyama H, et al. Neurosurgical virtual reality simulation for brain tumor using high-definition computer graphics: a review of the literature. Neurol Med Chir (Tokyo) 2017;57:513–20.
[18] Dakson A, Hong M, Clarke DB. Virtual reality surgical simulation: implications for resection of intracranial gliomas. Prog Neurol Surg 2017;30:106–16.
[19] Tagaytayan R, Kelemen A, Sik-Lanyi C. Augmented reality in neurosurgery. Arch Med Sci 2018;14:572–8.
[20] Goh KYC. Virtual reality applications in neurosurgery. Conf Proc IEEE Eng Med Biol Soc 2005;4:4171–3.
[21] Meola A, Cutolo F, Carbone M, Cagnazzo F, Ferrari M, Ferrari V. Augmented reality in neurosurgery: a systematic review. Neurosurg Rev 2017;40:537–48.
[22] Hutton B, Salanti G, Caldwell DM, Chaimani A, Schmid CH, Cameron C, et al. The PRISMA extension statement for reporting of systematic reviews incorporating network meta-analyses of health care interventions: checklist and explanations. Ann Intern Med 2015;162:777.
[23] Black PML. Hormones, radiosurgery and virtual reality: new aspects of meningioma management. Can J Neurol Sci 1997;24:302–6.
[24] Nowinski WL, Yang GL, Yeo TT. Computer-aided stereotactic functional neurosurgery enhanced by the use of the multiple brain atlas database. IEEE Trans Med Imaging 2000;19:62–9.
[25] Yang GL, Guo HH, Huang S, Padmanabhan R, Nowinski WL. NeuroBase: a brain atlas-based, multi-platform, multi-dataset-processing neuroimaging system. In: Mun SK (ed). 2000, pp 77–88.
[26] Collins VP, Loeffler RK, Tivey H. Observations on growth rates of human tumors. Am J Roentgenol Radium Ther Nucl Med 1956;76:988–1000.
[27] Murray JD. Mathematical biology: I. An introduction. 3rd ed. Springer; 2000. http://www.ift.unesp.br/users/mmenezes/mathbio.pdf.
[28] Giese A, Kluwe L, Laube B, Meissner H, Berens ME, Westphal M. Migration of human glioma cells on myelin. Neurosurgery 1996;38:755–64.
[29] Swanson KR, Alvord EC Jr, Murray JD. Virtual brain tumours (gliomas) enhance the reality of medical imaging and highlight inadequacies of current therapy. Br J Cancer 2002:14–8.
[30] Swanson KR, Bridge C, Murray JD, Alvord EC. Virtual and real brain tumors: using mathematical modeling to quantify glioma growth and invasion. J Neurol Sci 2003;216:1–10.
[31] Aschke M, Wirtz CR, Raczkowsky J, Wörn H, Kunze S. Augmented reality in operating microscopes for neurosurgical interventions. Int IEEE/EMBS Conf Neural Eng (NER) 2003:652–5.
[32] Drouin S, Kochanowska A, Kersten-Oertel M, Gerard IJ, Zelmann R, De Nigris D, et al. IBIS: an OR ready open-source platform for image-guided neurosurgery. Int J Comput Assist Radiol Surg 2017;12:363–78.
[33] Schwab B, Hungness E, Barsness K, McGaghie W. The role of simulation in surgical training. J Laparosc Adv Surg Tech 2017;27:169–72.
[34] Cobb MIPH, Taekman JM, Zomorodi AR, Gonzalez LF, Turner DA. Simulation in neurosurgery – a brief review and commentary. World Neurosurg 2016;89:583–6.
[35] McGuiness L, Rai B. Robotics in urology. Ann R Coll Surg Engl 2016;117:38–44.
[36] Ciporen J, Gillham H, Noles M, Dillman D, Baskerville M, Haley C, et al. Crisis management simulation: establishing a dual neurosurgery and anesthesia training experience. J Neurosurg Anesthesiol 2018;30:65–70.
[37] Finnerty BM, Afaneh C, Aronova A, Fahey TJ, Zarnegar R. General surgery training and robotics: are residents improving their skills? Surg Endosc Other Interv Tech 2016;30:567–73.
[38] Delorme S, Laroche D, Diraddo R, Del Maestro RF. NeuroTouch: a physics-based virtual simulator for cranial microneurosurgery training. Neurosurgery 2012;71:32–42.
[39] Gélinas-Phaneuf N, Choudhury N, Al-Habib AR, Cabral A, Nadeau E, Mora V, et al. Assessing performance in brain tumor resection using a novel virtual reality simulator. Int J Comput Assist Radiol Surg 2014;9:1–9.
[40] Choudhury N, Gélinas-Phaneuf N, Delorme S, Del Maestro R. Fundamentals of neurosurgery: virtual reality tasks for training and evaluation of technical skills. World Neurosurg 2013;80:9–19.
[41] Alzhrani G, Alotaibi F, Azarnoush H, Winkler-Schwartz A, Sabbagh A, Bajunaid K, et al. Proficiency performance benchmarks for removal of simulated brain tumors using a virtual reality simulator NeuroTouch. J Surg Educ 2015;72:685–96.
[42] Azarnoush H, Alzhrani G, Winkler-Schwartz A, Alotaibi F, Gelinas-Phaneuf N, Pazos V, et al. Neurosurgical virtual reality simulation metrics to assess psychomotor skills during brain tumor resection. Int J Comput Assist Radiol Surg 2015;10:603–18.
[43] Clarke DB, D’Arcy RCN, Delorme S, Laroche D, Godin G, Hajra SG, et al. Virtual reality simulator: demonstrated use in neurosurgical oncology. Surg Innov 2013;20:190–7.
[44] Ryu HA, Chan S, Sutherland GR. Supplementary educational models in Canadian neurosurgery residency programs. Can J Neurol Sci 2018;44:177–83.
[45] Winkler-Schwartz A, Bajunaid K, Mullah MAS, Marwa I, Alotaibi FE, Fares J, et al. Bimanual psychomotor performance in neurosurgical resident applicants assessed using NeuroTouch, a virtual reality simulator. J Surg Educ 2016;73:942–53.
[46] Varshney R, Frenkiel S, Nguyen LHP, Young M, Del Maestro R, Zeitouni A, et al. Development of the McGill simulator for endoscopic sinus surgery: a new high-fidelity virtual reality simulator for endoscopic sinus surgery. Am J Rhinol Allergy 2014;28:330–4.
[47] Varshney R, Frenkiel S, Nguyen LHP, Young M, Del Maestro R, Zeitouni A, et al. The McGill simulator for endoscopic sinus surgery (MSESS): a validation study. J Otolaryngol Head Neck Surg 2014;43:40.
[48] Dharmawardana N, Ruthenbeck G, Woods C, Elmiyeh B, Diment L, Ooi E, et al. Validation of virtual reality based simulations for endoscopic sinus surgery. Clin Otolaryngol 2015;40:569–79.
[49] Gasco J, Patel A, Luciano C, Holbrook T, Ortega-Barnett J, Kuo Y-F, et al. A novel virtual reality simulation for hemostasis in a brain surgical cavity: perceived utility for visuomotor skills in current and aspiring neurosurgery residents. World Neurosurg 2013;80:732–7.
[50] Kockro RA, Serra L, Tseng-Tsai Y, Chan C, Yih-Yian S, Gim-Guan C, et al. Planning and simulation of neurosurgery in a virtual reality environment. Neurosurgery 2000:118–37.
[51] Wong GKC, Zhu CXL, Ahuja AT, Poon WS. Stereoscopic virtual reality simulation for microsurgical excision of cerebral arteriovenous malformation: case illustrations. Surg Neurol 2009;72:69–72.
[52] Wong G, Zhu C, Ahuja A, Poon W. Craniotomy and clipping of intracranial aneurysm in a stereoscopic virtual reality environment. Neurosurgery 2007;61:564–9.
[53] Davis MC, Can DD, Pindrik J, Rocque BG, Johnston JM. Virtual interactive presence in global surgical education: international collaboration through augmented reality. World Neurosurg 2016;86:103–11.
[54] Azarnoush H, Siar S, Sawaya R, Al Zhrani G, Winkler-Schwartz A, Alotaibi FE, et al. The force pyramid: a spatial analysis of force application during virtual reality brain tumor resection. J Neurosurg 2017;127:171–81.
[55] Lee GI, Lee MR. Can a virtual reality surgical simulation training provide a self-driven and mentor-free skills learning? Investigation of the practical influence of the performance metrics from the virtual reality robotic surgery simulator on the skill learning and asso. Surg Endosc Other Interv Tech 2018;32:62–72.
[56] Villanueva-Meyer JE, Mabray MC, Cha S. Current clinical brain tumor imaging. Neurosurgery 2017;81:397–415.
[57] Mandonnet E, Jbabdi S, Taillandier L, Galanaud D, Benali H, Capelle L, et al. Preoperative estimation of residual volume for WHO grade II glioma resected with intraoperative functional mapping. Neuro Oncol 2007;9:63–9.
[58] Volz LJ, Kocher M, Lohmann P, Shah NJ, Fink GR, Galldiks N. Functional magnetic resonance imaging in glioma patients: from clinical applications to future perspectives. Q J Nucl Med Mol Imaging 2018. https://doi.org/10.23736/S1824-4785.18.03101-1.
[59] Dubey A, Kataria R, Sinha VD. Role of diffusion tensor imaging in brain tumor surgery. Asian J Neurosurg 2018;13:302–6.

Please cite this article as: C. Lee and G. K. C. Wong, Virtual reality and augmented reality in the management of intracranial tumors: A review, Journal of Clinical Neuroscience, https://doi.org/10.1016/j.jocn.2018.12.036

[60] Spena G, Schucht P, Seidel K, Rutten G-J, Freyschlag CF, D’Agata F, et al. Brain tumors in eloquent areas: a European multicenter survey of intraoperative mapping techniques, intraoperative seizures occurrence, and antiepileptic drug prophylaxis. Neurosurg Rev 2017;40:287–98.
[61] Kersten-Oertel M, Jannin P, Collins DL. The state of the art of visualization in mixed reality image guided surgery. Comput Med Imaging Graph 2013;37:98–112.
[62] Kersten-Oertel M, Jannin P, Collins DL. DVV: a taxonomy for mixed reality visualization in image guided surgery. IEEE Trans Vis Comput Graph 2012;18:332–52.
[63] Kumar A, Maskara S. Coping up with the information overload in the medical profession. J Biosci Med 2015;3:124–7.
[64] Davis D, Ciurea I, Flanagan T. Solving the information overload problem: a letter from Canada. 2004, www.agreecollaboration.org (accessed 3 Sep 2018).
[65] Dewan MC, Rattani A, Fieggen G, Arraez MA, Servadei F, Boop FA, et al. Global neurosurgery: the current capacity and deficit in the provision of essential neurosurgical care. Executive summary of the global neurosurgery initiative at the program in global surgery and social change. J Neurosurg 2018:1–10.
[66] Smith JA, Jivraj J, Wong R, Yang V. 30 years of neurosurgical robots: review and trends for manipulators and associated navigational systems. Ann Biomed Eng 2016;44:836–46.
[67] Belykh EG, Zhao X, Cavallo C, Bohl MA, Yagmurlu K, Aklinski JL, et al. Laboratory evaluation of a robotic operative microscope – visualization platform for neurosurgery. Cureus 2018. https://doi.org/10.7759/cureus.3072.
[68] Madhavan K, Kolcun JPG, Chieng LO, Wang MY. Augmented-reality integrated robotics in neurosurgery: are we there yet? Neurosurg Focus 2017;42:E3.
[69] Gorbanev I, Agudelo-Londoño S, González RA, Cortes A, Pomares A, Delgadillo V, et al. A systematic review of serious games in medical education: quality of evidence and pedagogical strategy. Med Educ Online 2018;23:1438718.


[70] Yang DL, Xu QW, Che XM, Wu JS, Sun B. Clinical evaluation and follow-up outcome of presurgical plan by Dextroscope: a prospective controlled study in patients with skull base tumors. Surg Neurol 2009;72:682–9.
[71] Wang SS, Zhang SM, Jing JJ. Stereoscopic virtual reality models for planning tumor resection in the sellar region. BMC Neurol 2012;12. https://doi.org/10.1186/1471-2377-12-146.
[72] Qiu TM, Zhang Y, Wu JS, Tang WJ, Zhao Y, Pan ZG, et al. Virtual reality presurgical planning for cerebral gliomas adjacent to motor pathways in an integrated 3-D stereoscopic visualization of structural MRI and DTI tractography. Acta Neurochir (Wien) 2010;152:1847–57.
[73] Alotaibi FE, AlZhrani GA, Mullah MAS, Sabbagh AJ, Azarnoush H, Winkler-Schwartz A, et al. Assessing bimanual performance in brain tumor resection with NeuroTouch, a virtual reality simulator. Neurosurgery 2015;11:89–98.
[74] Sun GC, Wang F, Chen XL, Yu XG, Ma XD, Zhou DB, et al. Impact of virtual and augmented reality based on intraoperative magnetic resonance imaging and functional neuronavigation in glioma surgery involving eloquent areas. World Neurosurg 2016;96:375–82.
[75] Yoshino M, Kin T, Nakatomi H, Oyama H, Saito N. Presurgical planning of feeder resection with realistic three-dimensional virtual operation field in patient with cerebellopontine angle meningioma. Acta Neurochir (Wien) 2013;155:1391–9.
[76] Abhari K, Baxter JSH, Chen ECS, Khan AR, Peters TM, De Ribaupierre S, et al. Training for planning tumour resection: augmented reality and human factors. IEEE Trans Biomed Eng 2015;62:1466–77.
[77] Deng W, Li F, Wang M, Song Z. Easy-to-use augmented reality neuronavigation using a wireless tablet PC. Stereotact Funct Neurosurg 2014;92:17–24.
[78] Inoue D, Cho B, Mori M, Kikkawa Y, Amano T, Nakamizo A, et al. Preliminary study on the clinical application of augmented reality neuronavigation. J Neurol Surg A Cent Eur Neurosurg 2013;74:71–6.
[79] Mahvash M, Tabrizi LB. A novel augmented reality system of image projection for image-guided neurosurgery. Acta Neurochir (Wien) 2013;155:943–7.
