105 Development of an augmented reality system for robotic prostatectomy: Towards reducing the learning curve

A.N. Sridhar1, D.C. Cohen1, D. Chen2, P. Pratt2, B. Khoubehi3, J.A. Vale3, G.Z. Yang2, A.W. Darzi1, E.K. Mayer1, E. Edwards2. 1Department of Surgery and Cancer, Imperial College London, London, UK; 2Department of Computing, Imperial College London, London, UK; 3Department of Urology, Imperial College Healthcare NHS Trust, London, UK

Introduction: The advent and widespread acceptance of robotic-assisted laparoscopic prostatectomy (RALP) has necessitated the development of new skill sets for the trainee surgeon and for those converting from open or conventional laparoscopic prostatectomy. The loss of haptic feedback during RALP can increase the challenges of training, as the surgeon must rely solely on visual cues from the operative field. Augmented reality (AR) is a tool that can increase the amount of visual information available to the surgeon in real time, guiding surgery and potentially reducing the learning curve. A recent study at our institution identified the steps of RALP that could benefit from AR guidance [1]. This work presents an image guidance system developed to assist the surgeon during these key stages of surgery.

Materials and Methods: CUBE 3T MRI scans of 8 retrospective and 5 prospective patients were segmented manually, according to a pre-defined protocol, to identify the relevant anatomy in the regions of interest (ROIs) using ITK-SNAP [2], and 3D models of the pelvis were created. Stereo video of the RALP was recorded for each patient. Six of the nine steps of RALP were identified where AR guidance could potentially reduce the learning curve:
• Ligation of the dorsal vein complex
• Dissection of the bladder neck
• Seminal vesicle dissection
• Posterior dissection of Denonvilliers' fascia
• Nerve-sparing
• Mobilisation of the apex of the prostate

At the beginning of the procedure, the camera was calibrated using a checkerboard target. The computer-generated 3D models were overlaid onto these six stages of the procedure both retrospectively (recorded video) and in real time (operative endoscopic view), using a system based on the NVIDIA Quadro Digital Video Pipeline [3]. Registration was performed using a two-stage, semi-automated scheme [4]:
• alignment of two corresponding 3D points on the captured video image and the model;
• subsequent adjustment of the remaining rotational degrees of freedom using a rolling-ball interface [5].
The landmark used for initial registration, in both the MRI scan and the video, was the pelvic brim. Once registered, correspondence was maintained manually. Illustrative sketches of the calibration, registration and overlay steps are given below.
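The calibration step follows the standard checkerboard procedure. A minimal sketch in OpenCV is given below; the pattern size, square size and image paths are illustrative assumptions rather than details from the study, and each channel of the stereo endoscope would be calibrated in this way (with cv2.stereoCalibrate then recovering the inter-camera geometry).

```python
# Minimal checkerboard calibration sketch (OpenCV). Pattern size, square
# size and file paths are assumptions for illustration only.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)      # inner corners per row/column (assumed)
SQUARE_MM = 10.0      # physical square size (assumed)

# Reference corner positions on the planar target (z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):         # hypothetical image set
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

# Intrinsics K and lens distortion: these let the 3D model be rendered
# with the same projection geometry as the endoscope.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```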
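To make the two-stage registration concrete, the sketch below (an illustration under stated assumptions, not the authors' implementation) first computes a rigid transform mapping two model landmarks onto their scene counterparts, which fixes the translation and all but one rotational degree of freedom, namely rotation about the landmark axis; a rolling-ball update [5] then adjusts that remaining rotation interactively from 2D mouse drags.

```python
# Sketch of the two-stage registration: two-point alignment followed by
# interactive rolling-ball rotation. Illustrative only.
import numpy as np

def skew(v):
    """Cross-product matrix [v]x, used by the Rodrigues formulas below."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def align_two_points(m0, m1, s0, s1):
    """Stage 1: rigid transform (R, t) taking model landmarks (m0, m1) onto
    scene landmarks (s0, s1). Rotation about the landmark axis remains free
    and is left to stage 2. Assumes the directions are not anti-parallel."""
    a = (m1 - m0) / np.linalg.norm(m1 - m0)   # model landmark direction
    b = (s1 - s0) / np.linalg.norm(s1 - s0)   # scene landmark direction
    v, c = np.cross(a, b), float(np.dot(a, b))
    # Rodrigues construction of the rotation taking a onto b.
    R = np.eye(3) + skew(v) + skew(v) @ skew(v) / (1.0 + c)
    t = s0 - R @ m0
    return R, t

def rolling_ball_update(R, dx, dy, radius=300.0):
    """Stage 2: rolling-ball rotation [5] from a mouse drag (dx, dy) in
    pixels. The axis lies in the view plane, perpendicular to the drag;
    `radius` controls sensitivity."""
    d = float(np.hypot(dx, dy))
    if d == 0.0:
        return R
    n = np.array([-dy, dx, 0.0]) / d
    theta = np.arctan2(d, radius)
    dR = np.eye(3) + np.sin(theta) * skew(n) \
         + (1.0 - np.cos(theta)) * (skew(n) @ skew(n))
    return dR @ R
```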
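The real-time overlay itself runs on the NVIDIA Quadro Digital Video Pipeline [3], which cannot be reproduced here; as a simplified CPU stand-in, the sketch below projects the registered model vertices into the endoscopic frame using the calibrated intrinsics and alpha-blends them over the live image.

```python
# CPU stand-in for the per-frame overlay (the actual system renders and
# composites on the GPU via the Quadro Digital Video Pipeline [3]).
import cv2
import numpy as np

def overlay_model(frame, vertices, R, t, K, dist,
                  colour=(0, 0, 255), alpha=0.4):
    """Project registered 3D model vertices into the endoscopic image and
    alpha-blend them over the live frame."""
    rvec, _ = cv2.Rodrigues(R)                  # rotation matrix -> vector
    pts, _ = cv2.projectPoints(vertices.astype(np.float32), rvec,
                               t.astype(np.float32), K, dist)
    layer = frame.copy()
    h, w = frame.shape[:2]
    for x, y in pts.reshape(-1, 2).astype(int):
        if 0 <= x < w and 0 <= y < h:           # clip to the image
            cv2.circle(layer, (x, y), 1, colour, -1)
    return cv2.addWeighted(layer, alpha, frame, 1.0 - alpha, 0.0)
```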
Results: Manual segmentation of the pre-operative MRI scans successfully generated three-dimensional anatomical models. Relevant anatomical features, such as the bladder, bowel, prostate, peri-prostatic venous plexus, neurovascular bundles and pelvic bones, were identified, colour-coded and then successfully overlaid onto the surgeon's robotic console view. Images of the three stages (segmentation and reconstruction, colour-coding of pelvic structures, and overlay onto the console image) will be displayed on the poster.
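For the model-generation result above, one plausible route from the ITK-SNAP label image to per-structure, colour-codeable surface meshes is marching cubes over each label, as sketched below; the abstract does not name the meshing tool, and the file name and label IDs here are hypothetical.

```python
# Hypothetical sketch: per-structure surface meshes from an ITK-SNAP label
# image via marching cubes. File name and label IDs are assumptions.
import nibabel as nib
from skimage import measure

LABELS = {1: "prostate", 2: "bladder", 3: "neurovascular_bundle"}  # assumed

vol = nib.load("pelvis_labels.nii.gz").get_fdata()   # hypothetical path

meshes = {}
for label_id, name in LABELS.items():
    # Surface each structure at the 0.5 iso-level of its binary mask;
    # vertices are in voxel coordinates (apply the NIfTI affine to obtain
    # physical/scanner coordinates).
    verts, faces, normals, values = measure.marching_cubes(
        (vol == label_id).astype(float), level=0.5)
    meshes[name] = (verts, faces)
```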
Discussion: Augmented reality image guidance has the potential to improve the learning curve for trainee surgeons. The initial steps of 3D model generation and image registration have been achieved as described here. The process is relatively straightforward and can be performed in real time intraoperatively. Our immediate next steps are to assess overlay accuracy and to develop image tracking that follows the target-tissue movement and deformation occurring during surgery. Once these are complete, we can assess and validate the accuracy of the image overlay over the entire length of the procedure and, finally, measure the impact on the learning curve for RALP.

Reference(s)
[1] Cohen D., Mayer E., Chen D., Anstee A., Vale J., Yang G., Darzi A., Edwards P.J. Augmented reality image guidance in minimally invasive prostatectomy. In: Prostate Cancer Imaging (2010): 101–110.
[2] Yushkevich P.A., Piven J., Hazlett H.C., Smith R.G., Ho S., Gee J.C., Gerig G. User-guided 3D active contour segmentation of anatomical structures: significantly improved efficiency and reliability. Neuroimage 2006 Jul 1; 31(3): 1116–28. www.itksnap.org
[3] http://www.nvidia.com/object/quadro_dvp.html
[4] Pratt P., Mayer E.K., Vale J., Cohen D., Edwards E., Darzi A.W., Yang G.Z. Image-guided robotic partial nephrectomy: benefits and challenges. In: Proceedings of the Hamlyn Symposium on Medical Robotics (2011): 85–86.
[5] Hanson A. The rolling ball. In: Graphics Gems III. Academic Press Professional, Inc., 1992: 51–60.