Image-Guided Motion Compensation for Robotic-Assisted Beating Heart Surgery


George Moustris and Costas Tzafestas
National Technical University of Athens, Athens, Greece

Abstract

Motion compensation in coronary artery bypass graft surgery refers to the virtual stabilization of the beating heart, along with the mechanical synchronization of the robotic arms with the pulsating heart surface. The stabilized image of the heart is presented to the surgeon to operate on, while the heart motion is compensated by the robot, and the surgeon essentially operates on a virtual still heart. In this chapter, we present an introduction to the concept of motion compensation and a brief history of research efforts. We analyze a unifying framework which naturally binds together the image stabilization, mechanical synchronization, and shared control tasks. This framework serves as a baseline upon which more complicated assistive modes are built, for example, active and haptic assistance. These modes are discussed more thoroughly, and their efficacy is assessed via laboratory experimental trials in a simulation setup, which are presented in detail.

Handbook of Robotic and Image-Guided Surgery. DOI: https://doi.org/10.1016/B978-0-12-814245-5.00021-9 © 2020 Elsevier Inc. All rights reserved.

21.1 Introduction

Robotic-assisted coronary artery bypass graft (CABG) surgery is a relatively new surgical treatment for patients with coronary disease. The goal is to restore normal blood flow in coronary arteries that present with a blockage. As such, it is the next evolutionary step from traditional open-chest CABG, where the use of the robot enables totally endoscopic, minimally invasive access to the heart. During this operation the heart remains beating but is mechanically stabilized in order for the surgeon to operate on it. This is performed using tools which exert pressure or suction on the heart tissue, known as tissue/cardiac stabilizers. Residual motion is still present, however, as well as complications due to the stress imposed on the heart, such as hemodynamic instability, reduced cardiac output, stroke volume, and arterial pressure [1,2], and myocardial ischemia [3]. To overcome these drawbacks, the concept of motion compensation has been proposed [4]. Motion compensation refers to the virtual stabilization of the image of the beating heart, along with the mechanical synchronization of the robotic arms with the pulsating cardiac wall. The stabilized image of the heart is presented to the surgeon to operate on, while the heart motion is compensated by the robot, and the surgeon essentially operates on a virtual still heart (Fig. 21.1). This technology obviates the need for cardiac stabilizers and may have positive effects on CABG surgery, such as shorter operating times, better hemodynamic stability, and fewer conversions to open-chest surgery. Image-guided motion compensation in robotic-assisted cardiac surgery consists of three subproblems: image stabilization, mechanical synchronization, and shared control. For the first, the basic tasks are surface reconstruction [5] and motion estimation [6] of the region of interest (ROI), either from stereo imaging or depth cameras.
Using this estimation, the image is warped in an appropriate way in order to algorithmically cancel the motion and deformation of the ROI. In effect, this transformation virtually stabilizes the ROI, which is then presented to the surgeon. Concurrently, the robotic arms of the patient side manipulator (PSM) follow the actual motion of the ROI in order to move along with it in a synchronized way. In principle, both the position and orientation of the arms must be regulated to track the ROI movement. Since the arms are teleoperated by the surgeon, when he/she holds the master controls still, the slave arms must move in sync with the ROI and appear still in the stabilized image. Thus, this mechanical synchronization is a nontrivial task. The final problem is the actual translation of the surgeon's movements at the master console into movements of the PSM. Since the robotic arms are mechanically synchronized with the ROI motion, they are controlled by the robot and the surgeon at the same time, that is, they share the control. This shared control binds the mechanical synchronization and image stabilization algorithms together, by allowing the surgeon to operate on the still image while concurrently compensating the cardiac movement. However, this binding is complex, since the physical space (i.e., the Cartesian space where the PSM moves) and the stabilized image space are nonlinearly related by a perspective plus a warping transformation. In the following, we present a theoretical unifying framework under which the mechanical synchronization, the image stabilization, and the shared control combine seamlessly. This framework allows the development of more advanced control techniques, such as haptic and active assistance, which augment the surgeon's performance. These applications will also be described, along with experimental results from surgeons.

21.2 Background

In general, existing research falls into two categories: mechanical synchronization methods and image stabilization algorithms. The first attempt to develop a motion compensation scheme for beating heart surgery was presented by Nakamura et al. [4], who introduced the notion of heartbeat synchronization. The authors employed a 4 degrees-of-freedom (DoF) robot along with a high-speed camera at 995 fps in order to track a laser point projected on a vibrating piece of paper. The camera image was shifted in the image buffer in order to keep the reference point at the same pixel position, while the robot moved in sync. This visual synchronization is a simpler form of image stabilization, employing only translations. Similar experiments have been performed using the da Vinci Research Kit with bimanual control [7]. A heart mockup, consisting of a piece of dense foam coated in colored latex, was moved purely translationally using a parallel platform (Novint Falcon). Four markers were mounted onto it and tracked by a 3D position sensor. A camera was also mounted onto a separate manipulator, moving in sync and thus providing visual synchronization. Suturing tests by surgeons yielded encouraging results about the usefulness of the technique. However, the nonrealistic heart model, which presents no deformation, along with the simplified motion of the mockup, which has small movement variance in the x,y axes, leave room for further enhancement and improvement.

In Ref. [8] the authors performed in vivo tests on a porcine heart, placing four markers on the heart surface in order to capture the motion with a 500 fps camera. Model predictive control (MPC) algorithms were also used, employing a heartbeat model for reference. MPC is further developed in Refs. [9,10]. Motion prediction is also investigated in Ref. [11], using a least-squares approach and an artificial neural network. An interesting feature of this work is the ability to predict the motion of visually occluded parts of the heart, and the fusion of biological signals (electrocardiographic (ECG) and respiration) into the estimation algorithms. The algorithms, however, were not tested on a real robot. Biological signals in MPC were also utilized in Ref. [12]. A sonomicrometry system was used to collect motion data from the heart, bypassing the problem of visual occlusion of the surgical field by the robotic manipulators or other surgical tools. Three-dimensional ultrasound (3DUS)-guided motion compensation for beating heart mitral repair is presented in Ref. [13]. A 1-DoF linear guide was controlled for an anchoring task using feedback from a 3DUS system. Due to the latency of the process, predictive filters were also employed. A US-guided cardiac catheter utilizing motion compensation has also been presented in Ref. [14]. A different approach is presented in Ref. [15], using feedback from a force sensor mounted on a robot. Assuming that the motion is periodic, the authors used an iterative learning controller with a low-pass filter to cancel the motion. In vitro results showed the potential of this approach; however, the assumption of periodicity is an oversimplification of the actual motion of the heart. A more robust approach is presented in Ref. [16], using ECG and respiratory signals in order to model and estimate the full 3D motion of the heart surface.

Work on the imaging part of motion compensation involves tracking of features on the heart [17,18] and image stabilization. For the latter, interesting work is presented in Ref. [19], where the image was rectified using the CUDA framework. A different method is presented in Ref. [20], where the heart surface was reconstructed in 3D to produce a virtual model. A virtual camera was then set to track a point on the virtual surface, effectively compensating that specific point.

21.3 Image stabilization

Consider the setup shown in Fig. 21.1, showing the PSM and the master console. The slave robot manipulates the operating field (e.g., the cardiac wall) while an endoscope provides a view of the action. The unaltered image from this camera is called the physical image. In motion compensation, the physical image is stabilized and projected to the master console. The stabilized image is called the canonical image. The surgeon perceives the surgical field in the canonical image as still, and produces the human input which is fed to the shared controller manipulating the slave robot. We define four spaces describing objects in their respective domains, namely: the physical workspace Wp where the PSM operates; the physical image space Ip; the canonical image space Ic; and the canonical workspace Wc, denoting the Cartesian space of the master controls. Using the pinhole camera model, we can relate the physical image space Ip to the physical workspace Wp through the standard projective transformation and the camera matrix, that is, P : Wp → Ip, P ∈ GPL(Wp). Following this, the canonical and physical images are related by a warping transformation Ψ which cancels the apparent motion of the operating field, namely, Ψ : Ip → Ic. This warping transformation is responsible for stabilizing the image of the beating heart, presenting it as "virtually still" to the surgeon. From the commutation diagram in Fig. 21.2, we see that if we can find a bijective map Φ : Mc → Mp, which maps a static reference into the surgical field, then the image warping transformation Ψ can be computed as

Ψ = P ∘ Φ⁻¹ ∘ P⁻¹        (21.1)

FIGURE 21.1 Overview of the motion compensation concept. The surgeon views a stabilized image of the pulsating heart, while concurrently the robot is synchronized to the heartbeat. The surgeon and the system share control of the patient side manipulators.
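If the operating field is treated as locally planar, the restriction of P to that plane is a 3 x 3 homography H, and the warp Ψ = P ∘ Φ⁻¹ ∘ P⁻¹ becomes a product of 3 x 3 matrices acting on homogeneous pixel coordinates. A minimal numerical sketch (the planar assumption, the example matrices, and the function names are ours, not the chapter's):

```python
import numpy as np

def make_warp(H, A):
    """Compose the stabilizing warp Psi = P o Phi^-1 o P^-1 in pixel space.
    H: 3x3 plane-to-image homography (the restriction of P to the plane).
    A: 3x3 homogeneous matrix representing Phi on that plane."""
    return H @ np.linalg.inv(A) @ np.linalg.inv(H)

def apply_warp(Psi, px):
    """Map one physical-image pixel (u, v) to the canonical image."""
    q = Psi @ np.array([px[0], px[1], 1.0])
    return q[:2] / q[2]   # normalize homogeneous coordinates
```

For the full SWAM-based warp, A would be the homogeneous matrix of the affine map active in the pixel's strip; when Φ is the identity, Ψ reduces to the identity and the canonical image equals the physical one.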


FIGURE 21.2 Definition of the four fundamental spaces for motion compensation.

FIGURE 21.3 Illustration of the effect of the strip-wise affine map.

To find Φ, we must formally define what its actual function will be. To that end, for motion compensation the system must track the motion of a reference manifold Mp(ui, t) ⊂ Wp in the surgical field, parameterized by coordinates ui ∈ ℝ. For example, Mp could be a point on the heart surface (0-manifold), a line (1-manifold), or a patch (2-manifold). In this work we will present the tracking of 1-manifolds on the heart surface, that is, a line. This is implemented using the strip-wise affine map [21], which is ultimately identified with the map Φ.

21.4 Strip-wise affine map

The strip-wise affine map (SWAM) is a piecewise linear map between the physical and canonical workspaces. It decomposes the xy plane into strips and then applies an affine map on each one. It takes a polygonal line from Wp and maps it to the x-axis in the canonical space Wc. Now, let {wi}, i = 1, ..., n, wi = (p_xi, p_yi, 0) be the vertices of the polygonal line, lying on the physical plane z = 0 (Fig. 21.3). Each vertex is projected to a point ai on the real axis in the canonical world according to its normalized length,

ai = Σ_{k=1}^{i} S_k / S,   i = 1, ..., n        (21.2)

where S_k = |w_k − w_{k−1}|, S = Σ_{k=1}^{n} S_k, and S_1 = 0. It holds that a1 = 0 and an = 1. Furthermore, let qp = (xp, yp, zp) be a point in Wp and qc = (xc, yc, zc) a point in Wc. Then the SWAM sends qc to qp using

qp = (xp, yp, zp) = (yc S cos θs + fx(xc), yc S sin θs + fy(xc), zc) = Φ(qc)        (21.3)

where θs is some angle, called the shifting angle. The functions fx and fy are given by,

fx(xc) = Σ_{k=0}^{n} [p_xk + S (xc − ak) cos θk] ψk
fy(xc) = Σ_{k=0}^{n} [p_yk + S (xc − ak) sin θk] ψk        (21.4)

The angles θk are the angles of each edge [wk wk+1] with respect to the physical x-axis, while ψk is an index function on Wc such that

ψk(xc) = 1 if xc ∈ [ak, ak+1), and 0 elsewhere        (21.5)

Observe that Eq. (21.4) is a piecewise linear parameterization of the reference manifold Mp in the sense that Mp = (fx, fy, 0). The reference line is a 1-manifold and thus xc is the coordinate parameterizing it. When the robot moves on a vertical plane parallel to the x-axis in the canonical space, its image in the physical space moves parallel to the reference manifold, at a standard offset g = (yc S cos θs, yc S sin θs, zc) from the point (fx(xc), fy(xc), 0) which lies on Mp. It follows that we can define the goal point as

qG(xc) = [fx(xc)  fy(xc)  0]^T        (21.6)

and Eq. (21.3) can take the compact form

qp = g(yc, zc) + qG(xc)        (21.7)

The inverse map Φ⁻¹ is given by

component-wise, for the active strip k, that is, the strip for which ψk(xc) = 1,

xc = ak + (S/J)(xp sin θs − yp cos θs) − C/J
yc = (S/J)(yp cos θk − xp sin θk) + D/J        (21.8)
zc = zp

where

C = S Σ_{k=0}^{n} (p_xk sin θs − p_yk cos θs) ψk
D = S Σ_{k=0}^{n} (p_xk sin θk − p_yk cos θk) ψk        (21.9)
J = S² Σ_{k=0}^{n} sin(θs − θk) ψk

Using the SWAM, the control of the robot is transferred to the canonical world, where the objective for the surgeon is to track the x-axis.
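The forward and inverse SWAM maps can be sketched numerically as follows. This is a simplified illustration under our own naming: the inverse locates the active strip by testing each candidate xc against [ak, ak+1], and degenerate configurations where the shifting angle coincides with an edge angle are not handled.

```python
import numpy as np

def _polyline_params(vertices):
    """Arc-length parameters of the reference polyline (Eq. 21.2)."""
    w = np.asarray(vertices, float)
    seg = np.diff(w, axis=0)                          # edge vectors w_{k+1} - w_k
    Sk = np.linalg.norm(seg, axis=1)                  # edge lengths S_k
    S = Sk.sum()                                      # total length S
    a = np.concatenate(([0.0], np.cumsum(Sk) / S))    # normalized lengths a_i
    theta = np.arctan2(seg[:, 1], seg[:, 0])          # edge angles theta_k
    return w, S, a, theta

def swam_forward(vertices, qc, theta_s):
    """Phi: canonical qc = (xc, yc, zc) -> physical point (Eqs. 21.3-21.4)."""
    w, S, a, theta = _polyline_params(vertices)
    xc, yc, zc = qc
    k = min(np.searchsorted(a, xc, side='right') - 1, len(theta) - 1)  # psi_k = 1
    fx = w[k, 0] + S * (xc - a[k]) * np.cos(theta[k])
    fy = w[k, 1] + S * (xc - a[k]) * np.sin(theta[k])
    return np.array([yc * S * np.cos(theta_s) + fx,
                     yc * S * np.sin(theta_s) + fy,
                     zc])

def swam_inverse(vertices, qp, theta_s):
    """Phi^-1 (Eq. 21.8): search for the strip whose candidate xc is admissible."""
    w, S, a, theta = _polyline_params(vertices)
    xp, yp, zp = qp
    for k in range(len(theta)):
        den = S * np.sin(theta_s - theta[k])          # J/S restricted to strip k
        xc = a[k] + ((xp - w[k, 0]) * np.sin(theta_s)
                     - (yp - w[k, 1]) * np.cos(theta_s)) / den
        if a[k] - 1e-9 <= xc <= a[k + 1] + 1e-9:
            yc = (-(xp - w[k, 0]) * np.sin(theta[k])
                  + (yp - w[k, 1]) * np.cos(theta[k])) / den
            return np.array([xc, yc, zp])
    raise ValueError("point is not in the image of any strip")
```

A round trip swam_inverse(swam_forward(qc)) recovers qc, which is a convenient consistency check between Eqs. (21.3) and (21.8).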

21.5 Shared control

Shared control refers to the simultaneous actuation of the robotic manipulators by the human surgeon and the robotic system, that is, they share the control. This sharing can be something as simple as controlling different degrees of freedom, or something more intricate such as the nonlinear combination of the computer and surgeon commands. Within this framework, a first level of shared control is performed by the map Φ. When the user inputs commands in the canonical space, Φ maps them in the physical space so as to move in sync with the surgical field motion. However, on top of this simple motion compensation, more elaborate assistive schemes can be built. To consider this, let p qm be the position of the master console in the physical space, also here called the master workspace Wm. This is mapped to c qm in the canonical space via, for example, scaling, tremor reduction, etc. The position of the PSM in the physical space is p qs . The PSM tracks a reference position in Wp denoted as p qr , which is mapped to the canonical space to point c qr . Depending on the relation of these points, we discern the following cases.



21.5.1 Simple motion compensation

In the simple motion compensation case [22], the canonical master point c qm is identical to the canonical reference c qr, that is, c qm = c qr (Fig. 21.4). This resembles a direct teleoperation scheme, albeit intertwined with motion compensation. On the PSM side, the slave manipulator tracks the physical reference p qr = Φ(c qr) using a controller, for example, a proportional-integral-derivative (PID) controller. Thus the slave position p qs follows p qr.
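As one concrete possibility, the slave-side tracking controller mentioned above could be a per-axis PID acting on the position error; the class name, gains, and structure below are illustrative, not the chapter's tuned implementation:

```python
import numpy as np

class PIDTracker:
    """Per-axis PID force controller: drives the slave position p_qs toward
    the physical reference p_qr = Phi(c_qr). Gains are illustrative."""
    def __init__(self, kp=1.0, ki=0.0, kd=0.0, dt=0.001):   # dt matches a 1 kHz servo loop
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = np.zeros(3)
        self.prev_err = np.zeros(3)

    def force(self, q_ref, q_slave):
        """Force command for one servo tick."""
        err = np.asarray(q_ref, float) - np.asarray(q_slave, float)
        self.integral += err * self.dt                # accumulate integral term
        deriv = (err - self.prev_err) / self.dt       # finite-difference derivative
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

In the motion compensation loop, q_ref would be refreshed every tick from the current Φ(c qr), so the slave exerts force toward the moving physical reference even when the master is held still.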

21.5.2 Active assistance

In the active assistance case [23], the computer actively controls the direction of motion of the PSM, while the surgeon operates some other DoF. In our particular case, for the tracking of the reference line in the physical space, the goal is to track the stabilized straight line in the canonical space. To this end, the canonical master point c qm and the canonical reference c qr are dissociated, that is, c qm ≠ c qr (Fig. 21.5). Given the canonical master point c qm, let c q0 be its projection on the xc axis. The computer then controls the canonical reference c qr only on the yc axis, moving it either toward c qm or toward c q0. When the canonical master enters the attraction zone, of width ε about the xc axis, the reference is attracted toward the projection point c q0. When the master leaves the zone, the reference moves toward it, resulting in pure teleoperation. In effect, this behavior "snaps" the slave manipulator to the reference line when it is very close (inside the attraction zone). When this happens, the surgeon controls the lateral movement of the reference while it remains snapped onto the line. When the canonical master exits the assistive zone, motion reverts to normal teleoperation. As is apparent, in this assistive mode the computer actively controls the slave manipulator. In that sense, the slave position does not necessarily follow the master, especially when transitioning between the two attracting points.
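One step of this reference-update law might be sketched as follows; the first-order attraction dynamics and the gain are our assumptions, since the chapter only specifies toward which point the reference moves:

```python
def update_canonical_reference(c_qm, c_qr, eps, gain, dt):
    """Active assistance step: the reference c_qr follows the master c_qm in
    xc and zc, while its yc coordinate is steered either toward the master
    (outside the zone) or toward the master's projection on the xc axis,
    c_q0 = (xm, 0, zm) (inside the attraction zone |ym| < eps).
    'gain' sets an illustrative first-order attraction rate."""
    xm, ym, zm = c_qm
    yr = c_qr[1]
    y_target = 0.0 if abs(ym) < eps else ym   # snap target inside the zone
    yr += gain * dt * (y_target - yr)         # move the reference toward the target
    return (xm, yr, zm)
```

Inside the zone the reference decays onto the xc axis (the slave snaps onto the physical line); outside, it converges to the master and the scheme degenerates to simple compensation.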

21.5.3 Haptic assistance

In haptic assistance, the computer exerts assistive forces on the master console in order to guide the surgeon in the proper direction. This is juxtaposed with the active assistance case, where the force is applied on the slave side.

FIGURE 21.4 Illustration of the simple motion compensation scheme. The input from the master console is directly translated into the physical space in the operating field, intertwined with motion compensation.

FIGURE 21.5 Illustration of the active assistance scheme. The canonical reference point c qr is attracted toward the canonical master point c qm or its projection on the xc axis c q0 , depending on whether it is inside the assistance zone.


FIGURE 21.6 Illustration of the haptic assistance scheme. The canonical reference point c qr is identified with the canonical master point c qm, which is attracted toward its projection c q0 on the xc axis, depending on whether it is inside the assistance zone.

The canonical master point c qm is again identified with the canonical reference c qr, that is, c qm = c qr (Fig. 21.6). If c q0 is the projection on the xc axis, then a force is exerted on the master based on its distance to the projected point. This defines a guidance virtual fixture (VF)/active constraint [24] around the xc axis. The fixture is defined in a cylinder of external radius ε, centered on the xc axis. Two more internal cylinders are also defined: one with radius β < ε and a second with radius δ < β (Fig. 21.7, left). These regions define a force profile that rises linearly from zero at ε to its maximum at β, remains constant between β and δ, and falls back linearly to zero at the axis (Fig. 21.7, right). The force vector lies on the zc-yc plane, pointing toward the xc axis. The guidance fixture has been contained within the cylinder in order to prevent an unwanted drag effect over the entire surgical field, which would counter the surgeon's movements in other areas. Looking closer, we see that in the inner cylinder (|r| < δ) the force is linearly reduced to zero. This was implemented so as to allow the surgeon to perform the surgical task when on the reference path. Otherwise, for example if a force were present at r = 0, the normal positional errors of the surgeon, due to tremor or the inherent limited accuracy of humans, would result in switching forces being applied as the tip "wiggles" about the axis. This would be an undesirable effect, since the fixture would be too stiff, not allowing the surgeon to move in the force direction. Thus a virtual spring was inserted, in order to present haptic cues to the user without confining him/her to the inner tube.
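The force-magnitude profile of Fig. 21.7 can be written as a piecewise linear function of the distance r to the xc axis. The sketch below uses the parameter values of the experimental trials (ε = 40 mm, β = 30 mm, δ = 5 mm, Fmax = 1 N); the function name is our own:

```python
def vf_force(r, eps=0.040, beta=0.030, delta=0.005, fmax=1.0):
    """Magnitude (N) of the guidance virtual-fixture force versus the
    distance r (m) from the master tip to the xc axis. The force vector
    itself lies in the zc-yc plane, pointing toward the axis."""
    if r >= eps:
        return 0.0                                  # outside the fixture: no drag
    if r > beta:
        return fmax * (eps - r) / (eps - beta)      # linear ramp-up entering the cylinder
    if r >= delta:
        return fmax                                 # constant plateau
    return fmax * r / delta                         # spring-like decay to zero on the axis
```

The inner spring region (r < δ) is what prevents the switching-force "wiggle" discussed above: near the axis the cue fades out instead of flipping direction.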

21.6 Experimental setup

21.6.1 Robotic system description

The robotic teleoperation master-slave system consists of the PHANToM Omni (now called the Touch haptic device from 3D Systems) on the master side, and the PHANToM Desktop (now called the Touch X) on the slave side. These two robotic devices have an identical mechanical structure (six-DoF positional sensing, anthropomorphic serial-link manipulators), with the first three joints actuated by DC motors. The tool center point (TCP) is located at the intersection of the last three joints, which form a spherical wrist, that is, a gimbal. Since we are only compensating


FIGURE 21.7 (Left) Depiction of the 3D virtual fixture around the canonical axis. The fixture is of cylindrical shape, with two more internal regions regulating the force. (Right) The force profile in relation to the distance from the axis.


FIGURE 21.8 (Left) Depiction of the simulated master robot, operated by the surgeon in the canonical space. (Right) Depiction of the simulated slave robot, operating in the physical space. Notice the needle attached to the gimbal, simulating a lancet. The checkerboards are used for visual calibration and registration.

FIGURE 21.9 Overview of the experimental setup.

for the translational movements of the surgical field, utilizing only the first three joints, the gimbal was rigidly locked to the second link of each robot (Fig. 21.8). The master robot is a haptic display device, able to render forces at the TCP and affect the surgeon's movements per the assistive modes. The kinematic equations of the robots were derived using their known Denavit-Hartenberg parameters [22]. The forward kinematics produces the Cartesian position of the TCP with respect to the base frame. In order to simulate a lancet on the slave robot, a metallic needle was attached to the gimbal stylus, matching a similar one provided by default on the master side. The kinematics was extended to include the transformation from the base frame to the tip, using a calibration algorithm employing a checkerboard as a reference. The checkerboard's vertices were sampled with each robot's tip, and the problem was reduced to a quadratic minimization problem. The checkerboard also defined the physical world frame Wp. To express the tip's coordinates in Wp, the robot base frame was registered to the world frame using the method of Ref. [25]. A similar procedure was used for the master robot to register its base frame to a checkerboard defining the canonical world frame Wc. The surgical field is a video of a beating heart, projected on a semitransparent screen lying in front of the slave robot; underneath, a projector shows the reference line on the screen. The surgeon views this field through a camera mounted on a pole, aimed toward the projection screen. The camera was calibrated and registered to Wp using the Camera Calibration Toolbox in MATLAB. The general overview of the system is shown in Fig. 21.9. The master and slave robots are connected to two different computers (the master and slave controllers, respectively), while the camera feeds a video stream to a third computer (the user console).
The computers communicate over a Gigabit Ethernet connection using the User Datagram Protocol (UDP). This configuration was chosen in order to reduce computational overhead and latency, since the implemented algorithms need to run in real time.
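The base-to-world registration from sampled point pairs can be illustrated with a generic SVD-based rigid registration (Kabsch-type) sketch; this is a standard formulation and not necessarily the exact method of Ref. [25]:

```python
import numpy as np

def register_frames(P, Q):
    """Rigid registration: find R, t minimizing ||R p_i + t - q_i|| over
    corresponding point sets, e.g. checkerboard vertices sampled with the
    robot tip (P, base frame) and their known world coordinates (Q).
    P, Q: (n, 3) arrays of corresponding points."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)        # centroids
    H = (P - cp).T @ (Q - cq)                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
    t = cq - R @ cp
    return R, t
```

With at least three non-collinear correspondences the rotation is unique; sampling more checkerboard vertices averages out the tip-positioning noise in a least-squares sense.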

21.6.2 Graphics system description

The graphics system is responsible for the image acquisition and rectification on the master console. It comprises a camera and a dedicated PC that performs the image processing. The camera is a Microsoft LifeCam VX-800 (USB 2.0). The PC uses the camera to acquire the image, processes it, and presents it to the surgeon as a stabilized image. Image rectification is implemented in MATLAB and OpenCV, achieving a refresh rate of approximately 30 Hz using three parallel processing threads. To further reduce the computational load, as well as the latency of the system, the image resolution was reduced to 320 x 240 pixels and the image converted to grayscale, after the red reference line is detected using a color filter. The graphics processing loop essentially applies the Ψ transform to the physical image. The slave robot is controlled by the servo loop, which combines input from the graphics system (namely the detected reference line) as well as from the master and slave robots. It controls the slave robot by querying its configuration and using the PID controller for teleoperation, setting the forces on the first three joints. The loop runs at a 1-kHz update rate, meaning that the forces are updated every 1 ms. The position update rate, however, follows the update rate of the communication loop, which is implemented over UDP sockets at a 100-Hz update rate. Since the UDP protocol does not employ an error-correcting mechanism for the data packets, data transmission is fast and relatively immune to latency, but unreliable. To compensate for this, a simple error correction algorithm was implemented, using a predefined header with each data packet.
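Such header-based framing of the 100-Hz position packets might look as follows; the magic word, sequence field, and payload layout are our assumptions, since the chapter does not specify the packet format:

```python
import struct

MAGIC = 0xBEA7            # hypothetical header word marking a valid packet
FMT = '<HIddd'            # header, sequence number, x/y/z position (doubles)

def pack_packet(seq, pos):
    """Frame one position sample for UDP transport."""
    return struct.pack(FMT, MAGIC, seq, *pos)

def parse_packet(data):
    """Validate size and header; return (seq, (x, y, z)), or None so the
    servo loop can simply drop a corrupted or truncated datagram."""
    if len(data) != struct.calcsize(FMT):
        return None
    magic, seq, x, y, z = struct.unpack(FMT, data)
    if magic != MAGIC:
        return None
    return seq, (x, y, z)
```

Dropping a bad 100-Hz sample and keeping the previous reference is cheap and consistent with the latency-first design choice of UDP over TCP here.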

21.7 Simulation experiments

FIGURE 21.10 Depiction of the simulated surgical field. The projected image of the heart is pulsating along with the reference red line, to convey the impression of a beating heart.


This section describes the experiments conducted, along with an analysis of the results, for the two assistive modes, namely active and haptic assistance. The two sessions were performed independently, at different times, and with a different user each. A trained surgeon was asked each time to operate the master console in order to track a pulsating red line embedded into an endoscopic image of the heart. The entire image was deformed according to the reference line, to give the perception of a beating heart. The line followed a periodic movement, driven by a prerecorded ECG signal set to various heart rates. Field trials showed that a rate of 18 bpm was the maximum that allowed for a sufficiently small delay in the image acquisition and control loop, mainly due to the hardware limitations that affect the graphics system update rate in the current experimental setup. To capture the robot motion, a green marker was attached to the tip of the simulated lancet. All trials were recorded at 1080p resolution by an HD camera overlooking the scene. Prior to the experiments, both the surgeon's camera and the HD camera were calibrated and registered to a common physical world frame (Fig. 21.10). For the active assistance case, two groups of experiments were performed, the first corresponding to a 12 bpm pulsation frequency and the second to 15 bpm. The first group comprised four cases: "without compensation," "simple compensation," "active compensation w/o dead zone (ε = 15 mm, δ = 0)," and "active compensation with dead zone (ε = 15 mm, δ = 7.5 mm)." The second group included two cases: "active compensation with dead zone" and "no compensation." The physical and canonical images from the actual experiments are seen in Fig. 21.11. In the "no compensation" case, the surgeon was shown the left (physical) image, while in the compensated ones the right (canonical) image was shown.


FIGURE 21.11 View of the simulated surgical field through the camera. (Left) Physical image. (Right) Canonical image. Note the two white squares on the distal ends of the line. The surgeon tracks the line, doing touch-and-go between these two squares in a “ping-pong” like fashion.

TABLE 21.1 Statistical results for the first group in the active assistance session.

|                    | Assistance with dead zone | Assistance w/o dead zone | Simple compensation | No compensation |
| Mean error (mm)    | 4.487                     | 5.131                    | 6.031               | 7.171           |
| Rel. difference(a) | -12.57%                   | -14.91%                  | -15.90%             |                 |

(a) Relative difference between consecutive columns.

TABLE 21.2 Aggregate results for the two groups in the active assistance session.

|            | 12 bpm                    |                 | 15 bpm                    |                 |
|            | Assistance with dead zone | No compensation | Assistance with dead zone | No compensation |
| Mean (mm)  | 4.487                     | 7.171           | 4.400                     | 8.022           |
| Rel. diff. | -37.43%                   |                 | -45.15%                   |                 |

The surgeon was asked to follow the red line, touching in turn the white squares on the distal ends of the line. The movement of the slave tip was calculated in postprocessing from the HD camera video. A simple color filter was used in each frame to track the tip's green marker and the closest segment of the red line. Following this, the pixel trajectory was filtered to reduce noise and backprojected to the Cartesian space to perform the analysis. To assess the effect of the active compensation, the error distance from the tip to the line was calculated for each frame and a statistical analysis was performed. The results are presented in Table 21.1. From Table 21.1, we see that active assistance with dead zone reduces the mean error by 12.57% with respect to active assistance without dead zone, which in turn reduces the error by 14.91% compared to simple compensation, which finally reduces the error by 15.90% compared to no compensation. The effect of the heart rate on the active assistance case is investigated in Table 21.2, showing aggregate data for the two groups. Table 21.2 shows a consistent decrease of the mean error across the two rates. It is worth noting that in the active assistance case the error remains virtually the same, implying that the assistive controller is robust with respect to the pulsation frequency and is approaching its lower threshold; that is, it effectively cancels the effects of motion irrespective of the frequency, given the current hardware implementation. The latency of this setup can also be identified as the cause of the residual error (~4.5 mm). In the haptic assistance session, the same tracking task was performed by a trained surgeon. The session consisted of three groups (no compensation, simple compensation, compensation with VF). Each group, in turn, comprised three frequency cases according to heart rate (12, 15, 18 bpm). For each frequency, four experimental trials were performed.
Consequently, the total number of trials was 36. The trial selection mechanism was fully randomized using a discrete uniform distribution.


TABLE 21.3 Statistical results for each group per heart rate in the haptic assistance session.

| Group                | Mean (mm), 12 bpm | Mean (mm), 15 bpm | Mean (mm), 18 bpm |
| No compensation      | 5.743             | 5.987             | 5.496             |
| Simple compensation  | 4.042             | 4.544             | 5.020             |
| Compensation with VF | 3.700             | 3.850             | 4.302             |

VF, Virtual fixture.

TABLE 21.4 Aggregate results for the three groups in the haptic assistance session.

|               | No compensation | Simple compensation | Compensation with VF |
| Mean (mm)     | 5.742           | 4.535               | 3.951                |
| Rel. diff.(a) |                 | -21.01%             | -12.88%              |
| Std.          | 0.733           | 0.623               | 0.483                |
| Rel. diff.(b) |                 | -15.00%             | -22.50%              |

VF, Virtual fixture.
(a) Relative difference between consecutive means.
(b) Relative difference between consecutive standard deviations.

Each trial was removed from the pool in subsequent runs. The parameters of the VF were set to ε = 40 mm, β = 30 mm, and δ = 5 mm, while the maximum force was Fmax = 1 N. Again, the error distance from the tip to the line was calculated for each frame and a statistical analysis was performed. A summary of the statistical results of the groups, according to heart rate, is presented in Table 21.3. Table 21.3 shows an increase in the average error across heart rates for the two compensation groups. This can be attributed to the specific hardware setup and algorithmic implementation, which introduces delays in the processing loop. It was experimentally confirmed that frequencies beyond 20 bpm could not be processed in real time by our hardware, injecting significant latency into the control loop. Despite this, the haptic assistance group presents a systematic decrease of the mean error across all three heart rates, compared to the two other groups. The aggregate results of the three groups are presented in Table 21.4. We see a decrease in the average error for the haptic assistance group. Quantitatively, VFs decrease the average error by 12.88% compared to the simple compensation group and by 31.19% compared to no compensation. Furthermore, the standard deviation is decreased by 22.5% with respect to the simple compensation group, and by 34.1% with respect to no compensation. Interpreting these results, one can say that the virtual fixture allows the surgeon to track the reference line with better accuracy and smaller perturbations, in a consistent manner. Note that by comparing the 15 bpm group across the two sessions (Tables 21.2 and 21.3), we see that the haptic assistance mode enables better tracking than the active assistance with dead zone. These results show the promise of this approach and seem to support the hypothesis that haptic assistance presents advantages for the surgeon in robotic surgery.

21.8 Conclusion

In this chapter we have presented a unifying framework for image-guided motion compensation in robotic-assisted beating heart surgery. This framework naturally binds together the image stabilization, mechanical synchronization, and shared control tasks required to create a robust motion compensation service that assists the surgeon in off-pump CABG surgery. More complex assistive modes, such as active and haptic assistance, were also presented. These modes are built upon the motion compensation framework, and experimental results show that they have a positive effect on the surgeon's accuracy when tracking features on the beating cardiac wall. Although this technology is still in its early stages, once it matures it is expected to have a significant impact and to change the way robotic CABG is performed.


References

[1] Oliveira PP, Braile DM, Vieira RW, Petrucci Junior O, Silveira Filho LM, Vilarinho KA, et al. Hemodynamic disorders related to beating heart surgery using cardiac stabilizers: experimental study. Rev Bras Cir Cardiovasc 2007;22(4):407-15.
[2] Couture P, Denault A, Limoges P, Sheridan P, Babin D, Cartier R. Mechanisms of hemodynamic changes during off-pump coronary artery bypass surgery. Can J Anaesth 2002;49(8):835-49.
[3] Raut MS, Maheshwari A, Dubey S. Sudden hemodynamic instability during off-pump coronary artery bypass grafting surgery: role of Bezold-Jarisch reflex. J Cardiothorac Vasc Anesth 2017;31(6):2139-40.
[4] Nakamura Y, Kishi K, Kawakami H. Heartbeat synchronization for robotic cardiac surgery. In: Proceedings 2001 ICRA. IEEE international conference on robotics and automation (Cat. No. 01CH37164), vol. 2; 2001. p. 2014-9.
[5] Hu M, Penney GP, Rueckert D, Edwards PJ, Bello F, Casula R, et al. Non-rigid reconstruction of the beating heart surface for minimally invasive cardiac surgery. In: Medical image computing and computer-assisted intervention - MICCAI 2009. Berlin, Heidelberg: Springer; 2009. p. 34-42. (Lecture Notes in Computer Science).
[6] Mohamadipanah H, Andalibi M, Hoberock L. Robust automatic feature tracking on beating human hearts for minimally invasive CABG surgery. J Med Dev 2016;10(4):041010.
[7] Ruszkowski A, Schneider C, Mohareri O, Salcudean S. Bimanual teleoperation with heart motion compensation on the da Vinci® Research Kit: implementation and preliminary experiments. In: 2016 IEEE international conference on robotics and automation (ICRA); 2016. p. 4101-8.
[8] Ginhoux R, Gangloff JA, de Mathelin MF, Soler L, Sanchez MMA, Marescaux J. Beating heart tracking in robotic surgery using 500 Hz visual servoing, model predictive control and an adaptive observer. In: Proceedings ICRA '04. 2004 IEEE international conference on robotics and automation, vol. 1; 2004. p. 274-9.
[9] Gangloff J, Ginhoux R, de Mathelin M, Soler L, Marescaux J. Model predictive control for compensation of cyclic organ motions in teleoperated laparoscopic surgery. IEEE Trans Control Syst Technol 2006;14(2):235-46.
[10] Ginhoux R, Gangloff J, de Mathelin M, Soler L, Sanchez MMA, Marescaux J. Active filtering of physiological motion in robotized surgery using predictive control. IEEE Trans Robot 2005;21(1):67-79.
[11] Ortmaier T, Groger M, Boehm DH, Falk V, Hirzinger G. Motion estimation in beating heart surgery. IEEE Trans Biomed Eng 2005;52(10):1729-40.
[12] Bebek O, Cavusoglu MC. Predictive control algorithms using biological signals for active relative motion canceling in robotic assisted heart surgery. In: 2006 IEEE international conference on robotics and automation (ICRA); 2006. p. 237-44.
[13] Yuen S, Kesner S, Vasilyev N, Del Nido P, Howe R. 3D ultrasound-guided motion compensation system for beating heart mitral valve repair. In: Medical image computing and computer-assisted intervention - MICCAI 2008. Berlin, Heidelberg: Springer; 2008. p. 711-9.
[14] Kesner SB, Howe RD. Design and control of motion compensation cardiac catheters. In: 2010 IEEE international conference on robotics and automation (ICRA); 2010. p. 1059-65.
[15] Cagneau B, Zemiti N, Bellot D, Morel G. Physiological motion compensation in robotized surgery using force feedback control. In: 2007 IEEE international conference on robotics and automation; 2007. p. 1881-6.
[16] Duindam V, Sastry S. Geometric motion estimation and control for robotic-assisted beating-heart surgery. In: 2007 IEEE/RSJ international conference on intelligent robots and systems (IROS); 2007. p. 871-6.
[17] Mountney P, Yang G-Z. Soft tissue tracking for minimally invasive surgery: learning local deformation online. In: Metaxas D, Axel L, Fichtinger G, Székely G, editors. Medical image computing and computer-assisted intervention - MICCAI 2008. Berlin, Heidelberg: Springer; 2008. p. 364-72. (Lecture Notes in Computer Science; vol. 5242).
[18] Stoyanov D, Mylonas GP, Deligianni F, Darzi A, Yang GZ. Soft-tissue motion tracking and structure estimation for robotic assisted MIS procedures. In: Duncan J, Gerig G, editors. Medical image computing and computer-assisted intervention - MICCAI 2005, vol. 3750. Berlin, Heidelberg: Springer; 2005. p. 139-46. (Lecture Notes in Computer Science).
[19] Richa R, Bó APL, Poignet P. Towards robust 3D visual tracking for motion compensation in beating heart surgery. Med Image Anal 2011;15(3):302-15.
[20] Stoyanov D, Yang G-Z. Stabilization of image motion for robotic assisted beating heart surgery. In: Ayache N, Ourselin S, Maeder A, editors. Medical image computing and computer-assisted intervention - MICCAI 2007. Berlin, Heidelberg: Springer; 2007. p. 417-24. (Lecture Notes in Computer Science; vol. 4791).
[21] Moustris G, Tzafestas SG. Reducing a class of polygonal path tracking to straight line tracking via nonlinear strip-wise affine transformation. Math Comput Simul 2008;79(2):133-48.
[22] Moustris GP, Mantelos AI, Tzafestas CS. Shared control for motion compensation in robotic beating heart surgery. In: 2013 IEEE international conference on robotics and automation (ICRA). Karlsruhe, Germany: IEEE; 2013. p. 5819-24.
[23] Moustris GP, Mantelos AI, Tzafestas C. Active motion compensation in robotic cardiac surgery. In: Proceedings of the European Control Conference 2013. Zurich, Switzerland; 2013.
[24] Bowyer SA, Davies BL, Baena FRy. Active constraints/virtual fixtures: a survey. IEEE Trans Robot 2014;30(1):138-57.
[25] Horn BKP. Closed-form solution of absolute orientation using unit quaternions. J Opt Soc Am A 1987;4(4):629-42.