Elastic registration of prostate MR images based on estimation of deformation states


Medical Image Analysis 21 (2015) 87–103


Bahram Marami a,b,*, Shahin Sirouspour c, Suha Ghoul a,d, Jeremy Cepek a,b, Sean R.H. Davidson e, David W. Capson f, John Trachtenberg g, Aaron Fenster a,b,h

a Imaging Research Laboratories, Robarts Research Institute, London, Ontario, Canada
b Biomedical Engineering Graduate Program, The University of Western Ontario, London, Ontario, Canada
c Department of Electrical and Computer Engineering, McMaster University, Hamilton, Ontario, Canada
d Department of Medical Imaging, London Health Sciences Center, London, Ontario, Canada
e Ontario Cancer Institute, University Health Network, Toronto, Ontario, Canada
f Department of Electrical and Computer Engineering, University of Victoria, Victoria, British Columbia, Canada
g Department of Surgical Oncology, University Health Network, Toronto, Ontario, Canada
h Department of Medical Biophysics, The University of Western Ontario, London, Ontario, Canada

Article info

Article history: Received 14 February 2014; Received in revised form 15 December 2014; Accepted 19 December 2014; Available online 8 January 2015

Keywords: Prostate MR image registration; Elastic model-based registration; State estimation; Finite element method; Focal ablation therapy

Abstract

Magnetic resonance imaging (MRI) is being used increasingly for image-guided targeted biopsy and focal therapy of prostate cancer. In this paper, a combined rigid and deformable registration technique is proposed to register pre-treatment diagnostic 3 T magnetic resonance (MR) images of the prostate, with the identified target tumor(s), to intra-treatment 1.5 T MR images. The pre-treatment T2-weighted MR images were acquired with patients in a supine position using an endorectal coil in a 3 T scanner, while the intra-treatment T2-weighted MR images were acquired in a 1.5 T scanner before insertion of the needle with patients in the semi-lithotomy position. Both the rigid and deformable registration algorithms employ an intensity-based distance metric defined based on the modality independent neighborhood descriptors (MIND) between images. The optimization routine for estimating the rigid transformation parameters is initialized using four pairs of manually selected approximate corresponding points on the boundaries of the prostate. In this paper, the problem of deformable image registration is approached from the perspective of state estimation for dynamical systems. The registration algorithm employs a rather generic dynamic linear elastic model of the tissue deformation discretized by the finite element method (FEM). We use the model in a classical state estimation framework to estimate the deformation of the prostate based on the distance metric between pre- and intra-treatment images. Our deformable registration results using 17 sets of prostate MR images showed that the proposed method yielded a target registration error (TRE) of 1.87 ± 0.94 mm, 2.03 ± 0.94 mm, and 1.70 ± 0.93 mm for the whole gland (WG), central gland (CG), and peripheral zone (PZ), respectively, using 76 manually-identified fiducial points. This was an improvement over the 2.67 ± 1.31 mm, 2.95 ± 1.43 mm, and 2.34 ± 1.11 mm, respectively for the WG, CG, and PZ after rigid registration alone. Dice similarity coefficients (DSC) in the WG, CG and PZ were 88.2 ± 5.3, 85.6 ± 7.6 and 68.7 ± 6.9 percent, respectively. Furthermore, the mean absolute distances (MAD) between surfaces was 1.26 ± 0.56 mm and 1.27 ± 0.55 mm in the WG and CG, after deformable registration. These results indicate that the proposed registration technique has sufficient accuracy for localizing prostate tumors in MRI-guided targeted biopsy or focal therapy of clinically localized prostate cancer.

© 2014 Elsevier B.V. All rights reserved.

* Corresponding author at: 100 Perth Drive, P.O. Box 5015, London, Ontario N6A 5K8, Canada. Tel.: +1 519 663 3833; fax: +1 519 663 3900.
http://dx.doi.org/10.1016/j.media.2014.12.007
1361-8415/© 2014 Elsevier B.V. All rights reserved.

1. Introduction

The most common interventions employed for treating clinically localized prostate cancer include radical prostatectomy, hormonal therapy, chemotherapy, external beam radiation therapy and interstitial radiation delivery (brachytherapy). In addition, focal therapies such as focal laser ablation therapy are currently


under investigation, and may see clinical acceptance in the future. Depending on the treatment strategy used, treatment-related side effects may result, including genitourinary, sexual and gastrointestinal dysfunction. Although definitive cancer control is the main goal in prostate cancer therapy, the physiological functions of surrounding critical structures and organs must be preserved to maintain a normal quality of life for the patient after treatment. Given the close proximity of these structures to the prostate, minimizing treatment-related side effects requires accurate delivery of the treatment.

Transrectal ultrasound (TRUS) and magnetic resonance imaging (MRI) systems are widely employed in image-guided prostate interventions. Due to its superior performance in differentiating soft tissue structures, MR has become the modality of choice for localizing prostate cancer (Kozlowski et al., 2010). High-field MRI systems with an endorectal coil can provide detailed images with a high signal-to-noise ratio (SNR). In a typical diagnostic imaging session, co-aligned multi-parametric (MP) scans such as T1 and T2-weighted imaging, diffusion weighted imaging (DWI), dynamic contrast-enhanced (DCE) imaging, MR spectroscopy (MRS) and diffusion tensor imaging (DTI) are used together to visualize substructures of the prostate, i.e., the CG and PZ, and their margins (Tan et al., 2012). These images also allow accurate detection and localization of tumors (Tan et al., 2012) for use in guiding biopsy needles (Hata et al., 2001) and focal ablation techniques (Lindner et al., 2009, 2010; Raz et al., 2010).

Focal therapy is used for the treatment of clinically localized prostate cancer. Using this treatment technique, a small region of the prostate containing cancerous tissue is destroyed using heat or freezing. In focal laser ablation therapy, an open-ended or translucent catheter is inserted into the prostate through the patient's perineum. Then, an optical fiber is inserted into the prostate through the catheter to ablate the tumor site by laser-induced heat diffusion (Lindner et al., 2010). MRI-guided focal ablation therapy of the prostate provides therapists with the ability to perform thermal monitoring and damage estimation during the ablation using MR thermometry (Raz et al., 2010). It also allows monitoring the growth of the ablated region intra-operatively and repeating the ablation if necessary.

A focal laser ablation procedure performed within the bore of an MR scanner is time-sensitive, requiring acquisition and processing of the images in a short time. These procedures are usually performed in 1.5 T clinical closed-bore MR scanners (Lindner et al., 2010; Raz et al., 2010; Oto et al., 2013). Thus, the intra-treatment images acquired by interventional MRI systems may have lower SNR, spatial resolution and contrast than those of diagnostic images, which are often acquired with 3 T scanners. In these cases, registration of high-quality pre-treatment images to intra-treatment images is necessary to incorporate information from the diagnostic scans into the procedure and update pre-treatment therapy plans. Accurate and precise registration of the diagnostic pre-treatment images with delineated tumor(s) to intra-treatment images can potentially improve the accuracy of focal ablation and reduce the duration of the intervention in MRI-guided focal therapy of prostate cancer. Prostate tissue may undergo nonrigid deformation between imaging sessions.
This deformation may be caused by the presence or absence of the endorectal coil, different inflation volumes of the endorectal coil balloon, varying bladder and rectal filling, and differences in patient position during imaging sessions. Therefore, deformable registration methods are required to align pre-treatment and intra-treatment images. A novel 3D-3D deformable registration method based on the concept of state estimation for dynamical systems is proposed in this paper and is evaluated on the registration of pre- and intra-treatment prostate MR images. The registration method employs a general linear elastic finite element method (FEM)-based deformation model. The registration is achieved through a Kalman-like

filtering process, which incorporates information from the deformation model and an observation error computed from an intensity-based distance metric. The sum of squared distances (SSD) between modality independent neighborhood descriptors (MINDs), computed at a set of control points within the prostate volume, is employed to compare the two images and compute an error vector, based on which the deformation states of the prostate are estimated.

Previously, Bharatha et al. (2001) proposed a deformable registration method using biomechanical finite element (FE) modeling to register pre-treatment 1.5 T MR images of the prostate to its intra-treatment 0.5 T images in MRI-guided brachytherapy. They constructed a 3D FE model of the prostate from segmented pre-treatment images and considered different material properties for the central gland and the peripheral zone, allowing an inhomogeneous linear elastic deformation model for the registration. The volume of the FE mesh was deformed based on the deformation extracted from matching a pre-treatment boundary surface to the corresponding boundaries of the segmented prostate in the intra-treatment images. Although this deformable registration method significantly improved the quality of matching compared to rigid registration, it required manual segmentation of the images, which prolongs the registration process considerably. A deformable image registration method based on FE modeling was also presented in (Alterovitz et al., 2006) to register two-dimensional (2D) prostate MR images in the presence and absence of the endorectal probe in radiation therapy. The method estimates unknown stiffness parameters of the model and applies forces using a nonlinear local optimization approach, given the segmented prostate in the probe-in and probe-out images. This method also required identification of the prostate boundaries and was sensitive to the image segmentation results. Brock et al. (2005) proposed an FE model-based multi-organ deformable image registration method, called MORFEUS, in which deformable alignment was achieved by explicitly defining the deformation of a subset of organs and assigning surface interfaces between organs. In this method, a node-to-surface algorithm was employed to determine the boundary conditions in the multi-organ FE analysis. This method was applied in (Hensel et al., 2007) for the registration of endorectal coil MR images of the prostate to CT images in radiotherapy planning. The accuracy of the registration method and its sensitivity to contour variation and model size were also evaluated in (Brock et al., 2008). The calculation of the boundary conditions using organ contours, the construction of the deformation model from the segmented images, and the correct assignment of material properties to all organs are limitations of that method. Furthermore, Oguro et al. (2009) carried out a nonrigid registration of pre-treatment 1.5 T MR images of the prostate to intra-treatment 0.5 T MR images in 16 cases. They employed the B-spline registration method proposed in (Rueckert et al., 1999) using mutual information (MI) as an intensity-based similarity metric and reported significant improvement in statistical metrics in comparison to using rigid registration alone.
A 3D elastic registration method was also presented in (Zhang et al., 2011), which minimized the strain energy function of the deformation model based on matching similar features (points, curves, or surfaces) in the pre- and intra-treatment prostate MR images. More recently, a statistical model-based technique was proposed in (Tahmasebi et al., 2012) to account for the deformation of the prostate in endorectal coil MR imaging. The model was generated based on mapping MR images of the prostate with and without an endorectal coil. Then, the model was employed for nonlinear registration between endorectal coil MRI and computed tomography (CT) images of the prostate. Both of the above-mentioned registration methods involved image segmentation and feature extraction,


which required tedious manual intervention and a long processing time.

The proposed registration method in this paper has a number of key differences from existing model-based registration methods. First, no specific geometrical information about the prostate and its substructures is used for constructing the deformation model. Instead, a cubic volume of tetrahedral finite elements is employed for the registration. In the state estimation framework, the model can be geometrically and physically inaccurate and still produce acceptable results. This is partly due to the fact that the estimation process takes into account modeling and observation uncertainties in the form of unknown process and measurement disturbances. Second, the proposed registration method requires minimal manual intervention from the user, and no pre-processing of the images (e.g., segmentation or feature extraction) is required. Third, the deformation model is generic and can be employed for the registration of different image sets (single and multiple modalities). Finally, instead of a static deformation model, a dynamic model is employed in the registration method. The dynamic model allows for the registration of dynamically deforming tissue by recursively correlating image information over time. In the case of static tissue deformation, as in the experiments of this paper, the registration algorithm simply converges to a steady-state solution. Although 3D-3D registration of prostate MR images is considered in this paper, the state estimation-based registration method provides a unified framework in which 3D-3D, 2D-2D and 3D-2D single- and multi-modality image registration problems can all be treated with small modifications to the registration algorithm. We previously applied earlier versions of this algorithm to the 3D-3D and 3D-2D registration of MR and ultrasound (US) images acquired from a breast phantom (Marami et al., 2011a,b, 2014c) and breast tissue (Marami et al., 2014b) with static and dynamic deformations. In this paper, the method is further developed and evaluated in the registration of pre-treatment 3 T T2-weighted MR images to intra-treatment 1.5 T MR images of the prostate. Preliminary results of this work were also reported in (Marami et al., 2014a).

The rest of the paper is organized as follows. The mathematical formulation of the proposed state estimation-based registration method is discussed in Section 2. In Section 3, the registration method is validated and registration results are given in terms of TRE, DSC, Recall/Precision and MAD between surfaces. A discussion of the proposed registration method and its computational aspects is given in Section 4. The paper is concluded in Section 5.

2. Methods and materials

2.1. Image registration as state estimation of dynamical systems

In this paper, the problem of image registration is approached from the perspective of state estimation for dynamical systems. In the proposed method, the tissue deformation is approximated using an elastic model. Given the deformation model, the deformation of the tissue is estimated based on a matching criterion between the pre-treatment and intra-treatment images. Biological tissues in general exhibit complex deformation behavior, which is best described by nonlinear models. However, instead of using such a model, a simple linear elastic deformation model discretized by FEM is used in this paper for several reasons. First, developing an accurate nonlinear model with heterogeneous materials is difficult, and possibly requires manual intervention by a radiologist and preprocessing of the image data to identify different tissues and their boundary conditions. Such an accurate model would be patient specific, and a large amount of computation would be involved in processing the images acquired from each patient. Furthermore, nonlinear state estimation methods are not as widely available as their linear counterparts. Therefore, a rather general linear elastic FE model is employed in the proposed registration method. The deformation model is developed using a cube of tetrahedral finite elements and given homogeneous elastic material properties. State estimation based on linear dynamic models takes into account modeling and imaging uncertainties in the form of unknown process and measurement disturbances. Hence, the deformation model can be geometrically or physically inaccurate and still produce acceptable results. The dynamics of tissue deformation can be modeled by the following linear state-space equations

$$x_k = A x_{k-1} + B \tilde{f}_{k-1} + w_{k-1} \tag{1a}$$

$$z_k = H x_k + v_k \tag{1b}$$

where $x_k$ is the vector of deformation states at time step k, and the input vector $\tilde{f}$ represents external forces (loads) applied to the deformable body. A is called the state transition matrix, which is applied to the previous states to compute the current deformation states, and B is the input matrix relating the input forces to the deformation states. Eq. (1a) establishes a relation between the states of the deformation at different sample times and the input forces. H is the output matrix, which establishes a relation between the system's measurement vector $z_k$ and the deformation states in Eq. (1b). In image registration, the measurement vector $z_k$ could be the pixel/voxel intensity values of the intra-treatment images, the displacement field applied to the image grid points (in comparison with the other image to be registered), or any other form of information/features derived from post-processing of the raw image data. Moreover, the vectors $w_k$ and $v_k$ are the process and measurement noise, and represent uncertainties/errors in the deformation dynamics and the imaging model, respectively. Eqs. (1a) and (1b) together provide a state-space model of the tissue deformation and its relation to the image data measurements. Existing state estimation techniques can be employed to obtain the unknown states of the tissue deformation based on the intra-treatment images and comparing them with the pre-treatment images. Fig. 1 shows the general flow of the iterative registration technique based on the concept of state estimation for dynamical systems. A combination of modeling and observational information from the system is the underlying principle of most state estimation methods.
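To make the notation of Eqs. (1a) and (1b) concrete, the short sketch below (Python/NumPy, not from the paper) simulates one step of the generic state-space model with small placeholder matrices; the matrices A, B, H, the dimensions and the noise levels are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 4 deformation states, 2 force inputs, 3 measurements (placeholders).
n_x, n_f, n_z = 4, 2, 3
A = 0.9 * np.eye(n_x)                # state transition matrix in Eq. (1a)
B = rng.normal(size=(n_x, n_f))      # input matrix
H = rng.normal(size=(n_z, n_x))      # output matrix in Eq. (1b)

x_prev = np.zeros(n_x)               # previous deformation states x_{k-1}
f_prev = rng.normal(size=n_f)        # (unknown) external forces f_{k-1}
w = 0.01 * rng.normal(size=n_x)      # process noise w_{k-1}
v = 0.01 * rng.normal(size=n_z)      # measurement noise v_k

x_k = A @ x_prev + B @ f_prev + w    # Eq. (1a): propagation of the deformation states
z_k = H @ x_k + v                    # Eq. (1b): measurement model
```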

Fig. 1. Block diagram of the iterative registration algorithm based on the concept of dynamic state estimation. $\hat{x}_k$ is the vector of estimated states of deformation, and $J_k$ is the distance metric.


The current states of the tissue deformation are estimated in two main steps. In the first step, an a priori estimate of the states, i.e., $\hat{x}_k^-$, is predicted using the deformation model, given the knowledge of the system prior to that time, i.e., $\hat{x}_{k-1}$. The predicted deformation of the FE mesh $u_k^-$ is computed using the predicted states of the tissue deformation. At each iteration, the reference image R, which is sub-sampled from the intra-treatment images, is compared to the deformed template image $T[u_k^-]$, which is interpolated from the pre-treatment images at the predicted image grid points. The displacement field of the image grid points is calculated based on the predicted nodal point displacements of the FE mesh. $T[u_k^-]$ is an image that is predicted based on the deformation dynamics of the tissue. The deformed template image $T[u_k^-]$ is then compared to the reference image R using an image matching criterion. In the second step, an observation prediction error is computed based on the image measurements. This error, which is also called the innovation residual (Welch and Bishop, 1995), is used to update the a priori estimates and compute an a posteriori estimate of the deformation states, i.e., $\hat{x}_k$.

2.2. Linear elastic FEM-based deformation model

The total potential energy of an elastic body is the sum of its strain-stress energy and the potential energy of the external loads (Zienkewickz and Taylor, 1987). At static equilibrium, the total potential energy of the deformable body is at a minimum. In other words, the internal strain-stress forces of the elastic body are in balance with the external applied forces at the steady-state equilibrium. Using the concept of finite element analysis, the steady-state shape of a linear elastic body under external forces can be obtained by solving a set of equilibrium equations as (Zienkewickz and Taylor, 1987; Bathe, 1996)

$$K u = f \tag{2}$$

where K is the global stiffness matrix associated with the 3D FE mesh, numerically integrated over the volume of the elastic body, u is the vector of all nodal point displacements, and f is the vector of forces concentrated at the nodal points of the FE mesh. Most anatomical structures in the human body are deformable and continuously undergo nonrigid motion due to interactions with the surrounding environment. Hence, dynamic models are valuable in medical image analysis applications. They have intuitively meaningful dynamical behavior and can be used for estimating the deformation of soft tissue over time (McInerney and Terzopoulos, 1996). In image registration applications, one might also be interested in the transient motion and deformation behavior of the object. To this end, the inertial body forces and the energy dissipation through velocity-dependent damping forces are added to the static equilibrium Eq. (2), resulting in the following set of second-order differential equations for the nodal point displacements of the FE mesh (Bathe, 1996; McInerney and Terzopoulos, 1996).

$$M \ddot{u} + C \dot{u} + K u = f \tag{3}$$

where M is the mass matrix of the elements concentrated at the nodes, and $C = \alpha M + \beta K$ is the Rayleigh damping matrix. $\alpha$ and $\beta$ are chosen for a critically damped response of Eq. (3), ensuring a fast response without transient oscillations. The differential equations in (3) can be solved over time using existing implicit or explicit numerical integration routines, some of which are explained in (Bathe, 1996). To reduce the computation, very fast modes of the dynamical system can be isolated and discarded without affecting the response at the particular time scale relevant to the application of interest. For this purpose, Bathe (1996) transforms the dynamics equations using a new variable $u \triangleq \phi x$, where the columns of $\phi$ are eigenvectors of $M^{-1}K$. With this change of variables, Eq. (3) can be written as

$$\tilde{M}\ddot{x} + \tilde{C}\dot{x} + \tilde{K}x = \tilde{f} \tag{4}$$

where $\tilde{M} = \phi^T M \phi$, $\tilde{C} = \phi^T C \phi$ and $\tilde{K} = \phi^T K \phi$ are diagonal matrices, and $\tilde{f} = \phi^T f$. The dynamic finite element equations are now decoupled in Eq. (4), and each equation describes a pair of vibrational modes of the deformable body. In the proposed registration method of this paper, the 3D FE mesh is not constrained, so it is able to move and deform at the same time. This allows for simultaneous rigid and deformable transformation of the body. As a result, the dynamics of the 3D body in Eq. (4) include six coupled equations accounting for rigid body motion, which determine the position and orientation of the object. The remaining modes are all decoupled and can be solved independently. Modes that are very fast compared to the time step used for registration and the rate of change of the external forces (in the case of a dynamic deformation of the tissue) can simply be eliminated from the system dynamics. The modal reduction basically projects the full model behavior onto a subspace of lower dimensionality, resulting in a significant reduction of the computations of the registration algorithm. Working with a reduced number of states with slower dynamics also improves the efficiency and robustness of the state estimation process. The state estimation-based registration algorithm is implemented in discrete time. Therefore, we transformed the continuous-time dynamics of the deformation in (4) into discrete time with a sampling time of $T_s$ using the central difference method (Bathe, 1996). The discrete-time dynamics of the deformation model can be represented as a set of linear stochastic difference equations similar to Eq. (1a), in which

$$A = \begin{bmatrix} 0 & I \\ -W_1^{-1} W_3 & -W_1^{-1} W_2 \end{bmatrix}, \qquad B = \begin{bmatrix} 0 \\ W_1^{-1} \end{bmatrix} \tag{5}$$

with

$$W_1 = \frac{\tilde{M}_m}{T_s^2} + \frac{\tilde{C}_m}{2 T_s}, \qquad W_2 = \tilde{K}_m - \frac{2\tilde{M}_m}{T_s^2}, \qquad W_3 = \frac{\tilde{M}_m}{T_s^2} - \frac{\tilde{C}_m}{2 T_s} \tag{6}$$

where $\tilde{M}_m$, $\tilde{C}_m$ and $\tilde{K}_m$ are the $m \times m$ matrices taken from the original $\tilde{M}$, $\tilde{C}$ and $\tilde{K}$ matrices corresponding to the m slowest modal pairs of the full deformation model.

2.3. Observation model

The observation model relates the estimated states of the tissue deformation to the actual measurement vector. It produces the predicted deformed pre-treatment images based on the predicted deformation state estimates. The predicted images are compared to the actual measurements (intra-treatment images in this case) to compute the observation prediction error. The predicted state estimates are then updated based on the observation prediction error. Comparing images based on their intensities or other processed features involves a nonlinear mapping from the deformation states to the observation vector. A nonlinear observation model requires the application of nonlinear estimation techniques, which are less robust than their linear counterparts and involve more computation. In this paper, the displacements computed at a set of control points, i.e., $x_c$, are considered as the measurement vector z. The control points are evenly distributed inside the FE mesh and their spacing is larger than the image voxels. The deformation states are mapped to the nodal point displacements of the FE mesh using the relation $u = \phi_m x$, in which $\phi_m$ is formed from the m columns of $\phi$ that correspond to the m slowest modal pairs of the full deformation model. Displacements at the control points can also be computed from the nodal


displacements of the FE mesh using the elements' linear interpolation (shape) functions (Zienkewickz and Taylor, 1987; Bathe, 1996), i.e.,

$$z = K u = K \phi_m x \tag{7}$$

where K here denotes the matrix assembled from the elements' shape functions for the individual control points (distinct from the global stiffness matrix in Eq. (2)). Therefore, the observation model can be represented by a linear expression similar to Eq. (1b), in which the output matrix is defined as

$$H = K \phi_m \tag{8}$$
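As an illustration of Eqs. (7) and (8), the sketch below assembles the shape-function interpolation matrix for control points lying inside linear tetrahedral elements (via barycentric weights) and forms the output matrix H. This is our own sketch, not the authors' implementation; the helper names are illustrative, and it assumes each control point has already been assigned to its containing element.

```python
import numpy as np

def barycentric_weights(p, tet_vertices):
    """Barycentric coordinates of point p inside a tetrahedron given as a 4x3 vertex array."""
    v0, v1, v2, v3 = tet_vertices
    T = np.column_stack((v1 - v0, v2 - v0, v3 - v0))   # 3x3 edge matrix
    lam = np.linalg.solve(T, p - v0)                   # weights of v1, v2, v3
    return np.concatenate(([1.0 - lam.sum()], lam))    # weights of v0..v3

def assemble_shape_matrix(control_pts, elements, nodes, containing_elem):
    """Interpolation matrix mapping nodal displacements (3*N_nodes) to
    displacements at the control points (3*n_c), cf. Eq. (7)."""
    n_c, n_nodes = len(control_pts), len(nodes)
    Lam = np.zeros((3 * n_c, 3 * n_nodes))
    for c, p in enumerate(control_pts):
        elem = elements[containing_elem[c]]            # 4 node indices of the tetrahedron
        w = barycentric_weights(p, nodes[elem])
        for local, node in enumerate(elem):
            for d in range(3):                         # same weight for the x, y, z DOFs
                Lam[3 * c + d, 3 * node + d] = w[local]
    return Lam

# Output matrix of Eq. (8), with phi_m the m slowest modal columns:
# Lam = assemble_shape_matrix(control_pts, elements, nodes, containing_elem)
# H = Lam @ phi_m
```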

2.4. State estimation

Eqs. (1a) and (1b) are in the standard state-space form for the application of a discrete-time linear state estimator. The Kalman filter (Welch and Bishop, 1995) is a well-known estimator that operates recursively on a stream of noisy measurements over time to produce statistically optimal estimates of the unknown states of the system. An extended Kalman filtering algorithm was employed in (Pennec and Thirion, 1997) to estimate a rigid transformation from a set of 3D matched points or matched frames. Moghari and Abolmaesumi (2007) presented a registration algorithm to map two point data sets in the presence of isotropic/anisotropic Gaussian noise by estimating the rigid transformation parameters using the unscented Kalman filter. A diffeomorphic registration using particle filtering was also presented in (Gao et al., 2012) to estimate the nonlinear deformation between two point sets. The iterative Kalman filtering-based estimation process using a linear elastic deformation model involves two different updates. First, the time update provides an a priori estimate of the states, i.e., $\hat{x}_k^-$, given knowledge of the process prior to that time. The external forces applied to the deformable tissue in Eq. (1a) are usually unknown. Here, we consider them as part of the process noise/disturbance w and model them as white Gaussian noise with a normal probability distribution $p(\tilde{f}) = N(0, Q)$. The power of the process noise reflects our confidence in the accuracy of the model. Therefore, uncertainties in the model and input forces can all be accounted for by a suitably strong power of the process noise/disturbance. The time update (prediction) equation can then be expressed as (Welch and Bishop, 1995)

$$\hat{x}_k^- = A \hat{x}_{k-1} \tag{9}$$

where A is the state transition matrix defined in Eq. (5).
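The following sketch (our own Python/NumPy rendering, not the authors' code) builds the discrete-time matrices of Eqs. (5) and (6) from the reduced modal matrices and applies the time update of Eq. (9); the variable names (Mm, Cm, Km, Ts) are illustrative.

```python
import numpy as np

def discrete_deformation_model(Mm, Cm, Km, Ts):
    """Discrete-time A and B of Eq. (5) via the central difference method,
    with W1, W2, W3 as in Eq. (6). Mm, Cm, Km are the m x m reduced modal matrices."""
    m = Mm.shape[0]
    W1 = Mm / Ts**2 + Cm / (2.0 * Ts)
    W2 = Km - 2.0 * Mm / Ts**2
    W3 = Mm / Ts**2 - Cm / (2.0 * Ts)
    W1_inv = np.linalg.inv(W1)
    A = np.block([[np.zeros((m, m)), np.eye(m)],
                  [-W1_inv @ W3,     -W1_inv @ W2]])
    B = np.vstack([np.zeros((m, m)), W1_inv])
    return A, B

def time_update(A, x_prev):
    """Eq. (9): a priori state estimate (the unknown force term B*f is omitted)."""
    return A @ x_prev
```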


In the second step, the measurement update computes an a posteriori estimate of the states, i.e., $\hat{x}_k$, as a linear combination of the a priori state estimates $\hat{x}_k^-$ and a weighted difference between the actual measurement vector $z_k$ and the vector of predicted measurements (Welch and Bishop, 1995), i.e.,

$$\hat{x}_k = \hat{x}_k^- + C\,(z_k - H\hat{x}_k^-) \tag{10}$$

where C is the Kalman filter gain and $z_k - H\hat{x}_k^-$ is the observation prediction error, or the innovation residual. In the proposed method of this paper, the Kalman gain C is computed once and the same gain is used for the registration of different image sets. To this end, the control points at which the observation prediction errors are computed are evenly distributed inside the finite element mesh, and their positions are fixed in the registration of different image data sets. Fig. 2a shows an axial view of the intra-treatment (1.5 T) prostate MR images. The intra-treatment image volume is scaled and transformed so that it is placed inside the FE mesh. As shown in Fig. 2b, this is carried out in such a way that the center of the image volume is placed at the center of the FE mesh volume and the domain of the image fits the mesh volume along its largest axis, e.g., the horizontal axis in this case. With this configuration, the output matrix H in the observation model is independent of the image domain in the registration of different image sets. Therefore, the deformation and observation models in Eqs. (1a) and (1b) provide a linear time-invariant observer for the prostate deformation. A better alternative to the regular grid of control points would be important points/features inside the prostate gland that can be extracted from the intra-treatment images. However, this requires pre-processing of the images to find those points. Also, considering a regular grid inside the FE mesh as control points makes it possible to compute the Kalman gain once and employ the same gain for the registration of all image pairs. If the control points were computed based on the images, the Kalman gain would have to be computed for each image pair separately. In such a case, the steady-state Kalman gain C can be computed given the process Q and measurement noise S covariance matrices. For details on Kalman filtering and steady-state gain computation see (Welch and Bishop, 1995; Franklin et al., 1997). Q and S are given as

$$Q = q\, I_{2m \times 2m}, \qquad S = s\, I_{3n \times 3n} \tag{11}$$

where I is an identity matrix, m is the number of vibrational modal pairs included in the deformation model, and n is the total number of control points inside the FE mesh.
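One possible way to obtain the steady-state gain of Eq. (10) with the isotropic covariances of Eq. (11) is to solve the discrete algebraic Riccati equation, e.g., with SciPy as sketched below. This is our own sketch, not the authors' code, and it assumes the pair (A, H) is detectable so that the Riccati solver converges.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

def steady_state_gain(A, H, q, s):
    """Steady-state Kalman gain for x_k = A x_{k-1} + w, z_k = H x_k + v,
    with Q = q*I (process) and S = s*I (measurement), cf. Eqs. (10) and (11)."""
    Q = q * np.eye(A.shape[0])
    S = s * np.eye(H.shape[0])
    # Steady-state a priori error covariance (estimation form of the DARE, via duality)
    P = solve_discrete_are(A.T, H.T, Q, S)
    return P @ H.T @ np.linalg.inv(H @ P @ H.T + S)

# Measurement update of Eq. (10):  x_hat = x_pred + Gain @ (z - H @ x_pred)
```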


Fig. 2. A 2D view of control points spacing inside the FE mesh. (a) An axial view of intra-treatment prostate MR image, (b) scaled prostate inside the FE mesh with control points.


Here, q and s determine the power of the process and measurement noise, respectively. It is the relative power of the process noise (uncertainties in the modeling and unknown external forces) and the measurement noise, i.e., q/s, that determines the Kalman gain. The choice of q/s depends on our relative confidence in the deformation model versus the observations obtained from comparing the pre- and intra-treatment images. Since the linear elastic FE model is developed for small deformations, it is not accurate for large deformations. In that case a larger q/s should be used to account for the modeling uncertainty and inaccuracy. Noisy images with artifacts, especially in multi-modality registration, increase the uncertainty in the computation of the observation prediction error. In that case, q/s needs to be decreased to account for the effect of the uncertainties in the image measurements. For the experiments of this paper, the value of q/s was chosen based on a leave-one-out approach or an example case. Although the certainty of the measurements is higher in the mid-gland as well as at the boundaries of the prostate, determination of the prostate sub-regions and boundaries requires manual segmentation of the images. In this paper we tried to minimize tedious manual intervention by the user in the registration process and considered the measurement S and process Q noise covariances as diagonal matrices, which give equal weights to all measurements within the volume to be registered and to all deformation states. It should also be noted that although the external applied forces $\tilde{f}_{k-1}$ are unknown, B is employed in the computation of the Kalman gain C as the input matrix for the applied noise/disturbances on the system. However, since $\tilde{f}_{k-1}$ is not deterministic, the term $B\tilde{f}_{k-1}$ is not included in the time update equation of Eq. (9).

2.5. Observation prediction error

The measurement vector in Eq. (1b) is the vector of displacements of the control points from their initial positions. For the measurement update in Eq. (10), instead of computing the measurement vector $z_k$ and its prediction $H\hat{x}_k^-$ separately and subtracting them, a method is employed in this paper to directly approximate the observation prediction error $z_k - H\hat{x}_k^-$ using an intensity-based distance metric. Assuming that the distance metric is a convex function of the deformation field, a correction along the inverse gradient of the distance metric represents a step of a gradient-descent search towards the minimum of the distance metric. This search direction is used here to approximate the observation prediction error in the proposed state estimation-based registration method. If we define $J(R, T[u]; x_c)$ as an intensity-based distance metric between the reference R and the deformed template image T[u], defined locally at $x_c$, the correction displacement can be computed based on the gradient of the distance metric at the control points as

$$dx_c = z_k - H\hat{x}_k^- = -c\, \nabla J(R, T[u_k^-]; x_c) \tag{12}$$

where $u_k^- = \phi_m \hat{x}_k^-$ is the vector of predicted nodal point displacements of the FE mesh and $T[u_k^-]$ is the predicted deformed template image. c is the only parameter to be tuned for the registration of each pair of images. This tuning parameter determines the size of the observation prediction error in Eq. (12), which is computed in the direction of the gradient of the distance metric between the reference R and the predicted deformed template image $T[u_k^-]$. With this definition, uncertainties in the computation of the observation prediction error are treated as measurement noise in the filtering process. It should be noted that $dx_c$ is computed at the control points inside the image domain (i.e., the dashed square in Fig. 2b) in the registration of different image sets. Therefore, in the measurement update of the

states, only the columns of C corresponding to the control points inside the image domain are considered.

2.6. Distance metric

In this paper, for the registration of pre-treatment 3 T T2-weighted MR images of the prostate to its 1.5 T intra-treatment images, a distance metric was employed that locally compares modality independent neighborhood descriptors (MIND) in the images (Heinrich et al., 2012). MIND extracts distinctive structures in a local neighborhood based on the similarity of small image patches within the image. The extraction of structures using this method is independent of the image modality, contrast, noise and intensity levels (Heinrich et al., 2012). However, it is sensitive to different types of image features such as corner points, edges and textures. MIND is basically a multi-dimensional image descriptor, which represents the distinctive image structures in a local neighborhood (Heinrich et al., 2012). It is extracted based on patch distances in each image separately (different modalities). The extracted descriptors are then compared using simple single-modality similarity/distance metrics such as SSD or the normalized correlation coefficient (NCC). More details on MIND and its mathematical formulation can be found in (Heinrich et al., 2012). We also examined other similarity metrics such as NCC, the correlation ratio (CR) and mutual information (MI) for the registration of prostate MR images. In the rigid registration, MIND-based registration provided results that were more robust and less variable with respect to the selection of the initial approximate corresponding points in comparison to the other similarity metrics. The proposed deformable registration algorithm employs a weighted gradient of a similarity/distance metric to compute the observation prediction error. To this end, as given in Eq. (12), a weighted gradient of the SSD between MINDs in the two images was numerically computed in the experiments of this paper. The performance of the other similarity metrics degraded in the presence of image intensity inhomogeneities around the endorectal coil. MIND, however, could automatically deal with intensity inhomogeneities and noise in the images, and no pre-processing of the images was required.

2.7. Description of the registration algorithm

In the first step, each set of pre- and intra-treatment images is rigidly co-registered. For this registration, the SSD of the MINDs between the two images is minimized. In the optimization, the initial values of the rigid transformation parameters are computed by aligning four pairs of manually identified approximate corresponding points in the images. As shown in Fig. 3, these are approximate corresponding points on the boundaries of the prostate gland, which are employed only for the initialization of the rigid registration and are not used to evaluate the registration results. For the experiments of this paper, all four points, i.e., points 1 to 4, were chosen sequentially on the same image plane. First, two similar axial planes in the pre- and intra-treatment images were chosen in the mid-gland region. Points 1 and 3 were selected on the most right and most left sides of the prostate image plane, respectively, on the boundary of the peripheral zone. Point 2 was chosen on the posterior boundary of the prostate close to the center line. Finally, point 4 was chosen on the anterior boundary of the prostate such that the line connecting points 2 and 4 divides the prostate into two semi-symmetric regions.
The different inflation volumes of the endorectal coil balloon in the pre- and intra-treatment images are also shown in Fig. 3; the balloon puts different pressure on the prostate, causing deformation in the PZ. For each set of images, we performed the intensity-based rigid and deformable registration in a small 3D volume that contains the prostate gland and discarded image information outside of this volume.


Fig. 3. An axial view of the (a) pre-treatment and (b) intra-treatment MR images of the prostate with approximate corresponding points.

In order to obtain the margins of the 3D volume containing the prostate gland, we used the four identified points in the intra-treatment image (Fig. 3b) to find the right (point 1 − 2 mm), left (point 3 + 2 mm), posterior (point 2 − 5 mm) and anterior (point 4 + 5 mm) margins. One point at the level of the apex and another point at the level of the base were chosen in the intra-treatment image to roughly determine the margins of the prostate volume in the axial direction. In the second step, the rigid transformation is integrated into the deformable registration algorithm. The proposed state estimation-based elastic registration algorithm involves the following steps at each iteration, according to the algorithm flowchart given in Fig. 1.

1. Time update/prediction: The vector of state estimates from the previous step $\hat{x}_{k-1}$ is updated to $\hat{x}_k^-$ according to the time update equation, i.e., Eq. (9). This is the vector of predicted deformation states of the tissue, from which the predicted displacements of the nodal points of the FE mesh can be computed using $u_k^- = \phi_m \hat{x}_k^-$.
2. Find the deformed grid of control points: The deformation of the grid of control points is computed from the deformation of the FE mesh using the elements' shape functions. This provides the predicted positions of the control points in the coordinates of the template image T.
3. Compute the observation prediction error: The observation prediction error $dx_c$ is computed at the control points using Eq. (12). To this end, the gradient of the distance metric is numerically computed at the control points.
4. Measurement update: The state estimates are updated based on the observation prediction error using $\hat{x}_k = \hat{x}_k^- + C\, dx_c$, where C is the steady-state Kalman gain.

The algorithm terminates when the relative change in the distance metric between the reference R and the deformed template image T[u] (i.e., J(R, T[u])) is less than a given small number $\epsilon = 10^{-4}$, i.e., $|J_k - J_{k-1}|/J_k < \epsilon$, or when the total number of iterations exceeds a maximum (300 for the experiments of this paper).
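The iterative loop of steps 1-4 can be summarized by the following sketch. It is a schematic rendering of the algorithm, not the authors' implementation; deform_control_points and mind_ssd_gradient are hypothetical helpers standing in for the shape-function interpolation of step 2 and the numerically computed gradient of the MIND-based SSD of step 3.

```python
import numpy as np

def register(x0, A, phi_m, Gain, c, deform_control_points, mind_ssd_gradient,
             max_iter=300, eps=1e-4):
    """State estimation-based deformable registration loop (steps 1-4 above)."""
    x_hat, J_prev = x0, None
    m = phi_m.shape[1]                              # number of retained modal pairs
    for k in range(max_iter):
        x_pred = A @ x_hat                          # 1. time update, Eq. (9)
        u_pred = phi_m @ x_pred[-m:]                #    current modal coordinates -> nodal displacements
        xc_def = deform_control_points(u_pred)      # 2. deformed grid of control points
        J, grad = mind_ssd_gradient(xc_def)         # 3. distance metric and its gradient
        dxc = -c * grad                             #    observation prediction error, Eq. (12)
        # In the paper, only the gain columns for control points inside the image domain are used.
        x_hat = x_pred + Gain @ dxc                 # 4. measurement update, Eq. (10)
        if J_prev is not None and abs(J - J_prev) / J < eps:
            break                                   # relative change below epsilon
        J_prev = J
    return x_hat, phi_m @ x_hat[-m:]                # final states and nodal displacements
```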


At the end of the iterations, a deformed template image T[u] with the voxel size of the actual pre-treatment image (or of the intra-treatment image, for comparison) is computed. The desired image grid is deformed based on the final estimated displacements of the nodal points of the FE mesh. The deformed template image is then interpolated from the pre-treatment image volume. A hierarchical multi-level approach (three levels) was employed in the estimation of the non-rigid deformation of the tissue. The iterative registration algorithm starts with low-resolution reference and template images to approximately estimate large deformations of the tissue. The image resolution increases in a coarse-to-fine fashion as the registration algorithm progresses, to utilize more detailed local image information in the deformation estimation. The slice thickness is identical at all levels. The pixel size in the first and second registration levels is three times and twice that of the intra-treatment images, respectively. In order to compute the deformed image based on the deformation of the FE mesh using the finite element shape functions, it is necessary to run a search algorithm to find the elements in which the image grid points are located. This process, using available search methods, requires a long processing time, especially for high-resolution images. Hence, the thin-plate spline (TPS) interpolation method (Modersitzki, 2009) was used in the proposed registration method to compute the images based on the FE mesh deformation, which requires no search algorithm. Using TPS, the displacements at the nodal points of the FE mesh are smoothly distributed to the image grid points. At the end of the iterative process, the deformation model reaches an equilibrium with a nodal point displacement vector u. It can be shown that the steady-state equilibrium equation for the deformation model becomes

$$K u = -c\, M\, C_2\, \nabla J(R, T[u_k]; x_c) \tag{13}$$

where $C = [C_1; C_2]$. Comparing this equation with Eq. (2), it can be concluded that at the steady-state equilibrium the internal strain-stress forces of the model are in balance with the external applied forces, which are derived from the distance metric between the images. The tuning parameter c in Eq. (13) is essentially equivalent to the weighting factor of the image distance metric against the regularization function in most optimization-based registration methods. For the experiments of this paper, the tuned values of c were between 1.5 and 6. Given the maximum number of iterations, we tuned this parameter intuitively by observing how the distance metric, the deformed template image and the FE mesh changed throughout the iterations. As in other registration methods, automatic tuning of c is very important and requires further investigation.

2.8. Materials

The proposed registration algorithm was evaluated in registering pre-treatment 3 T T2-weighted MR images of the prostate with identified tumors to its intra-treatment 1.5 T images. Seventeen patients with low-risk prostate cancer underwent pre-treatment 3 T MR imaging using an endorectal coil. They were enrolled in an ongoing University Health Network Research Ethics Board approved Phase I clinical trial of MRI-guided focal laser thermal ablation therapy (ClinicalTrials.gov ID: NCT01094665) (Cepek et al., 2013a). The patients gave informed consent for MRI-guided focal laser ablation therapy. Pre-treatment images were acquired using a GE Discovery MR750 scanner with patients placed in the supine position inside the closed-bore magnet. An endorectal coil was employed to acquire images in both the pre- and intra-treatment imaging sessions. The 3 T T2-weighted images were turbo spin echo (TSE) images with the following parameters: field-of-view: 89 × 180 mm, matrix: 222 × 448, pixel size: 0.4018 × 0.4018 mm², slice thickness: 3 mm, gap between slices: 0, number of signal averages: 2, repetition time: 3500 ms, echo time: 116 ms, flip angle: 120°. Each set of images was acquired with a bandwidth of 51.5 kHz (115 Hz/pixel).


Intra-treatment imaging was performed using a GE Signa HDxt 1.5 T scanner just before insertion of the needle guide for laser ablation, with patients in the semi-lithotomy position in the closed-bore magnet. The 1.5 T T2-weighted images were fast spin echo (FSE) images with the following parameters: field-of-view: 150 × 150 mm, matrix: 256 × 256, pixel size: 0.5859 × 0.5859 mm², slice thickness: 3 mm, gap between slices: 0, number of signal averages: 1, repetition time: 4800 ms, echo time: 82.32 ms, flip angle: 90°, receiver bandwidth: 62.5 kHz (244.1 Hz/pixel).

2.9. Deformation model and registration parameters

A cubic mesh with 10,476 tetrahedral elements and 2093 nodal points was employed to construct the deformation model. Using this mesh, an isotropic dynamic linear elastic deformation model with Young's elasticity modulus E = 10 kPa, Poisson's ratio ν = 0.49 (Salcudean et al., 2012) and mass density ρ = 1 g/cm³ (Torlakovic et al., 2005) was created. Different elasticity properties are reported for normal and cancerous tissue in the literature (Zhang et al., 2008). The elasticity of the prostate tissue is also compared to that of the surrounding tissue in (Nir et al., 2013). In our work, since a homogeneous elastic model was employed for the whole prostate volume and a small region outside it, an average value of E = 10 kPa was used for the Young's elasticity modulus. It should be noted that, in the proposed registration method, the uncertainty in the deformation modeling is treated as the process disturbance (noise) in the state estimation framework. The size of the reduced dynamic model is m = 1000, so only 15.9% of the vibrational modes of the full dynamic FE model were included in the deformation model. A regular grid of 30 × 30 × 15 control points inside the cube of finite elements was used to compute the observation prediction error given in Eq. (12). The undeformed 3D FE mesh and the deformed mesh after registration are shown in Fig. 4 for an example case. For the experiments in this paper, q/s = 10, and c was manually tuned for the different image data sets to achieve a desirable registration outcome. In the computation of the MIND-based distance metric, p = 1, which gives a patch size of 3 × 3 × 3 = 27 voxels at every point.

2.10. Evaluation methods

Registration results in this paper were evaluated and analyzed using different metrics. To compute the target registration error (TRE) (Fitzpatrick and West, 2001; Hajnal and Hill, 2010), matching fiducial points were manually identified in all image pairs.

The radiologist (S. Ghoul) selected fiducial points for each pair of images in both the CG and PZ using anatomical landmarks of the prostate (e.g., the verumontanum or the tip of a protrusion in the pseudocapsule) and histopathological points (e.g., the mid anterior wall of a cyst or mass, or the most medial end of a hypointense band in the PZ). In total, 83 intrinsic fiducial pairs were identified, of which 40 were within the PZ, where about 70% of prostate cancers are found (McNeal et al., 1988). In order to evaluate the directional anisotropy of the TRE, we performed a 3D principal component analysis (PCA) of the TRE set defined as $D = [f^x_{T[u]} - f^x_R,\; f^y_{T[u]} - f^y_R,\; f^z_{T[u]} - f^z_R]$, where $f^x_{T[u]}$ is the x component of the identified fiducial point in the template image (T), which is transformed to the coordinates of the reference image (R) using the transformation/deformation field (u), and $f^x_R$ is the x component of the corresponding fiducial point in the reference image (R) coordinates. $f^y_{T[u]}$, $f^y_R$, $f^z_{T[u]}$ and $f^z_R$ are defined similarly in the y- and z-directions. The eigenvalues ($\lambda_i$) and eigenvectors ($e_i$) of the covariance matrix of D provide the principal components of the directional error set. The 95% standard error ellipsoid volumes (Irwin et al., 2008; Karnik et al., 2010) for the 83 identified fiducial points in both the rigid and deformable registrations were also computed. These volumes were defined with their semi-principal axes parallel to the eigenvectors $e_i$. The lengths of the semi-principal axes of the ellipsoid volumes $a_i$ are proportional to the square root of the eigenvalues. For 95% confidence, $a_i = \chi_{\alpha,\nu}\sqrt{\lambda_i}$, where $\chi_{\alpha,\nu} \approx 2.80$ is the probability distribution value evaluated for $\alpha = 0.05$ and three degrees of freedom ($\nu = 3$) (Karnik et al., 2010). Image registration is an ill-posed problem (Modersitzki, 2009), and several (possibly infinitely many) solutions may exist for a given registration problem. The cost function to be optimized in the intensity-based registration method, which is computed from the template and reference images, is highly nonlinear. In the rigid registration, depending on the given initial values of the translational and rotational parameters, different solutions may be obtained. Therefore, the selection of the approximate corresponding points in our registration technique is noticeably important. To evaluate the variability of the registration results with respect to the initial approximate corresponding points, these four points were chosen five times in ten data sets by the user (the person in charge of the registration), with at least two days between the point selections for each data set to avoid memory bias. The TREs for each selection of approximate corresponding points were computed and statistically compared to the TREs resulting from selecting these points at the other times.
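The 95% error ellipsoid described above can be computed from the fiducial error vectors as sketched below (our own illustration; the χ value of 2.80 follows the text).

```python
import numpy as np

def error_ellipsoid(D, chi=2.80):
    """D is an N x 3 array of fiducial error vectors (template minus reference).
    Returns the semi-principal axis lengths a_i = chi*sqrt(lambda_i), the
    eigenvectors e_i, and the 95% ellipsoid volume."""
    cov = np.cov(D, rowvar=False)               # 3x3 covariance of the error set
    lam, e = np.linalg.eigh(cov)                # eigenvalues and eigenvectors
    a = chi * np.sqrt(lam)                      # semi-principal axes (95% confidence)
    volume = 4.0 / 3.0 * np.pi * np.prod(a)     # ellipsoid volume
    return a, e, volume
```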


Fig. 4. (a) Undeformed and (b) deformed FE meshes employed in the registration. Scales are all in mm.



Fig. 5. Average and std of the TREs in the rigid and deformable registration of pre- and intra-treatment prostate MR images for 17 patients.

We also evaluated registration accuracy based on the Dice similarity coefficient (DSC) (Dice, 1945) for both rigid and deformable registrations. The DSC measure is defined as:

$$\mathrm{DSC}(R, T[u]) = \frac{2\,(R \cap T[u])}{R + T[u]} \tag{14}$$

where R and T[u] are the prostate volumes in the reference and registered template images, respectively, and $R \cap T[u]$ is the common volume between the reference R and the registered template T[u] prostate images. Both the pre- and intra-treatment images were manually segmented by a radiologist to identify the prostate volumes and region boundaries. The volumes of the CG ($V_{CG}$) and WG ($V_{WG}$) of the prostate were determined based on the segmentation, and the PZ volume was computed as the volume difference between the WG and CG. The DSC was computed for different sub-regions of the prostate volume to evaluate the registration in the CG, PZ and WG, as well as in the base, mid-gland and apex (30%, 40% and 30% of the base-apex axis length, respectively) of the prostate. In addition, we calculated the Precision (specificity) and Recall (sensitivity) (Martin et al., 2004) for both the rigid and deformable registrations. Considered together, they provide metrics of segmentation quality, as they are sensitive to over- and under-segmentation. Over-segmentation results in low Precision scores, while under-segmentation results in low Recall scores (Estrada, 2005). Comparing two registered volumes, both Precision and Recall can have high values if the boundaries in both volumes are in the same spatial locations and their details match. For the reference prostate volume R and the registered template prostate volume T[u], Precision and Recall are defined as

$$\mathrm{Precision}(R, T[u]) = \frac{R \cap T[u]}{T[u]} \tag{15}$$

$$\mathrm{Recall}(R, T[u]) = \frac{R \cap T[u]}{R} \tag{16}$$

It should be noted that Precision and Recall are related to DSC as

$$\mathrm{DSC}(R, T[u]) = \frac{2\,\mathrm{Precision}(R, T[u])\,\mathrm{Recall}(R, T[u])}{\mathrm{Precision}(R, T[u]) + \mathrm{Recall}(R, T[u])} \tag{17}$$

Hence, in this paper we evaluated the quality of the registration based on these two coupled measures.
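For binary segmentation masks, Eqs. (14)-(17) reduce to simple voxel counts, as in the sketch below (our own illustration, not the authors' evaluation code).

```python
import numpy as np

def overlap_metrics(R, T_u):
    """R, T_u: boolean volumes of the prostate (or a sub-region) in the reference
    and registered template images. Returns DSC, Precision and Recall (Eqs. 14-16)."""
    inter = np.logical_and(R, T_u).sum()
    dsc = 2.0 * inter / (R.sum() + T_u.sum())       # Eq. (14)
    precision = inter / T_u.sum()                   # Eq. (15)
    recall = inter / R.sum()                        # Eq. (16)
    # Consistency check with Eq. (17): DSC = 2PR / (P + R)
    assert np.isclose(dsc, 2 * precision * recall / (precision + recall))
    return dsc, precision, recall
```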

To further analyze the registration results, the mean absolute distances (MAD) (Qiu et al., 2012) between surfaces in the CG and WG were computed for each image pair. The function defining the MAD between two surfaces is not symmetric. Hence, an average value, i.e., $(f(R, T[u]) + f(T[u], R))/2$, was computed and reported in this paper.
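A symmetrized MAD between two surfaces, each represented by a point cloud sampled on the surface, can be computed with nearest-neighbor distances as sketched below (our own illustration; the paper does not specify the surface representation used).

```python
import numpy as np
from scipy.spatial import cKDTree

def symmetric_mad(surf_a, surf_b):
    """surf_a, surf_b: N x 3 arrays of points on the two surfaces.
    Returns (f(A,B) + f(B,A)) / 2, where f is the mean nearest-surface distance."""
    d_ab, _ = cKDTree(surf_b).query(surf_a)   # distances from A's points to B
    d_ba, _ = cKDTree(surf_a).query(surf_b)   # distances from B's points to A
    return 0.5 * (d_ab.mean() + d_ba.mean())
```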

3. Results

3.1. Target registration error

A summary of the target registration errors (TREs) in the registration of the 17 prostate image pairs is given in Fig. 5. In each bar, the square (rigid) or circle (deformable) marker in the middle shows the average TRE, and the distance above or below the marker indicates the standard deviation (std) of the TREs for each case. The median of the TREs is shown with a short dash on the bar. The number of identified fiducial pairs for each case is also given in this figure. The average TRE was reduced in 16 cases after deformable registration. We also computed the TRE in the CG, PZ and WG of the prostate in all 17 cases, and the overall results are summarized in Table 1. Because of the variable inflation volume of the endorectal coil balloon between the pre- and intra-treatment images, most of the deformation of the prostate occurs in the PZ. Furthermore, the registration errors (distances) of the four approximate point pairs after rigid and deformable registration were computed and are given in Table 1. The approximate points registration error (APRE) was 3.71 ± 2.70 mm and 3.43 ± 2.45 mm after rigid and deformable registration, respectively. The average and std of the APRE in most cases were larger than the TRE, showing that these points were not chosen accurately. However, the APREs were small enough to lead to acceptable registration results. The TRE for the 38 pairs of fiducial points in the CG was 2.95 ± 1.43 mm after rigid registration. The TRE was 2.34 ± 1.11 mm for the other 38 pairs of fiducial points in the PZ after rigid registration. The better alignment in the PZ is partly due to the fact that three of the four approximate matching points were selected on the boundary of the PZ, e.g., points 1, 2 and 3 in Fig. 3, the region where most prostate tumors are found. After applying the iterative elastic registration, the TRE improved to 2.03 ± 0.94 mm and 1.70 ± 0.93 mm in the CG and PZ, respectively.

Table 1
Number of identified fiducial points and TRE in different sub-sections of the prostate images, as well as the approximate points registration error (APRE). The number of identified fiducial pairs was 38 in the CG, 38 in the PZ and 76 in the WG.

                 TRE, average ± std (mm)                   TRE, median (mm)   APRE (mm)
Registration     CG            PZ            WG                               Average ± std   Median
Rigid            2.95 ± 1.43   2.34 ± 1.11   2.67 ± 1.31   2.31               3.71 ± 2.70     2.95
Deformable       2.03 ± 0.94   1.70 ± 0.93   1.87 ± 0.94   1.69               3.43 ± 2.45     2.64



Fig. 6. Frequency distribution of the TREs in the PZ and CG after rigid and deformable registration. The number of fiducial pairs in both the PZ and CG is 38.

Fig. 8. 2D projections of the error ellipsoid for the rigid (top row) and deformable (bottom row) registrations. Scales are all in mm.

After applying the iterative elastic registration, the TRE improved to 2.03 ± 0.94 mm and 1.70 ± 0.93 mm in the CG and PZ, respectively. For all 76 fiducial pairs in the WG, the TRE was 2.67 ± 1.31 mm and 1.87 ± 0.94 mm after rigid and deformable registration, respectively. Therefore, our proposed deformable registration approach resulted in an improvement of about 0.8 mm in the average TRE over the rigid registration. The frequency distributions of the TREs in the CG and PZ resulting from the rigid (top) and deformable (bottom) registrations are given in Fig. 6.

The average and std of the TREs in the x-, y- and z-directions were 0.97 ± 0.70 mm, 1.32 ± 0.95 mm and 1.72 ± 1.38 mm, respectively, after rigid registration. These values improved to 0.75 ± 0.64 mm, 0.85 ± 0.66 mm and 1.15 ± 0.95 mm after deformable registration. Therefore, the average TRE in the axial image plane was less than twice the pixel spacing (pixel size of 0.5859 × 0.5859 mm²). Furthermore, the average TRE in the z-direction (superior-inferior) was less than half the slice thickness (i.e., 3 mm).

It is evident from Fig. 6 that the distribution of TREs after deformable registration has a higher peak at small values (0.5-3 mm), whereas the TREs from the rigid registration are distributed over a wider range (1-5.5 mm). We used the Shapiro-Wilk normality test in GraphPad Prism (http://graphpad.com) to test our data. The test results showed that the TREs after both rigid and deformable registration were not sampled from a Gaussian population (P < 0.0001). Therefore, we used the Mann-Whitney (MW) non-parametric test to compare the distributions of TREs between the rigid and deformable registrations. The P value was 0.0057, indicating that the two populations were distinct and that the TREs after deformable registration were statistically significantly different from the TREs after rigid registration (i.e., P < 0.05).

3.2. Error anisotropy

The 95% standard error ellipsoid volumes for the 76 identified fiducial points in both the rigid and deformable registrations are given in Fig. 7. The ellipsoid volume in the rigid registration was 0.32 cm³, which was reduced to 0.11 cm³ in the deformable registration. Furthermore, 2D projections of the error ellipsoid for the rigid and deformable registrations are given in Fig. 8. To illustrate the anisotropy in the error, the lengths of the semi-principal axes of the ellipsoid volumes, the eigenvalues (λi) of the covariance matrix of D, and their ratios are given in Table 2 for the rigid and deformable registrations. Table 2 indicates that the eigenvalues were reduced after deformable registration; a2 (anterior-posterior), which lies along the direction of the endorectal coil pressure, was reduced by about 1.6 mm. Moreover, the error anisotropy was slightly improved by the deformable registration. It is evident from Figs. 7 and 8 and Table 2 that the anisotropy of the TRE is predominantly in the z-direction (superior-inferior). In the focal laser ablation therapy performed in our center, the laser fiber (needle) is inserted in the superior-inferior direction, with slight lateral angulation in some cases (Cepek et al., 2013b). Therefore, the needle placement error in the direction of insertion (z-direction), which is also the direction of imaging, is of less importance than in the other two directions (x- and y-directions).
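For reference, one common way to obtain such an error ellipsoid is to eigen-decompose the covariance of the 3D fiducial error vectors D and scale the semi-axes by the chi-square quantile for three degrees of freedom. The sketch below uses that construction; the exact scaling used for Fig. 7 and Table 2 is not stated here and may differ.

```python
import numpy as np
from scipy.stats import chi2

def error_ellipsoid(errors, confidence=0.95):
    """Semi-principal axes and volume of a confidence ellipsoid fitted to
    the 3D fiducial error vectors D (an N x 3 array).

    Assumes the common chi-square scaling for 3 degrees of freedom.
    """
    cov = np.cov(errors, rowvar=False)       # 3 x 3 covariance of D
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    scale = chi2.ppf(confidence, df=3)       # about 7.81 for 95%
    semi_axes = np.sqrt(scale * eigvals)     # a_i = sqrt(scale * lambda_i)
    volume = 4.0 / 3.0 * np.pi * np.prod(semi_axes)
    return semi_axes, eigvals, eigvecs, volume
```

The anisotropy ratios reported in Table 2 follow directly from the returned eigenvalues.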

Fig. 7. The 95% standard error ellipsoid volumes for the (a) rigid and (b) deformable registration. Scales are all in mm.


Table 2
The lengths of the semi-principal axes of the ellipsoid volumes, the eigenvalues (λi) of the covariance matrix of D, and their ratios.

                 Length of axes          Eigenvalues             Ratios
Registration     a1      a2      a3      λ1      λ2      λ3      λ3/λ1   λ3/λ2   λ2/λ1
Rigid            3.14    4.41    5.57    1.26    2.49    3.97    3.15    1.59    1.98
Deformable       2.22    2.86    4.02    0.73    1.21    2.40    3.28    1.97    1.66


Fig. 9. Average and standard deviation of the TREs for five different selections of the approximate corresponding points in 10 cases, shown for both the rigid and deformable registrations.


3.3. Variability due to the selection of approximate corresponding points


The average and standard deviation of the TRE for five different selections of approximate corresponding points in 10 cases are given in Fig. 9. The figure shows that the average and standard deviation of the TRE did not change significantly across the different selections of approximate corresponding points (tests). We also statistically analyzed the registration results and compared the results of each test with the others. Shapiro-Wilk normality test results showed that the TREs after both rigid and deformable registration in each selection of approximate corresponding points (test) were not sampled from a Gaussian population (P < 0.0001 in all 5 tests). Therefore, we used the Mann-Whitney (MW) non-parametric test to compare the distributions of TREs between the rigid and deformable registrations in each selection of approximate corresponding points, as well as between the different selections of approximate corresponding points. Statistical analysis showed that the TREs after deformable registration in each test were significantly different from the TREs after rigid registration (i.e., P < 0.05). However, the MW test applied to any two sets of TREs after rigid registration in the five different tests (10 possible pairings) showed that the rigid TREs for the different selections of approximate corresponding points were not significantly different (i.e., P > 0.05). Similarly, the TREs after deformable registration for the different selections of approximate corresponding points (10 possible pairings) were not significantly different. It should be noted that the deformable registration parameters and the tuning gain c remained unchanged for all registration tests.
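The same normality check and non-parametric comparison can be reproduced outside GraphPad Prism, for example with SciPy (a sketch; variable names are ours):

```python
from scipy import stats

def compare_tre(tre_rigid, tre_deformable, alpha=0.05):
    """Shapiro-Wilk normality check followed by a Mann-Whitney comparison."""
    _, p_norm_rigid = stats.shapiro(tre_rigid)
    _, p_norm_def = stats.shapiro(tre_deformable)
    # Non-parametric comparison, appropriate when normality is rejected.
    _, p_mw = stats.mannwhitneyu(tre_rigid, tre_deformable, alternative='two-sided')
    return {'shapiro_rigid': p_norm_rigid,
            'shapiro_deformable': p_norm_def,
            'mann_whitney': p_mw,
            'different': p_mw < alpha}
```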




Fig. 10. DSC in the WG, CG and PZ of the prostate images after rigid and deformable registration.

3.4. Volumetric measures

The DSC between the volumes of the WG, CG and PZ in the 17 cases was computed after rigid and deformable registration and is shown in Fig. 10 as bar graphs. The DSC in the apex, mid-gland and base of the prostate volumes is also plotted in Fig. 11. Table 3 summarizes Figs. 10 and 11, providing the average and std of the DSC in the WG, CG and PZ, as well as in the apex, mid-gland and base, after rigid and deformable registration. The average and std of the segmented CG and WG prostate volumes in the pre- and intra-treatment images are given in Table 4. Table 4 also provides the registration results based on the average and std of the coupled Recall/Precision metrics in different sub-volumes of the prostate images.

The average volumes of the CG and WG in the intra-treatment (1.5 T) images are larger than those in the pre-treatment (3 T) images. We speculate that this is due to the enlargement caused by the presence of a Foley catheter in the prostate during the treatment process, inserted to drain the bladder; there is no Foley catheter in the pre-treatment images. This difference could also be partly due to segmentation error in the intra-treatment images, which have poorer contrast at the prostate boundaries.

The average DSC was improved by deformable registration for all sub-volumes of the prostate, especially in the PZ and apex. The Recall/Precision values showed similar results: both the average Recall and Precision metrics were improved in the CG, PZ and WG after deformable registration. While the average Recall (sensitivity) decreased in the base of the prostate, the average Precision (specificity) increased there; in the mid-gland and apex, the average Recall increased and the average Precision slightly decreased after deformable registration.




3.5. Surface distances

The mean absolute distances (MAD) between the CG and WG surfaces after rigid and deformable registration are plotted in Fig. 12. Table 5 shows that both the average and std of the MAD between surfaces were slightly decreased after deformable registration.


3.6. Qualitative results


Fig. 11. DSC in the apex, mid-gland and base of the prostate images after rigid and deformable registration.


Axial views of the registered prostate images with tumor outlines for four example cases are given in Fig. 13. In each row of this figure, the pre-treatment image was rigidly (column (a)) and non-rigidly (column (c)) registered to the intra-treatment image in column (b). Tumor outlines were identified by a radiologist in the pre-treatment images; these outlines were transformed and overlaid on the images based on the obtained rigid and non-rigid displacement fields in columns (a) and (c), respectively. The deformed tumor outlines are also overlaid on the intra-treatment images in column (b). As seen in this figure, in most cases the tumor cannot be differentiated from the surrounding healthy tissue in the intra-treatment images. These cases demonstrate the need for an algorithm that can accurately register pre-treatment target volumes to intra-treatment images for MRI-guided targeted biopsy or focal therapy of prostate cancer.

Axial views of rigidly and non-rigidly registered images in the apex, mid-gland and base of the prostate are also given in Fig. 14 for an example case. In this figure, the results of rigid and deformable registration of the pre-treatment image to the intra-treatment images, i.e., column (b), are shown in columns (a) and (c), respectively.
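As an illustration of how an outline can be carried through a registration result, the sketch below maps contour points through a dense displacement field using trilinear interpolation; the array layout and function name are assumptions, not the implementation used in this work.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_points(points, displacement):
    """Move contour points (N x 3, voxel coordinates ordered z, y, x) by a
    dense displacement field of shape (3, Z, Y, X), also given in voxels.

    The displacement at each point is obtained by trilinear interpolation
    (order=1) of the three field components.
    """
    coords = points.T                                          # (3, N)
    shift = np.stack([map_coordinates(displacement[i], coords, order=1)
                      for i in range(3)])                      # interpolated (dz, dy, dx)
    return (coords + shift).T                                  # warped points, N x 3
```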

Table 3
Average and std of the DSC in different sub-sections of the prostate images.

                 DSC (%)
Registration     CG           PZ           WG           Apex         Mid-gland    Base
Rigid            83.4 ± 9.0   64.1 ± 8.4   86.3 ± 6.2   80.3 ± 7.8   90.9 ± 4.9   83.4 ± 5.9
Deformable       85.6 ± 7.6   68.7 ± 6.9   88.2 ± 5.3   84.0 ± 6.0   92.5 ± 4.3   83.8 ± 5.2

Table 4
Prostate volumes in the CG and WG for the pre-treatment (3 T) and intra-treatment (1.5 T) images, along with registration results based on Recall/Precision in different sub-sections of the prostate images.

Prostate volumes (cm³)
Imaging     CG            WG
3 T         25.7 ± 18.6   38.5 ± 19.5
1.5 T       26.8 ± 21.9   39.9 ± 23.6

Recall/Precision (%)
Sub-section   Rigid                      Deformable
CG            82.3 ± 8.2 / 84.8 ± 10.9   85.0 ± 7.2 / 86.5 ± 9.4
PZ            63.4 ± 8.9 / 65.1 ± 9.0    68.2 ± 7.7 / 69.6 ± 7.5
WG            85.3 ± 6.3 / 87.5 ± 7.3    87.5 ± 5.7 / 89.0 ± 6.3
Apex          78.3 ± 9.5 / 84.9 ± 13.7   86.2 ± 5.2 / 82.6 ± 10.1
Mid-gland     88.6 ± 7.6 / 93.7 ± 4.0    91.8 ± 7.6 / 93.6 ± 3.4
Base          84.0 ± 10.5 / 84.2 ± 7.6   79.6 ± 7.3 / 88.9 ± 6.5






(b) MAD in the WG Fig. 12. The MAD between surfaces in the CG and WG after rigid and deformable registration.

Table 5
The MAD between the CG and WG surfaces of the prostate images.

                 MAD (mm)
Registration     CG            WG
Rigid            1.48 ± 0.64   1.47 ± 0.64
Deformable       1.27 ± 0.55   1.26 ± 0.56

The registration results in this figure are consistent with the DSC (85.3% → 90.1%) and Recall/Precision (84.1/86.6% → 88.6/91.1%) metrics in the WG of the prostate for patient number 8. The deformable registration performed better in the apex and mid-gland than in the base of the prostate. Although deformable registration improved the Precision (specificity) in the base (72.1% → 80.1%), the Recall (sensitivity) decreased in this section (95.0% → 87.6%).

3.7. Computation time

The computational tasks involved in the proposed registration method can be categorized into three groups: common tasks before starting the registration, which are performed once for all image sets; tasks for registering any set of images; and post-registration tasks.

(a) Common tasks: The first group of tasks includes constructing the dynamic deformation model (forming the stiffness, damping and mass matrices, and modal reduction) and computing the Kalman gain based on the given grid of control points. We implemented all steps of the registration algorithm in MATLAB on a 3.5 GHz Intel(R) Core(TM) i7-3970X processor with 64.0 GB of RAM. It required 4 min to construct the deformation model and 28 min to compute the Kalman gain for the model and measurement vector given in Section 2.9. Since an identical deformation model and vector of measurements are employed for the registration of any image pair, these tasks are performed only once.

(b) Registration tasks: Based on the registration algorithm explained in Section 2.7, it required about 2 min to identify the four approximate matching points on the boundaries of the prostate in the two images, determine the approximate prostate region in the intra-treatment images and rigidly register the two images. For a given value of the tuning parameter c, the average computation time for the iterative elastic registration algorithm was about 2 min.

(c) Post-registration tasks: After the iterations end, it required 26 s to compute the deformed (high-resolution) pre-treatment image based on the estimated states of the deformation. Furthermore, computing the deformation of the pre-treatment tumor outline and overlaying it on the intra-treatment images took less than 5 s.

Thus, the total time required to perform both the rigid and deformable registration algorithms and compute the deformed registered image was about 5 min in a MATLAB implementation. A multi-threaded C++ implementation of our registration technique showed that both the rigid and non-rigid (real-time) registration routines can be performed in less than 20 s.

4. Discussion

MR images of the prostate acquired with an endorectal coil are usually obtained while the prostate tissue is deformed. The registration technique proposed in this paper was employed to register T2-weighted MR images of the prostate acquired at different phases of the cancer treatment and at different magnetic field strengths. The rigid transformation obtained from the registration consisted of translations along three axes and rotations. However, in our image data set, the computed rotations about the y (anterior-posterior) and z (superior-inferior) axes were negligible, while there was a 5-40 degree rotation about the x (right-left) axis between the pre- and intra-treatment image volumes.

The proposed non-rigid registration technique employs a generic dynamic deformation model to estimate the deformation of the prostate through a filtering process. Although the simple linear elastic deformation model employed in the registration does not accurately represent the geometry and physical properties of the prostate, its computational complexity in modeling and estimation is lower than that of complex nonlinear models. In the state estimation framework, modeling and observation uncertainties can be treated as unknown process and measurement disturbances; this provides an acceptable trade-off between accuracy and computation (time and cost). However, better registration results may be obtained using accurate nonlinear models, provided the applied forces and boundary conditions can be computed from the available image data. For instance, the geometry of the prostate and its sub-structures can be obtained from the segmentation of the pre-treatment images, e.g., see (Bharatha et al., 2001). Accurate mechanical properties of the prostate tissue can be estimated via elastography. Moreover, the applied boundary conditions can be computed based on intra-treatment image information and the user's knowledge of the anatomical structures of the prostate and its surrounding organs.

In the proposed registration method, a dynamic model is employed to estimate the deformation of the prostate. Dynamic deformation models have intuitively meaningful behavior when estimating the deformation of anatomical structures in the human body, which continuously undergo non-rigid motion due to interaction with the surrounding environment over time. In the registration of pre-treatment images of a tissue to its real-time intra-treatment images, dynamic deformation models provide a temporal and spatial correlation between images acquired at different times. An instance of state estimation-based registration using a dynamic deformation model was developed in (Marami et al., 2014c) for dynamically tracking the deformation of a soft tissue by registering pre-operative 3D MR images of a breast phantom to its real-time intra-operative 2D US images. In the application of this paper, however, the prostate deformation is static; in this case, the registration algorithm converges to a steady-state solution, i.e., Eq. (13).


Fig. 13. Axial views of the prostate images with tumor outlines: (a) rigidly aligned pre-treatment image, (b) intra-treatment image, and (c) non-rigidly registered pre-treatment image. These example cases are for patients number 8, 9, 12 and 14, from top to bottom, respectively.

Using a dynamic deformation model and discarding fast vibrational modes facilitates the state estimation process, helping the algorithm to converge smoothly in a few iterations. Estimating the deformation states using a static model is highly oscillatory over the iterations, and the algorithm does not converge in most cases. Although the deformation of the prostate is static in the experiments of this paper, the system's measurement vector (or the vector of observation prediction errors) changes through the iterations as the estimated deformation changes, i.e., Eq. (12). The dynamic model allows us to recursively correlate image information over the iterations and estimate, through the filtering process, the deformation field required to match the pre- and intra-treatment images. In state estimation for dynamical systems based on a particular set of measurements, the system has to be "observable" from those measurements (Chen, 1998). The observability of the system guarantees that the states can be estimated based on the measurement vector computed at the control points. Observability of linear time-invariant systems, such as the systems considered in this paper, can be investigated based on the rank of the observability matrix of the system (Chen, 1998). The system defined in this paper was observable even using a measurement vector defined on a small subset of the control points.
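As a reminder of the rank test referred to above, the following sketch checks observability of a small discrete linear time-invariant system; the matrices are toy examples, not the FEM-based model of this paper.

```python
import numpy as np

def is_observable(A, C):
    """Observability rank test for x[k+1] = A x[k], y[k] = C x[k] (Chen, 1998).

    The pair (A, C) is observable iff O = [C; CA; ...; CA^(n-1)] has rank n.
    """
    n = A.shape[0]
    blocks = [C]
    for _ in range(n - 1):
        blocks.append(blocks[-1] @ A)
    return np.linalg.matrix_rank(np.vstack(blocks)) == n

# Toy example: a two-state system observed through its first state only.
A = np.array([[1.0, 0.1],
              [0.0, 0.9]])
C = np.array([[1.0, 0.0]])
print(is_observable(A, C))   # True
```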

The rigid registration approach employed for the experiments of this paper provided a good trade-off between accuracy and the required time and manual intervention. Because of the initial alignment using four manually selected approximate corresponding points on the boundaries of the prostate, the average TRE in the PZ was lower than in the CG after MIND-based rigid registration. The average TRE in the CG was improved by 0.9 mm after deformable registration, while the improvement was 0.6 mm in the PZ. This indicates that, although the deformation in the PZ was larger than in the CG, the initial rigid alignment in the PZ was better than in the CG.

The TRE at 12 fiducial pairs out of the total 76 increased after deformable registration (e.g., patient #2), with an average increase of 0.35 mm at these points. Other than fiducial localization error (FLE), poor performance of the registration method in the specific region of the prostate where these fiducial points were located could be the cause of the increase in the TRE. In the case of patient #2, for instance, the fiducial points were located in the base of the prostate, while most of the deformation between the pre- and intra-treatment images occurred in the apex of the prostate.


Fig. 14. Axial views of the prostate images in the apex (top row), mid-gland (middle row) and base (bottom row): (a) rigidly aligned pre-treatment image, (b) intra-treatment images, and (c) non-rigidly registered pre-treatment image. This example case is for patient number 8.

Deformable registration using the proposed method improved the quality of matching in the apex and WG of the prostate of patient #2. However, the registration results after deformable registration were worse than those achieved by rigid registration in terms of the TRE and DSC in the base and mid-gland of the prostate. This suggests that the homogeneous linear elastic deformation model was less effective for the registration of the images obtained from patient #2. Poor image measurements, which are computed locally based on the employed distance metric, in some regions of the prostate could be another reason for the poor performance of the deformable registration in those regions. In the registration of all 17 prostate MR image pairs, the SSD of the MINDs between images was reduced after deformable registration; the amount of reduction was 3-10%. In one case that we did not include in the paper, the prostate volume increased by about 30% between the pre- and intra-treatment imaging sessions. In this case the algorithm failed to register the two images, and the distance metric increased during the iterations rather than decreasing. The slope and direction of change of the distance metric, especially at the beginning of the iterative registration algorithm (low-resolution images), could be considered as a measure of trust in the algorithm. Unusual deformation of the FE mesh after registration could also be a sign of failure of the registration algorithm.

Many applications of image registration to prostate images have been reported in the literature during the last 15 years. For the FE-based deformable registration method proposed in (Bharatha et al., 2001), Bharatha et al. reported an average TRE of 1.01 mm and 0.73 mm in the PZ and CG, respectively, for one manually identified point in each section; this was an improvement over 4.20 mm and 1.76 mm in the PZ and CG using only rigid registration. An MI-based registration method using a TPS transformation was applied to the registration of MR pelvic volumes including the prostate in (Fei et al., 2003); the method reduced the prostate centroid misalignment from 3.4 mm using only rigid registration to 0.6 mm. Wu et al. (2004) used an MI-based elastic registration method for the registration of prostate MR spectroscopy images acquired with and without an endorectal coil and obtained a maximum discrepancy of 2 mm or less. For the B-spline-based non-rigid registration method proposed in (Oguro et al., 2009), Oguro et al. reported a TRE of 2.3 ± 1.8 mm for the registration of 16 prostate MR images, with DSC values (%) after deformable registration of 91, 89 and 79 in the WG, CG and PZ, respectively. Moreover, Zhang et al. (2011) reported a TRE of 1.1 ± 0.4 mm for the elastic registration of prostate MR images. A fair comparison of different registration methods based on the results given in the literature is not possible because they use different approaches for image acquisition and evaluation.

A cubic B-spline model-based registration method (Thévenaz and Unser, 2000) was implemented by our colleagues at the Centre for Imaging Technology Commercialization (CIMTEC) using the Insight Segmentation and Registration Toolkit (ITK). They applied the registration method to the same imaging data as this paper and used MI as the similarity metric between images. The registration results presented in our paper were far better, in terms of the TRE, than what they achieved with that registration technique. The results from our registration technique are comparable to the methods reported in the literature; moreover, our registration method provides an acceptable trade-off between registration accuracy and computational effort (time and cost), and it requires minimal manual intervention from the user.

Prostate tumors with a volume greater than 0.5 cm³ are considered to be clinically significant (Epstein et al., 2005; Wolters et al., 2011). van de Ven et al. (2011) showed that the required accuracy (TRE) for hitting 90% of aggressive spherical tumors in image-guided targeted biopsy is 2.8 mm.


Their estimation in (van de Ven et al., 2013) showed that for correctly grading 95% of PZ prostate cancers in MRI-targeted, TRUS-guided biopsy, the TRE should be less than 1.9 mm. The average TRE of our proposed method was 1.9 mm, which indicates that the accuracy of the registration technique is in an acceptable range for MRI-guided biopsy or targeted focal ablation therapy of clinically significant prostate tumors.

5. Conclusions

We proposed a combined rigid and deformable registration method to align pre-treatment 3 T T2-weighted MR images of the prostate with intra-treatment 1.5 T T2-weighted MR images. The proposed elastic registration method estimates the deformation states of the prostate using a generic linear elastic FEM-based model through a filtering process. Both the rigid and deformable registrations employ an intensity-based similarity metric, namely MIND. The registration algorithm does not require segmentation of the prostate boundaries and involves minimal manual intervention from the user. Experimental results showed that the proposed deformable registration method improves the quality of matching in all sub-regions of the prostate, especially in the PZ, where the endorectal coil deforms the prostate during the imaging sessions. Most of the computations involved in the registration algorithm are highly amenable to parallel implementation using graphics processing units (GPUs). An earlier version of the registration algorithm was implemented on a GPU (Mousazadeh, 2011) and a 38-fold speedup was achieved over an optimized C-based CPU implementation. Implementing the proposed algorithm on a GPU architecture to speed up the registration method is a path for future research.

Acknowledgments

The authors gratefully acknowledge funding from the Ontario Institute for Cancer Research (OICR) and the Canadian Institutes of Health Research (CIHR). A. Fenster holds a Canada Research Chair in Biomedical Engineering, and acknowledges the support of the Canada Research Chair Program. The authors would like to thank Dr. Eli Gibson, Dr. Tharindu De Silva and Yue Sun for their valuable comments and help with this work.

References

Alterovitz, R., Goldberg, K., Pouliot, J., Hsu, I.C.J., Kim, Y., Noworolski, S.M., Kurhanewicz, J., 2006. Registration of MR prostate images with biomechanical modeling and nonlinear parameter estimation. Med. Phys. 33, 446.
Bathe, K., 1996. Finite Element Procedures. Prentice Hall, Englewood Cliffs, NJ.
Bharatha, A., Hirose, M., Hata, N., Warfield, S.K., Ferrant, M., Zou, K.H., Suarez-Santana, E., Ruiz-Alzola, J., D'Amico, A., Cormack, R.A., et al., 2001. Evaluation of three-dimensional finite element-based deformable registration of pre- and intraoperative prostate imaging. Med. Phys. 28, 2551.
Brock, K., Sharpe, M., Dawson, L., Kim, S., Jaffray, D., 2005. Accuracy of finite element model-based multi-organ deformable image registration. Med. Phys. 32, 1647.
Brock, K.K., Nichol, A.M., Ménard, C., Moseley, J.L., Warde, P.R., Catton, C.N., Jaffray, D.A., 2008. Accuracy and sensitivity of finite element model-based deformable registration of the prostate. Med. Phys. 35, 4019.
Cepek, J., Chronik, B.A., Lindner, U., Trachtenberg, J., Davidson, S.R., Bax, J., Fenster, A., 2013a. A system for MRI-guided transperineal delivery of needles to the prostate for focal therapy. Med. Phys. 40, 012304.
Cepek, J., Lindner, U., Davidson, S.R., Haider, M.A., Ghai, S., Trachtenberg, J., Fenster, A., 2013b. Treatment planning for prostate focal laser ablation in the face of needle placement uncertainty. Med. Phys. 41, 013301.
Chen, C.T., 1998. Linear System Theory and Design. Oxford University Press, Inc.
Dice, L.R., 1945. Measures of the amount of ecologic association between species. Ecology 26, 297–302.
Epstein, J.I., Sanderson, H., Carter, H.B., Scharfstein, D.O., 2005. Utility of saturation biopsy to predict insignificant cancer at radical prostatectomy. Urology 66, 356–360.
Estrada, F.J., 2005. Advances in computational image segmentation and perceptual grouping. Ph.D. thesis. University of Toronto.

Fei, B., Kemper, C., Wilson, D.L., 2003. A comparative study of warping and rigid body registration for the prostate and pelvic MR volumes. Comput. Med. Imag. Graph. 27, 267–281.
Fitzpatrick, J.M., West, J.B., 2001. The distribution of target registration error in rigid-body point-based registration. IEEE Trans. Med. Imag. 20, 917–927.
Franklin, G.F., Workman, M.L., Powell, D., 1997. Digital Control of Dynamic Systems. Addison-Wesley Longman Publishing Co., Inc.
Gao, Y., Rathi, Y., Bouix, S., Tannenbaum, A., 2012. Filtering in the diffeomorphism group and the registration of point sets. IEEE Trans. Image Process. 21, 4383–4396.
Hajnal, J.V., Hill, D.L., 2010. Medical Image Registration. CRC Press.
Hata, N., Jinzaki, M., Kacher, D., Cormak, R., Gering, D., Nabavi, A., Silverman, S.G., D'Amico, A.V., Kikinis, R., Jolesz, F.A., et al., 2001. MR imaging-guided prostate biopsy with surgical navigation software: device validation and feasibility. Radiology 220, 263–268.
Heinrich, M.P., Jenkinson, M., Bhushan, M., Matin, T., Gleeson, F.V., Brady, S.M., Schnabel, J.A., 2012. MIND: modality independent neighbourhood descriptor for multi-modal deformable registration. Med. Image Anal. 16, 1423–1435.
Hensel, J.M., Ménard, C., Chung, P.W., Milosevic, M.F., Kirilova, A., Moseley, J.L., Haider, M.A., Brock, K.K., 2007. Development of multiorgan finite element-based prostate deformation model enabling registration of endorectal coil magnetic resonance imaging for radiotherapy planning. Int. J. Radiation Oncol. Biol. Phys. 68, 1522–1528.
Irwin, M.R., Downey, D.B., Gardi, L., Fenster, A., 2008. Registered 3D ultrasound and digital stereotactic mammography for breast biopsy guidance. IEEE Trans. Med. Imag. 27, 391–401.
Karnik, V.V., Fenster, A., Bax, J., Gardi, L., Gyacskov, I., Montreuil, J., Romagnoli, C., Ward, A.D., 2010. Evaluation of inter-session 3D-TRUS to 3D-TRUS image registration for repeat prostate biopsies. In: Medical Image Computing and Computer-Assisted Intervention–MICCAI 2010. Springer, pp. 17–25.
Kozlowski, P., Chang, S.D., Meng, R., Mädler, B., Bell, R., Jones, E.C., Goldenberg, S.L., 2010. Combined prostate diffusion tensor imaging and dynamic contrast enhanced MRI at 3T–quantitative correlation with biopsy. Magnetic Reson. Imag. 28, 621–628.
Lindner, U., Lawrentschuk, N., Weersink, R.A., Davidson, S.R., Raz, O., Hlasny, E., Langer, D.L., Gertner, M.R., Van der Kwast, T., Haider, M.A., Trachtenberg, J., 2010. Focal laser ablation for prostate cancer followed by radical prostatectomy: validation of focal therapy and imaging accuracy. Eur. Urol. 57, 1111–1114.
Lindner, U., Weersink, R., Haider, M., Gertner, M., Davidson, S., Atri, M., Wilson, B., Fenster, A., Trachtenberg, J., 2009. Image guided photothermal focal therapy for localized prostate cancer: phase I trial. J. Urol. 182, 1371–1377.
Marami, B., Ghoul, S., Sirouspour, S., Capson, D.W., Davidson, S.R., Trachtenberg, J., Fenster, A., 2014a. Elastic registration of prostate MR images based on state estimation of dynamical systems. In: SPIE Medical Imaging, International Society for Optics and Photonics, pp. 90340F–90340F.
Marami, B., Sirouspour, S., Capson, D.W., 2011a. Model-based 3D/2D deformable registration of MR images. In: 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC. IEEE, pp. 4880–4883.
Marami, B., Sirouspour, S., Capson, D.W., 2011b. Model-based deformable registration of preoperative 3D to intraoperative low-resolution 3D and 2D sequences of MR images. In: Medical Image Computing and Computer-Assisted Intervention–MICCAI 2011. Springer, pp. 460–467.
Marami, B., Sirouspour, S., Capson, D.W., 2014b. Non-rigid registration of medical images based on estimation of deformation states. Phys. Med. Biol. 59, 6891.
Marami, B., Sirouspour, S., Fenster, A., Capson, D.W., 2014c. Dynamic tracking of a deformable tissue based on 3D-2D MR-US image registration. In: SPIE Medical Imaging, International Society for Optics and Photonics, pp. 90360T–90360T.
Martin, D.R., Fowlkes, C.C., Malik, J., 2004. Learning to detect natural image boundaries using local brightness, color, and texture cues. IEEE Trans. Pattern Anal. Machine Intell. 26, 530–549.
McInerney, T., Terzopoulos, D., 1996. Deformable models in medical image analysis: a survey. Med. Image Anal. 1, 91–108.
McNeal, J.E., Redwine, E.A., Freiha, F.S., Stamey, T.A., 1988. Zonal distribution of prostatic adenocarcinoma: correlation with histologic pattern and direction of spread. Am. J. Surgical Pathol. 12, 897–906.
Modersitzki, J., 2009. FAIR: Flexible Algorithms for Image Registration, vol. 6. Society for Industrial and Applied Mathematics.
Moghari, M.H., Abolmaesumi, P., 2007. Point-based rigid-body registration using an unscented Kalman filter. IEEE Trans. Med. Imag. 26, 1708–1728.
Mousazadeh, M.H., 2011. Fast 3D deformable image registration on a GPU computing platform. Open Access Dissertations and Theses.
Nir, G., Sahebjavaher, R., Kozlowski, P., Chang, S., Sinkus, R., Goldenberg, L., Salcudean, S., et al., 2013. Model-based registration of ex vivo and in vivo MRI of the prostate using elastography. IEEE Trans. Med. Imag.
Oguro, S., Tokuda, J., Elhawary, H., Haker, S., Kikinis, R., Tempany, C., Hata, N., 2009. MRI signal intensity based B-Spline nonrigid registration for pre- and intraoperative imaging during prostate brachytherapy. J. Magnetic Reson. Imag. 30, 1052–1058.
Oto, A., Sethi, I., Karczmar, G., McNichols, R., Ivancevic, M.K., Stadler, W.M., Watson, S., Eggener, S., 2013. MR imaging-guided focal laser ablation for prostate cancer: phase I trial. Radiology 267, 932–940.
Pennec, X., Thirion, J.P., 1997. A framework for uncertainty and validation of 3D registration methods based on points and frames. Int. J. Comput. Vis. 25, 203–229.

Qiu, W., Yuan, J., Ukwatta, E., Tessier, D., Fenster, A., 2012. Rotational-slice-based prostate segmentation using level set with shape constraint for 3D end-firing TRUS guided biopsy. In: Medical Image Computing and Computer-Assisted Intervention–MICCAI 2012. Springer, pp. 537–544.
Raz, O., Haider, M.A., Davidson, S.R., Lindner, U., Hlasny, E., Weersink, R., Gertner, M.R., Kucharcyzk, W., McCluskey, S.A., Trachtenberg, J., 2010. Real-time magnetic resonance imaging-guided focal laser therapy in patients with low-risk prostate cancer. Eur. Urol. 58, 173–177.
Rueckert, D., Sonoda, L.I., Hayes, C., Hill, D.L., Leach, M.O., Hawkes, D.J., 1999. Nonrigid registration using free-form deformations: application to breast MR images. IEEE Trans. Med. Imag. 18, 712–721.
Salcudean, S., Sahebjavaher, R., Goksel, O., Baghani, A., Mahdavi, S., Nir, G., Sinkus, R., Moradi, M., 2012. Biomechanical modeling of the prostate for procedure guidance and simulation. In: Soft Tissue Biomechanical Modeling for Computer Assisted Surgery. Springer, pp. 169–198.
Tahmasebi, A.M., Sharifi, R., Agarwal, H.K., Turkbey, B., Bernardo, M., Choyke, P., Pinto, P., Wood, B., Kruecker, J., 2012. A statistical model-based technique for accounting for prostate gland deformation in endorectal coil-based MR imaging. In: 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, pp. 5412–5415.
Tan, N., Margolis, D.J., McClure, T.D., Thomas, A., Finley, D.S., Reiter, R.E., Huang, J., Raman, S.S., 2012. Radical prostatectomy: value of prostate MRI in surgical planning. Abdominal Imag. 37, 664–674.
Thévenaz, P., Unser, M., 2000. Optimization of mutual information for multiresolution image registration. IEEE Trans. Image Process. 9, 2083–2099.
Torlakovic, G., Grover, V.K., Torlakovic, E., 2005. Easy method of assessing volume of prostate adenocarcinoma from estimated tumor area: using prostate tissue density to bridge gap between percentage involvement and tumor volume. Croat. Med. J. 46, 423–428.
van de Ven, W.J., Hulsbergen-van de Kaa, C.A., Hambrock, T., Barentsz, J.O., Huisman, H.J., 2013. Simulated required accuracy of image registration tools for targeting high-grade cancer components with prostate biopsies. Eur. Radiol. 23, 1401–1407.
van de Ven, W.J., Litjens, G.J., Barentsz, J.O., Hambrock, T., Huisman, H.J., 2011. Required accuracy of MR-US registration for prostate biopsies. In: Prostate Cancer Imaging. Image Analysis and Image-Guided Interventions. Springer, pp. 92–99.
Welch, G., Bishop, G., 1995. An introduction to the Kalman filter.
Wolters, T., Roobol, M.J., van Leeuwen, P.J., van den Bergh, R.C., Hoedemaeker, R.F., van Leenders, G.J., Schröder, F.H., van der Kwast, T.H., 2011. A critical analysis of the tumor volume threshold for clinically insignificant prostate cancer using a data set of a randomized screening trial. J. Urol. 185, 121–125.
Wu, X., Dibiase, S.J., Gullapalli, R., Yu, C.X., 2004. Deformable image registration for the use of magnetic resonance spectroscopy in prostate treatment planning. Int. J. Radiat. Oncol. Biol. Phys. 58, 1577–1583.
Zhang, B., Arola, D.D., Roys, S., Gullapalli, R.P., 2011. Three-dimensional elastic image registration based on strain energy minimization: application to prostate magnetic resonance imaging. J. Digit. Imag. 24, 573–585.
Zhang, M., Nigwekar, P., Castaneda, B., Hoyt, K., Joseph, J.V., di Sant'Agnese, A., Messing, E.M., Strang, J.G., Rubens, D.J., Parker, K.J., 2008. Quantitative characterization of viscoelastic properties of human prostate correlated with histology. Ultrasound Med. Biol. 34, 1033–1042.
Zienkewickz, O., Taylor, R., 1987. The Finite Element Method. McGraw Hill.