Accepted Manuscript

Title: Biopsy Needle Tracking Technique in US Images
Authors: Joanna Czajkowska, Bartłomiej Pyciński, Jan Juszczyk, Ewa Pietka
PII: S0895-6111(17)30066-6
DOI: http://dx.doi.org/doi:10.1016/j.compmedimag.2017.07.001
Reference: CMIG 1523
To appear in: Computerized Medical Imaging and Graphics
Received date: 2-3-2017
Revised date: 23-6-2017
Accepted date: 18-7-2017

Please cite this article as: Joanna Czajkowska, Bartłomiej Pyciński, Jan Juszczyk, Ewa Pietka, Biopsy Needle Tracking Technique in US Images, (2017), http://dx.doi.org/10.1016/j.compmedimag.2017.07.001

This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.
Biopsy Needle Tracking Technique in US Images

Joanna Czajkowska*, Bartłomiej Pyciński, Jan Juszczyk, Ewa Pietka

Department of Computer Science and Medical Equipment, Faculty of Biomedical Engineering, Silesian University of Technology, ul. Roosevelta 40, 41-800 Zabrze, Poland
Abstract

The rapid development of imaging techniques in recent decades has made intra-operative visualization an integral part of surgical tools. Therefore, ongoing research activities still focus on efficient and robust analysis of ultrasound images. The paper meets these requirements, targeting the detection of the biopsy needle, the estimation of the needle trajectory, and the tracking of the needle tip motion inside the examined tissue. The developed novel method uses ultrasound data supported by elastography images. The investigated detection algorithm introduces the Histogram of Oriented Gradients and image entropy, whereas the tracking part employs the Hough transform, Gabor filtering and the Kanade-Lucas-Tomasi optical flow estimation technique. The developed methodology is verified by a stereoscopic navigation system. The verification phase proves an accuracy of 3-5 mm and encourages further improvement of the methods.
Keywords: Ultrasound, Elastography, Histogram of Oriented Gradients, Detection, Object Tracking, Biopsy Needle, Hough Transform, Gabor Filter, Kanade-Lucas-Tomasi Optical Flow, Optical Tracker

1. Introduction
A biopsy procedure is the removal of a small amount of tissue for laboratory (histopathologic) examination. Common needle biopsy procedures include fine-needle aspiration and core needle biopsy, where the tissue is obtained by inserting a special type of needle through the skin or the mucosa. In any case, it is extremely important to perform the biopsy procedure as precisely as possible; otherwise, vital structures contiguous to the target point, such as vessels or nerves, might be injured.

*Corresponding author. Email address: [email protected] (Joanna Czajkowska)
Therefore, the "blind" biopsy is becoming less and less popular. Instead, the operator incorporates fluoroscopy or ultrasound (US) imaging. The latter is more favourable due to its low price, wide availability and the possibility of positioning the needle in the US plane. However, if the image plane rotates, the needle may disappear. The goal of the current study is to develop a method able to detect and locate a biopsy needle whose visibility is limited. The detection may include the prediction of the needle trajectory as well as its location in the image (while being only partially visible). Once the target is marked, its distance from the needle tip is provided. The algorithms strongly depend on some preliminary assumptions, e.g. a constant needle angle or a given entry point in the image. The accuracy provided by the system also depends on the tracking method, e.g. electromagnetic systems return significantly higher errors than optical devices [1]. The expected accuracy depends on the type of procedure, the dimensions of the target, the neighbouring structures, etc.
Prior Work in Biopsy Needle Detection and Tracking

To avoid tissue damage, the percutaneous procedures commonly used in medical practice require precise needle detection. The visibility of the needle plays an important role in the success of a biopsy. Therefore, the problem of needle detection and tracking is widely explored in the literature. Due to the low US image quality, low contrast and multiple artefacts, it still remains unsolved and is the target of current research [2, 3]. The most common and widely explored approach is an initial binarisation of the US image followed by further steps, e.g. fitting a straight line to the segmented points [4]. Another popular technique, which focuses on elongated shape extraction, is the Hough transform [5], whose modifications include a coarse-fine strategy [6] or randomization [7]. In 3D US data analysis, the Hough transform, being not robust to noisy backgrounds, is often replaced by the RANSAC algorithm [8]. However, for 2D images, the Hough transform still yields promising results. Kaya et al. [2] and Pourtaherian et al. [9] describe the attractive capabilities of the Gabor filter in detecting a needle of specified thickness and insertion angle. To improve the detection stability in tracking algorithms, the Gabor filtering results are followed by tracking procedures using the Kalman filter [10] or Gradient Descent error minimization [9]. Chatelain et al. [11] track the needle in 3D US, applying an intensity difference technique combined with the RANSAC algorithm and Kalman filters. The problem of a partially visible needle is addressed by applying the Histogram of Oriented Gradients (HoG) descriptor [12, 13]. It outperforms existing feature sets and finds numerous applications in different image recognition areas [14, 15], including the medical image registration field [16]. The evaluation step employed in [13] has proven that the needle insertion distorts the tissue structure, and that the algorithms dedicated to tracking
moving objects on a fixed background are not robust enough. Since in medical practice the percutaneous procedures are still supported mainly by 2D US imaging, our current study focuses on needle detection and tracking in 2D US data. It extends the previous studies [12, 13], which focused on partially visible needles, in order to solve the problem of needle area detection. In the current approach, images with both partially and well visible needles have been considered. Moreover, the needle trajectories are now estimated, and in the well visible tool scenario a needle tip tracking algorithm is introduced. The workflow employs the HoG feature extraction technique, the Hough transform, Gabor filtering and a single point tracking algorithm [17]. It expands the existing set-up by introducing a stereoscopic navigation system. The needle tip localisations given by the tracking algorithm and the navigation system are combined. The paper is organized as follows. Section 2 outlines the methodology: trajectory estimation for partially visible needles and needle tip tracking for well visible objects. The navigation system is presented in Sec. 3, followed by the evaluation of both paths in Sec. 4. The discussion and conclusion summarize the study.
2. Methods
The methodology is divided into two separate analysis paths, dedicated to the different types of data acquired during the biopsy procedure (Figure 1). The partially visible needle case is subjected to trajectory estimation, whereas a clear needle appearance in the image allows a more detailed analysis yielding needle tip tracking during the biopsy procedure. In both paths, the first step is based on the entropy and the Histogram of Oriented Gradients (HoG). This results in an accurate detection of: (1) the needle puncture moment (Section 2.1) and (2) the needle area (Section 2.2). In Figure 1 they are denoted as "Needle Puncture Detection" and "Needle Detection", respectively. The novelty of the first analysis path is the estimated trajectory of the inserted needle (Section 2.3). The second path (Section 2.4) targets the biopsy needle tip. The "Needle Detection" step (Figure 1) is followed by the Hough transform in order to find the tool and to improve the trajectory estimation technique. The estimated angle of insertion reduces the region of interest and makes it possible to construct the Gabor filter mask applied at the detection step. Finally, the eigenvalues of the gradient image [18] make the detection of the needle tip possible. Its tracking is performed by employing the Kanade-Lucas-Tomasi optical flow estimation method [17]. The experiment has been carried out on four different US phantoms built in house.
Figure 1: Flow chart of the two-path US image processing method. After pre-processing, both paths share the Needle Puncture Detection and Needle Detection steps; Path 1 (hardly visible needle) proceeds to Trajectory Estimation, whereas Path 2 (well visible needle) proceeds to Trajectory Calculation, Needle Tip Detection and Needle Tip Tracking.
The first two are jelly phantoms: one has a homogeneous jelly structure (1), and the other consists of four jelly layers of different acoustic impedance (2); phantoms (3) and (4) are turkey and porcine meat pieces, respectively. Due to their accessibility and their US appearance similar to human body parts, such phantoms are often employed as references for the evaluation of US image processing tools [19]. The turkey phantom is rather homogeneous, whereas the porcine one reflects a fibrous structure. A Philips iU22 ultrasound device with a linear L12-5 probe was used for image acquisition. To improve the obtained results and to evaluate the algorithm, the stereoscopic navigation system Polaris Vicra (Northern Digital Inc., ON, Canada) is used. Since the needle is not rigidly fixed to the handle, the position of the needle tip may not be found precisely, as described in Section 3. Thus, the detection accuracy is evaluated by combining the navigation system and needle tracker results.

2.1. Needle Puncture Detection

The Needle Puncture Detection step combines US elastography [20] with B-mode data and makes it possible to detect the tissue disorder caused by the needle puncture. Thus, a global image feature reflecting the image uniformity is employed. It is based on the
entropy analysis of the acquired grey-scale elastograms, obtained as the "Value" channel of the RGB-to-HSV conversion. The detection algorithm applies the Weighted Fuzzy C-Means clustering technique [21] to identify the part of the US elastography recording acquired before the needle touches the tissue.
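The idea admits a compact sketch. The fragment below (Python; illustrative only, not the paper's MATLAB implementation) computes the frame-wise entropy of the elastogram "Value" channel and uses a plain two-mean split as a stand-in for the Weighted Fuzzy C-Means step of [21]; all names and parameters are our assumptions.

```python
import numpy as np

def frame_entropy(elastogram_rgb):
    """Shannon entropy of the HSV 'Value' channel of an RGB elastogram."""
    v = elastogram_rgb.max(axis=2)          # V channel of RGB->HSV is max(R,G,B)
    hist, _ = np.histogram(v, bins=256, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def detect_puncture(frames):
    """Index of the first frame after the needle disturbs the tissue.

    `frames` is a time-ordered list of RGB elastograms. The puncture changes
    the elastogram uniformity, so the entropies form two groups; a crude 1-D
    two-means split (stand-in for WFCM) labels them. Assumes both clusters
    stay non-empty, i.e. the entropies actually vary.
    """
    e = np.array([frame_entropy(f) for f in frames])
    c0, c1 = e.min(), e.max()
    for _ in range(50):
        lab = np.abs(e - c0) < np.abs(e - c1)
        c0, c1 = e[lab].mean(), e[~lab].mean()
    labels = np.abs(e - c0) < np.abs(e - c1)
    # the pre-puncture frames form the leading run of one cluster;
    # argmax returns the first label change (0 if none occurs)
    return int(np.argmax(labels != labels[0]))
```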
2.2. Needle Detection

The next step is performed on the B-mode US image only and results in the detection of the region that contains the needle. Since the tissue deformation changes the local gradients [22], it can be detected without precise knowledge of the needle position or the tissue distortion details. Various image gradient descriptors, including HoG and SIFT (Scale-Invariant Feature Transform), have been considered. Since the SIFT descriptor is usually used for matching local regions chosen by an interest point detector, whereas HoG is applied in a sliding-window fashion for object classification, the HoG descriptor is employed in this step.

To calculate the HoG feature vector, the analysed image is divided into small regions called HoG cells of size $[s \times s]$ pixels, where $s$ is the HoG scale, $s \in \{128, 64, 32, 16, 8, 4\}$, depending on the image resolution and the object size. A local histogram of gradient directions or edge orientations is calculated over the pixels of each cell. The feature vector is then estimated for each cell in four steps [22]: (1) cell image normalization; (2) gradient image computation; (3) edge orientation histogram calculation; and (4) local contrast normalization. The estimated HoG feature vectors of the training data (the US images registered just before the needle puncture, see Sec. 2.1) and of the consecutive images are compared.

The needle detection algorithm consists of two steps: (1) a training procedure with HoG parameter estimation and (2) deformation area detection. The training procedure is performed on the B-mode US images acquired before the needle puncture, denoted as $I_c$. The average image $I_m$ serves for the HoG parameter definition, which describes the examined tissue. Two threshold vectors, $h^l_{min}$ and $h^l_{max}$, of the HoGs estimated for each image cell $l$ ($l = 1, 2, \ldots, M$, where $M$ is the number of HoG cells in image $I_c(i)$, $i = 1, 2, \ldots, k$) are given as:

$$h^l_{min} = \min_{i=1,\ldots,k} |h^l_m - h^l_c(i)| \qquad (1)$$

and

$$h^l_{max} = \max_{i=1,\ldots,k} |h^l_m - h^l_c(i)|, \qquad (2)$$

where $h^l_m$ and $h^l_c(i)$ are the HoG features of the average image $I_m$ and of the $i$-th training frame, respectively. To detect the needle area, the HoG features are then estimated for each B-mode US frame labelled as the state after the needle puncture and compared with the
Figure 2: Exemplary detection results (left) compared with a manual needle delineation (right)
training data. The new feature vector $v^l_I$, estimated for each cell of the analysed US image, is defined as:

$$v^l_I = [v^l_I(1), \ldots, v^l_I(D)], \qquad (3)$$

where $D$ is the number of HoG directions estimated for every single HoG cell and

$$v^l_I(d) = \begin{cases} 1 & \text{if } |h^l_m(d) - h^l_I(d)| < h^l_{min}(d) \text{ or } |h^l_m(d) - h^l_I(d)| > h^l_{max}(d) \\ 0 & \text{otherwise,} \end{cases} \qquad (4)$$

where $d = 1, 2, \ldots, D$, $h^l_I(d)$ is the HoG feature calculated for the analysed US frame $I$, image cell $l$ and direction $d$, $h^l_m(d)$ is the HoG feature estimated for the average image in direction $d$, and $h^l_{min}(d)$ and $h^l_{max}(d)$ are the HoG boundary parameters given by eq. (1) and eq. (2), respectively, estimated in direction $d$. The detection area consists of the HoG cells for which 7 of the 9 components of vector $v^l_I$ are set to 1, restricted to the biggest object in the mask created by the analysed cells; therefore the impact of small false positives is reduced. To also reduce the influence of false negative detections, the detection results of the currently analysed image are compared with the outcomes of 4 neighbouring images. The final detection area consists of those HoG cells which are assigned to the deformation region in 3 out of 5 consecutive images. Exemplary detection results compared with a manual needle selection are shown in Figure 2.
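A minimal sketch of the training and detection scheme of eqs. (1)-(4), assuming scikit-image's HoG as a stand-in for the authors' feature extraction; the $D = 9$ directions and the 7-of-9 vote follow the text, while the cell size, function names and averaging details are illustrative.

```python
import numpy as np
from skimage.feature import hog

D, CELL = 9, (16, 16)   # number of directions per cell; cell size (assumed)

def cell_hogs(img):
    """Per-cell HoG of a grey-scale frame, shape (rows, cols, D)."""
    h = hog(img, orientations=D, pixels_per_cell=CELL,
            cells_per_block=(1, 1), feature_vector=False)
    return h.reshape(h.shape[0], h.shape[1], D)

def train(pre_puncture_frames):
    """Eqs. (1)-(2): per-cell min/max deviation from the average image."""
    h_m = cell_hogs(np.mean(pre_puncture_frames, axis=0))
    devs = np.stack([np.abs(h_m - cell_hogs(f)) for f in pre_puncture_frames])
    return h_m, devs.min(axis=0), devs.max(axis=0)   # h_m, h_min, h_max

def detect(frame, h_m, h_min, h_max):
    """Eqs. (3)-(4): mark cells where at least 7 of the 9 directions fall
    outside the trained deviation band; returns a boolean cell mask."""
    dev = np.abs(h_m - cell_hogs(frame))
    v = (dev < h_min) | (dev > h_max)                # eq. (4), per direction
    return v.sum(axis=2) >= 7
```

The largest connected component of the returned mask and the 3-of-5 temporal vote described above would then be applied on top of this per-frame result.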
2.3. Path 1

The detection step results in a region of interest containing the inserted needle. The example above shows its effectiveness in the case of a partially or even invisible needle. For such data, the last part of the analysis is limited to the needle trajectory estimation (Figure 1).

Figure 3: The estimated trajectory. The dashed line is the needle trajectory estimated for a single US image, whereas the blue arrow is the final result, i.e. the trajectory obtained for the exemplary US recording.
2.3.1. Trajectory Estimation

At this step we assume that the inserted part of the needle is rigid. Moreover, the stiffness of the tissue makes it hardly possible to change the needle trajectory during the biopsy procedure. Based on this assumption, the final needle trajectory is estimated from all the trajectories found in the US images. The trajectory for a single US image is determined by the angle between the x-axis and the major axis of the ellipse that has the same second moments as the detected needle area (see Figure 3).
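The second-moment ellipse orientation is readily available in standard libraries; the sketch below uses scikit-image's regionprops as our stand-in (not the authors' code), and the median over frames is our assumption for combining the per-image trajectories into the final one.

```python
import numpy as np
from skimage.measure import label, regionprops

def trajectory_angle(mask):
    """Angle (deg) between the image x-axis and the major axis of the
    second-moment ellipse of the largest detected region (assumed non-empty)."""
    regions = regionprops(label(mask.astype(int)))
    r = max(regions, key=lambda p: p.area)
    # regionprops' orientation is measured against the row (y) axis;
    # convert to the angle against the x-axis used in the paper
    return 90.0 - np.degrees(r.orientation)

def final_trajectory(masks):
    """Needle assumed rigid, so a robust average (here: the median) over the
    per-frame estimates gives the final trajectory angle."""
    return float(np.median([trajectory_angle(m) for m in masks]))
```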
2.4. Path 2

Path 2 is followed when the biopsy needle appears fully or partially in the B-mode US image. This procedure results in a more accurate trajectory assessment, yielding also the detection and tracking of the needle tip.

2.4.1. Image Pre-processing

To reduce artefacts and noise, a pre-processing step applying a shock filter [23] is introduced. It is based on the sharpening formula by Osher and Rudin [24] applied to a blurred image $u_0$:

$$\frac{\partial u}{\partial t} = -\mathrm{sign}(\Delta u_G)\,|\nabla u_G|, \qquad (5)$$

where

$$\nabla u_G = \frac{\partial u_G}{\partial x}\,\hat{\imath} + \frac{\partial u_G}{\partial y}\,\hat{\jmath} \qquad (6)$$

and $u_G$ is the convolution of image $u_0$ with a Gaussian filter. The iterative filtering procedure, where $i$ denotes the current iteration, can then be summarized as

$$u_G(i) = u_G(i-1) - 0.6\,\mathrm{sign}(\Delta u_G(i-1))\,|\nabla u_G(i-1)|. \qquad (7)$$
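A literal sketch of the iteration in eq. (7): the 0.6 step size follows the text, while the Gaussian width and the iteration count are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def shock_filter(u0, sigma=1.5, n_iter=10, dt=0.6):
    """Osher-Rudin shock sharpening of a blurred image u0, per eqs. (5)-(7)."""
    u = gaussian_filter(u0.astype(float), sigma)     # u_G: Gaussian-smoothed input
    for _ in range(n_iter):
        gy, gx = np.gradient(u)
        grad_mag = np.hypot(gx, gy)                  # |grad u_G|, eq. (6)
        u = u - dt * np.sign(laplace(u)) * grad_mag  # eq. (7)
    return u
```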
2.4.2. Trajectory Calculation

The trajectory detection is performed by a filter cascade. First, the Canny edge detector [25] is applied. Thanks to an explicit voting procedure over a set of parametrized image objects, the edge points preselected by the Canny detector are grouped into object candidates [26]. In the parameter space, the sought line is represented by the Hesse normal form [5]:

$$r = x\cos\Theta + y\sin\Theta, \qquad (8)$$

where $r$ is its algebraic distance from the origin and $\Theta \in [0, \pi]$ is the angle of its normal. Therefore, every line in the original plane corresponds to a unique point in the parameter space, and the curves corresponding to co-linear figure points have a common point of intersection. Since the needle appears as a long object, all regions of interest (see Section 2.2) are subjected to the Hough transform [5]. The longest line in each ROI defines the needle trajectory. False positives appearing in the US images acquired at the beginning of the needle insertion procedure are reduced by the analysis of subsequent scans. Finally, the needle trajectory area is described as

$$y = ax + b + c, \qquad (9)$$

where $a$ and $b$ are the parameters of the estimated trajectory and the constant $c = \pm 2$ defines the margin for further analysis. An exemplary needle trajectory area resulting from this step is shown in Figure 4a.
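A possible realization of this cascade, sketched with OpenCV's Canny detector and, for brevity, the probabilistic Hough variant, which directly returns segments whose lengths can be compared; all thresholds are illustrative assumptions, not values from the paper, and an 8-bit grey-scale ROI is assumed.

```python
import cv2
import numpy as np

def needle_trajectory(roi):
    """Return (a, b) of the longest line y = a*x + b found in the ROI."""
    edges = cv2.Canny(roi.astype(np.uint8), 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=30,
                            minLineLength=40, maxLineGap=5)
    if lines is None:
        return None
    # pick the longest segment, per the text
    x1, y1, x2, y2 = max(lines[:, 0, :],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    a = (y2 - y1) / (x2 - x1 + 1e-9)
    b = y1 - a * x1
    return a, b   # the margin band of eq. (9) is then y = a*x + b ± 2
```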
Figure 4: a) The needle trajectory area, b) the corresponding Gabor filter mask and Gabor filtering results

2.4.3. Needle Tip Detection

The inserted needle appears as a linear object of specified thickness and angle, usually brighter than its background. However, due to the low US image quality and low signal-to-noise ratio, a robust line detector is required. In comparison to other techniques, such as the Sobel and Canny edge detectors or the Laplacian, Gabor filters are described as featuring optimal localization properties in both the spatial and the frequency domain. Due to its attractive capabilities in needle detection described in [2, 9], a Gabor filtering procedure followed by eigenvalue analysis is applied. A discriminative model of the needle is constructed on the basis of the filter response in the direction estimated in the previous steps. The Gabor filter is defined as [27]:
$$g_{\lambda,\gamma,\Theta,\sigma,a,b}(x, y) = \exp\left(-0.5\left(\frac{\Theta_x^2}{\sigma_x^2} + \frac{\Theta_y^2}{\sigma_y^2}\right)\right)\cos\left(\frac{2\pi\Theta_x}{\lambda}\right), \qquad (10)$$

where $\Theta_x = x\cos\Theta + y\sin\Theta$, $\Theta_y = -x\sin\Theta + y\cos\Theta$,

$$\sigma_x = \frac{\lambda}{\pi}\sqrt{\frac{\ln 2}{2}}\,\frac{2^b + 1}{2^b - 1}, \qquad \sigma_y = \frac{\sigma_x}{\gamma}, \qquad (11)$$

$\lambda$ is the wavelength of the cosine factor of the Gabor filter kernel, $b$ is the spatial frequency bandwidth, $\Theta = \arctan(a) - \frac{\pi}{2}$ is the orientation of the normal to the parallel stripes of the Gabor function, calculated using the Hough transform results ($a$ in eq. (9)), and $\gamma$ is the spatial aspect ratio of the filter. The Gabor filter kernel with the corresponding filtering result is shown in Figure 4b.
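A direct sketch of eqs. (10)-(11): the kernel is oriented by the slope $a$ from eq. (9). The kernel support and the default parameter values are our assumptions.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(lam, gamma, a, b=1.0, half=15):
    """Gabor kernel of eqs. (10)-(11), oriented along the needle trajectory."""
    theta = np.arctan(a) - np.pi / 2            # normal to the needle stripes
    sigma_x = (lam / np.pi) * np.sqrt(np.log(2) / 2) * (2**b + 1) / (2**b - 1)
    sigma_y = sigma_x / gamma
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xt = x * np.cos(theta) + y * np.sin(theta)   # Theta_x
    yt = -x * np.sin(theta) + y * np.cos(theta)  # Theta_y
    g = np.exp(-0.5 * (xt**2 / sigma_x**2 + yt**2 / sigma_y**2))
    return g * np.cos(2 * np.pi * xt / lam)      # eq. (10)

def gabor_response(img, lam=8.0, gamma=0.5, a=0.7):
    """Filter response of a grey-scale image (illustrative parameter values)."""
    return convolve2d(img.astype(float), gabor_kernel(lam, gamma, a),
                      mode='same', boundary='symm')
```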
Since the needle tip is to be tracked, the filter response is processed by weighted fuzzy c-means clustering [21]. Subsequently, the minimum eigenvalue algorithm [18] is incorporated to describe the tracking point. It is based on the Harris corner detector [28], improving the detection ability by an appropriate selection criterion. The Harris corner technique is based on the score value $R$ calculated for each image pixel using the two eigenvalues $\lambda_1$ and $\lambda_2$ of the analysed region:

$$R = \lambda_1\lambda_2 - k(\lambda_1 + \lambda_2)^2, \qquad (12)$$

where $k$ is a constant value that makes it possible to manipulate the score. In the Shi-Tomasi modification, the score value $R$ is given as

$$R = \min(\lambda_1, \lambda_2). \qquad (13)$$

In both techniques a point is marked as a corner if its $R$ value is greater than a certain predefined threshold.
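The selection rule of eq. (13) is available off the shelf; the sketch below uses OpenCV's goodFeaturesToTrack in its minimum-eigenvalue mode, restricted to the trajectory band, with illustrative parameter values (not the paper's settings).

```python
import cv2
import numpy as np

def needle_tip_candidates(gabor_response, trajectory_mask, max_pts=5):
    """Shi-Tomasi candidates (eq. (13)) inside the trajectory band of eq. (9)."""
    img = cv2.normalize(gabor_response, None, 0, 255,
                        cv2.NORM_MINMAX).astype(np.uint8)
    pts = cv2.goodFeaturesToTrack(img, maxCorners=max_pts, qualityLevel=0.05,
                                  minDistance=5,
                                  mask=trajectory_mask.astype(np.uint8),
                                  useHarrisDetector=False)  # min-eigenvalue score
    return None if pts is None else pts.reshape(-1, 2)      # (x, y) pairs
```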
2.4.4. Needle Tip Tracking

The needle tip tracking step focuses on a single point tracking technique. Considering the future implementation of the developed methodology in a real-time application, a low computational cost optical flow technique is needed. The Lucas-Kanade [29] approach benefits from the concept of the least squares method for solving the optical flow equation, being thereby faster than e.g. the Horn-Schunck implementation [30]. The finally chosen Kanade-Lucas-Tomasi (KLT) algorithm [17] additionally introduces the Harris corner features to guarantee low error sensitivity, and it can be briefly formulated as follows [31]. Let $I$ and $J$ be two consecutive images of the analysed video and $u = [u_x, u_y]^T$ a point of $I$. We are searching for $v = [v_x, v_y]^T$ belonging to $J$ and being similar to $u$:

$$v = u + d = [u_x + d_x, u_y + d_y]^T, \qquad (14)$$

where $d = [d_x, d_y]^T$ represents the optical flow at point $u$. The problem of determining the motion parameters in a window of size $(2w_x + 1) \times (2w_y + 1)$ is to find $d$ that minimizes the dissimilarity

$$\epsilon(d) = \sum_{x=u_x-w_x}^{u_x+w_x} \sum_{y=u_y-w_y}^{u_y+w_y} \left(I(x, y) - J(x + d_x, y + d_y)\right)^2. \qquad (15)$$

To avoid the problem of missing the tracked point in the case of a big displacement, the pyramid representation of the KLT algorithm is applied. In the pyramidal notation, the error function at level $L$ of the pyramid can be formulated as

$$\epsilon^L(d^L) = \sum_{x=u_x^L-w_x}^{u_x^L+w_x} \sum_{y=u_y^L-w_y}^{u_y^L+w_y} \left(I^L(x, y) - J^L(x + g_x^L + d_x^L, y + g_y^L + d_y^L)\right)^2, \qquad (16)$$
Figure 5: Exemplary tracking results for a porcine phantom. The subsequent images are selected US scans recorded during needle insertion.
where $g^L = [g_x^L, g_y^L]^T$ is the initial guess at level $L$ used to pre-transform image $I$ into $J$, and the initial guess at the deepest level $L_m$ is equal to $g^{L_m} = [0, 0]^T$. The goal is then to find the vector $d^L$ minimizing the value $\epsilon^L$. Detailed information concerning the optimization problem can be found in [17] or [31]. In the developed flow chart of the needle tip tracking technique, the pyramidal KLT algorithm combined with the minimum eigenvalue based feature extraction method is used. This combination leads to the tracking results shown in Figures 5 and 6.
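A sketch of the resulting tracking loop using OpenCV's pyramidal Lucas-Kanade implementation of eqs. (14)-(16); the window size, pyramid depth and termination criteria are illustrative assumptions, and 8-bit grey-scale frames are assumed.

```python
import cv2
import numpy as np

def track_tip(frames, tip_xy, win=21, levels=3):
    """Track a single needle-tip point through a list of grey-scale frames."""
    p = np.array([[tip_xy]], dtype=np.float32)       # shape (1, 1, 2)
    path = [tuple(tip_xy)]
    for prev, curr in zip(frames[:-1], frames[1:]):
        p_new, status, _ = cv2.calcOpticalFlowPyrLK(
            prev, curr, p, None,
            winSize=(win, win), maxLevel=levels,
            criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
        if status[0, 0] == 0:
            break                 # point lost; recovery strategy: see Discussion
        p = p_new
        path.append(tuple(p[0, 0]))
    return path
```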
Figure 6: Exemplary tracking results for a jelly phantom. The subsequent images are selected US scans recorded during needle insertion (top) and removal (bottom).

3. Stereoscopic Navigation System

As a verification tool, the stereoscopic navigation system Polaris Vicra is employed. The system is capable of computing the positions of dedicated markers with very high accuracy (error below 0.2 mm) and a frame rate of up to 60 Hz. A single position of the marker in the system global coordinate frame G is given as a transformation matrix $M$, which defines both the position (translation) and the orientation (rotation) of the marker (Figure 7). Similarly, the global position of any point whose location relative to the marker's local coordinate frame L is known is computed by multiplying the marker transformation matrix by the local position of the point:

$$\begin{bmatrix} x_G \\ y_G \\ z_G \\ 1 \end{bmatrix} = M_L^G \begin{bmatrix} x_L \\ y_L \\ z_L \\ 1 \end{bmatrix}. \qquad (17)$$

Figure 7: Navigation system coordinate frames (G – global, L – local marker on the biopsy gun, U – US probe, T – the tip of the needle) and their interrelations (red arrows). $M_L^G$ and $M_U^G$ indicate the transformations between the global coordinate frame and the markers L and U, respectively. $M_T^L$ indicates the transformation between the needle tip and its marker.
The positions are expressed in homogeneous coordinates. The local position of the point (i.e. its position relative to the marker local coordinate frame) is computed during the calibration procedure [32]. In order to calibrate the tip of the biopsy needle, one can obtain its local position ($M_T^L$ in Figure 7) by pivoting the whole needle around the tip fixed to the ground. During this process, the position of the tip is invariant with respect to the G and L coordinate frames. This assumption allows the needle calibration to be completed: after acquiring several dozen marker positions, the resulting overdetermined equation system [33] is solved. To calibrate the probe, i.e. to obtain the transformation between the US image and the marker attached to the probe, an N-wire phantom is used, as described in [34].
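The pivot step admits a compact least-squares formulation; the sketch below is our generic version of the overdetermined system (not the code of [33]): each acquired marker pose $(R_i, t_i)$ must map the unknown local tip offset $p_L$ to the same fixed global tip $p_G$, i.e. $R_i p_L + t_i = p_G$.

```python
import numpy as np

def pivot_calibrate(rotations, translations):
    """rotations: list of 3x3 matrices R_i; translations: list of 3-vectors t_i.
    Returns (p_L, p_G): the tip in marker-local and in global coordinates."""
    A_rows, b_rows = [], []
    for R, t in zip(rotations, translations):
        A_rows.append(np.hstack([R, -np.eye(3)]))   # [R_i | -I] [p_L; p_G] = -t_i
        b_rows.append(-np.asarray(t, dtype=float))
    A = np.vstack(A_rows)                           # (3N, 6)
    b = np.concatenate(b_rows)                      # (3N,)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]                             # p_L, p_G
```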
The accuracy of the probe calibration affects the final results; therefore, this procedure ought to be performed as precisely as possible. Our calibration error does not exceed 2 mm. The accuracy of the stereoscopic camera is very high as long as the markers are fixed to rigid bodies. If any deformations occur, significant errors might appear [35]. The core biopsy instrument (the so-called "biopsy gun") consists of the handle and the needle, which are not connected very stiffly, so the needle deviates within a certain range (red cone in Figure 8). In particular, the bending of the needle increases if the instrument is inserted into thick tissue while the physician presses the handle heavily.
3.1. Core biopsy instrument calibration

Because of the needle base flexibility, the calibration of the biopsy gun described above cannot be performed properly, as the position of the tip with respect to the marker is not invariant. We modified the process by adding an alternative, rigid stylus tool with a marker attached to it. The tip of the stylus is placed exactly on the needle tip. Then, a circular movement of the stylus is executed and the transformation matrices between both marker coordinate frames are acquired. This allows the needle tip position relative to the marker attached to the handle to be found.

4. Results
The experimental set-up is shown in Figure 9. The US images have been acquired at a resolution of [288 × 352] pixels and a spacing of 0.13 × 0.13 mm. The inserted needle length reached 65 mm. The two described paths have been evaluated on 364 and 724 images, respectively, from 12 biopsy procedure series, including ca. 90 frames per biopsy. The algorithm is implemented in MATLAB, release R2014b. The evaluation has been done on a PC with an Intel Core i7-2600 CPU and 16 GB of RAM running the MS Windows 7 x64 operating system, and the statistical analysis has been performed using the "R" software.

4.1. Evaluation

The obtained detection and trajectory estimation results (path 1 in Figure 1) were compared with the manual needle delineations performed by three experts (doctors with long-term experience in US image analysis) on each of the 364 US scans. The trajectories of the needle delineated by the experts were estimated according to the algorithm described in Section 2.3.1. The maximal errors obtained in the inter-observer experiments are presented in Table 1. The mean relative error (calculated over all pairs of experts) was equal to 7.5%, whereas the mean absolute error was equal to 3.3 deg. The maximal errors of the automated trajectory estimation algorithm calculated in relation to all the experts are given in Table 2. The
Figure 8: 3D visualisation of the biopsy procedure. The model positions are obtained from the navigation system. The red spherical sector indicates the possible needle tip locations.

Figure 9: Experimental set-up: the turkey phantom, the US probe and the biopsy gun with the Polaris markers attached to them.
mean relative error of the automated trajectory estimation algorithm calculated in relation to all the experts was equal to 11.6%, and the mean absolute error was 5.3 deg.

Table 1: The maximal errors of trajectory estimation in the inter-observer experiments

                          relative error    absolute error (in degrees)
Expert 1 vs Expert 2      6.8%              3
Expert 1 vs Expert 3      4.2%              2
Expert 2 vs Expert 3      11.4%             5
Mean                      7.5%              3.3
The validation procedure of the tracking technique (path 2 in Figure 1) was performed using the stereoscopic navigation system. Due to the limitations of stereoscopic tracking applied to the biopsy procedure, described in Section 3, the prepared experiments followed the scenario described below. The trajectory of the visible needle was calculated for all of the analysed 724 US images according to the algorithm described in Section 2.4.2. For the same set of data, a sphere with the centre located at the needle origin and a radius equal to the needle's length was defined (red cone in Figure 8). The ground truth for the needle tip tracking step was the intersection of the calculated needle trajectory and the sphere (Figure 10). The accuracy of the tracking step was calculated for each US image as the distance between the previously obtained point of intersection and the point tracked in the image. The initial and trailing US frames, just before and at the end of the biopsy procedure, were rejected from the analysis. We compared these distances (denoted as "Dist. 1" in Table 3) with the results yielded by the Polaris navigation system when the biopsy gun was treated as a rigid body ("Dist. 2" in Table 3). For the meat phantoms, the t-test showed a statistically significant difference between the distances (p < 10^-4): the Dist. 2 values exceeded the distances obtained with the ground truth method. The same result was obtained when the test was performed on the jelly phantoms.

Table 2: The maximal errors of the automated trajectory estimation algorithm calculated in relation to all the experts

                                relative error    absolute error (in degrees)
Expert 1 vs automated method    10.6%             5
Expert 2 vs automated method    18.2%             8
Expert 3 vs automated method    6.1%              3
Mean                            11.6%             5.3
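The ground-truth point described above can be obtained with a generic line-sphere intersection, sketched below; the coordinate conventions and names are our assumptions, not the authors' implementation.

```python
import numpy as np

def trajectory_sphere_intersection(p0, direction, centre, radius):
    """Intersection of the line p0 + s*direction with the sphere
    |x - centre| = radius, taken along the insertion direction; None if the
    line misses the sphere."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    m = np.asarray(p0, float) - np.asarray(centre, float)
    # |m + s d|^2 = r^2  ->  s^2 + 2 (m.d) s + (|m|^2 - r^2) = 0
    bh = m.dot(d)
    disc = bh**2 - (m.dot(m) - radius**2)
    if disc < 0:
        return None
    s = -bh + np.sqrt(disc)
    return np.asarray(p0, float) + s * d
```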
Since none of the analysed datasets followed a normal distribution (Shapiro-Wilk test, p < 10^-4 in all cases), the Ansari-Bradley test was employed. Equal dispersion of the data was confirmed (p = 0.07635 for the meat phantoms and p = 0.3698 for the jelly phantoms). Hence, for both phantom types, Mann-Whitney U tests were performed, which proved the statistically significant differences between the distances Dist. 1 and Dist. 2, in both cases favouring the former.
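For illustration, this test chain can be reproduced with SciPy stand-ins for the "R" routines (our sketch; dist1/dist2 would hold the per-frame distances of Table 3).

```python
from scipy import stats

def compare_distances(dist1, dist2, alpha=0.05):
    """Compare two distance samples as in the text: Shapiro-Wilk normality
    check, then either a t-test or Ansari-Bradley + one-sided Mann-Whitney U."""
    _, p1 = stats.shapiro(dist1)
    _, p2 = stats.shapiro(dist2)
    if min(p1, p2) < alpha:                      # not normally distributed
        _, p_disp = stats.ansari(dist1, dist2)   # equal-dispersion check
        _, p_diff = stats.mannwhitneyu(dist1, dist2, alternative='less')
        return {'dispersion_p': p_disp, 'difference_p': p_diff}
    _, p_diff = stats.ttest_ind(dist1, dist2)
    return {'difference_p': p_diff}
```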
Table 3: Distributions of distances between the tracked point and the ground truth (Dist. 1), and between the tracked point and the tip positions yielded by the navigation system (Dist. 2)

            Meat phantoms             Jelly phantoms
            Dist. 1    Dist. 2        Dist. 1     Dist. 2
Min.        0.031      4.334          0.09313     1.099
1st Qu.     2.932      5.642          1.67412     2.398
Median      4.100      8.958          2.60997     3.580
Mean        4.384      9.415          2.79551     3.888
3rd Qu.     5.386      13.983         3.72879     4.400
Max.        9.122      15.153         11.11575    14.910

5. Discussion
Since biopsy needle tracking is an important issue in real-time applications, the problem of the computational complexity of the algorithm should also be tackled. It can be divided into three separate parts featuring various problems. The most complex part, common to both processing paths, is the HoG feature extraction. As reported in [36], these robust and strong features, achieving good accuracy in shape description, are a reasonable trade-off between detection efficiency and complexity compared to alternative, richer features such as SIFT (Scale-Invariant Feature Transform) or SURF (Speeded-Up Robust Features). In the literature [36, 37], the problem of the HoG computational complexity in real-time systems is solved by implementing it on CPUs, GPUs and, finally, multiply-and-accumulate (MAC) units. The second important algorithm, constituting the pre-processing part of the needle tip tracking technique, is the Gabor filter followed by the WFCM method. The computational complexity of the filtering procedure is O(M²N²), where M and N denote the sizes of the filter and the image, respectively. As reported in [38], it can be reduced to O(MN²) by using separate 1-dimensional filters for rows and columns.
Figure 10: The sphere obtained from the navigation system (red line), imprinted in an exemplary US image
procedure is O(N 2 ) and for the applied WFCM method it is reduced to O(N). Both the preprocessing algorithms are faster, if the analysis is reduced to the trajectory area (see Fig. 4). The last part of the tip tracking technique based on previously proceeded data is very efficient and the analysis of a single US frame lasts approx. 4.9 ms. Therefore, the proposed method is suitable to be applied in real-time applications. The obtained accuracy is similar to other present results, e.g. Marinetto et al. claimed accuracy of tracked US and CBCT registration equal to 2.1–3 mm [39]. Point reconstruction accuracy (PRA) of US image tracked by an optical stereoscopic device is equal to 1.0 ± 0.12 mm [40]. The developed technique outperforms the existing approaches by targeting in the analysis of both the high quality US images as well as US data with hardly visible needle. Moreover, it automatically detects the moment of needle insertion in the recorded US examination. The main limitation of our technique, whose solution is one of our plans for a future work, is the problem with an automatic determination of the US series type. Currently, after detecting the needle area the user assesses the level of the needle visibility and selects the appropriate path. Moreover, the needle tip tracking algorithm will be improved 18
to handle the loss of the tracking point in a single image. This is currently solved by searching for the nearest neighbouring point while preserving its velocity of motion.
6. Conclusion

The paper presents a fully automated algorithm of needle detection and tracking for computer assisted core biopsy procedures. The developed methodology takes into consideration images with both partially and well visible needles, as met in US practice. Based on this observation, two separate scenarios with common initial analysis steps are proposed. The introduced evaluation step proves the efficiency of the developed technique and encourages further clinical tests. Although stereoscopic navigation systems feature very high precision and are often used as reference tracking systems [1], their usefulness is proven only if the markers are attached to rigid-body tools. If a shift occurs between the marker and any tracked point, the navigation system fails unless further improvements are introduced. The development of the needle tip tracking procedure requires additional assumptions (an invariant needle angle, restricted flexibility of the needle). The obtained accuracy ranged from 3 to 5 mm. This is a satisfying result, given that the US image itself is navigated with 2 mm accuracy. Significantly higher errors are obtained if the navigation system is not supported by the tracking algorithm. Therefore, the stereoscopic navigation system cannot act alone as a reference tracking tool for core biopsy procedures. The software implementing the algorithms described in this paper is employed as a part of a breast cancer biopsy navigation system.

Acknowledgements
This research was supported by the Polish National Centre for Research and Development (Narodowe Centrum Badań i Rozwoju), grant No. STRATEGMED2/267398/4/NCBR/2015. The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The research was approved by the ethics committee of The Maria Skłodowska-Curie Memorial Cancer Centre and Institute of Oncology in Warsaw (Poland).

Conflict of Interest Statement

The authors certify that they have NO affiliations with or involvement in any organization or entity with any financial interest (such as honoraria; educational grants; participation in speakers' bureaus; membership, employment, consultancies, stock ownership, or other equity interest; and expert testimony or patent-licensing arrangements), or non-financial interest (such as personal or professional relationships, affiliations, knowledge or beliefs) in the subject matter or materials discussed in this manuscript.
References
[1] V. Harish, T. Ungi, A. Lasso, A. MacDonald, S. Nanji, G. Fichtinger, Intraoperative visualization and assessment of electromagnetic tracking error, in: Proc. SPIE, Vol. 9415, 2015, pp. 94152H–94152H–6. doi:10.1117/12.2082330.
[2] M. Kaya, O. Bebek, Needle localization using gabor filtering in 2d ultrasound images, in: 2014 IEEE International Conference on Robotics and Automation (ICRA), Institute of Electrical and Electronics Engineers (IEEE), 2014, pp. 4881–4886. doi:10.1109/icra.2014.6907574.
[3] P. Beigi, R. Rohling, T. Salcudean, V. A. Lessoway, G. C. Ng, Detection of an invisible needle in ultrasound using a probabilistic SVM and time-domain features, Ultrasonics (2017). doi:10.1016/j.ultras.2017.02.010.
[4] Z. Wei, L. Gardi, D. Downey, A. Fenster, Oblique needle segmentation for 3d TRUS-guided robot-aided transperineal prostate brachytherapy, in: 2nd IEEE International Symposium on Biomedical Imaging: Macro to Nano (IEEE Cat No. 04EX821), Institute of Electrical and Electronics Engineers (IEEE), 2004, pp. 960– 963. doi:10.1109/isbi.2004.1398699.
[5] R. O. Duda, P. E. Hart, Use of the Hough transformation to detect lines and curves in pictures, Communications of the ACM 15 (1) (1972) 11–15. doi:10.1145/361237.361242.

[6] M. Ding, A. Fenster, A real-time biopsy needle segmentation technique using Hough transform, Medical Physics 30 (8) (2003) 2222–2233. doi:10.1118/1.1591192.

[7] W. Qiu, M. Ding, M. Yuchi, Needle segmentation using 3D quick randomized Hough transform, in: 2008 First International Conference on Intelligent Networks and Intelligent Systems, Institute of Electrical and Electronics Engineers (IEEE), 2008, pp. 449–452. doi:10.1109/icinis.2008.41.

[8] M. Uhercik, J. Kybic, H. Liebgott, C. Cachard, Model fitting using RANSAC for surgical tool localization in 3-D ultrasound images, IEEE Transactions on Biomedical Engineering 57 (8) (2010) 1907–1916. doi:10.1109/TBME.2010.2046416.
[9] A. Pourtaherian, S. Zinger, P. H. N. de With, H. H. M. Korsten, N. Mihajlovic, Gabor-based needle detection and tracking in three-dimensional ultrasound data volumes, in: 2014 IEEE International Conference on Image Processing (ICIP), Institute of Electrical and Electronics Engineers (IEEE), 2014, pp. 3602–3606. doi:10.1109/icip.2014.7025731.
[10] Y. Zhao, H. Liebgott, C. Cachard, Tracking micro tool in a dynamic 3d ultrasound situation using kalman filter and ransac algorithm, in: 2012 9th IEEE International Symposium on Biomedical Imaging (ISBI), 2012, pp. 1076–1079. doi:10.1109/ ISBI.2012.6235745.
[11] P. Chatelain, A. Krupa, M. Marchal, Real-time needle detection and tracking using a visually servoed 3d ultrasound probe, in: 2013 IEEE International Conference on Robotics and Automation, 2013, pp. 1676–1681. doi:10.1109/ICRA.2013. 6630795.
[12] J. Czajkowska, B. Pycinski, E. Pietka, HoG feature based detection of tissue deformations in ultrasound data, in: 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Institute of Electrical & Electronics Engineers (IEEE), 2015, pp. 6326–6329. doi:10.1109/embc.2015. 7319839.
[13] J. Czajkowska, J. Juszczyk, B. Pycinski, E. Pietka, Biopsy needle and tissue deformations detection in elastography supported ultrasound, in: Advances in Intelligent Systems and Computing, Vol. 471, Springer, 2016, pp. 85–96. doi: 10.1007/978-3-319-39796-2_8.
[14] P. F. Felzenszwalb, R. B. Girshick, D. McAllester, D. Ramanan, Object detection with discriminatively trained part-based models, IEEE Transactions on Pattern Analysis and Machine Intelligence 32 (9) (2010) 1627–1645. doi:10.1109/tpami.2009.167.

[15] C. Vondrick, A. Khosla, T. Malisiewicz, A. Torralba, HOGgles: Visualizing Object Detection Features, International Conference on Computer Vision (ICCV) (2013) 1–8.

[16] S. Ghafurian, I. Hacihaliloglu, D. N. Metaxas, V. Tan, K. Li, 3D/2D image registration using weighted histogram of gradient directions, in: R. J. Webster, Z. R. Yaniv (Eds.), Medical Imaging 2015: Image-Guided Procedures, Robotic Interventions, and Modeling, Vol. 9415, SPIE-Intl Soc Optical Eng, 2015, pp. 94151Z–94151Z–7. doi:10.1117/12.2081316.
[17] C. Tomasi, T. Kanade, Detection and tracking of point features, School of Computer Science, Carnegie Mellon Univ. Pittsburgh, 1991.
[18] J. Shi, C. Tomasi, Good features to track, in: 9th IEEE Conference on Computer Vision and Pattern Recognition, Cornell University, Ithaca, NY, USA, 1994, pp. 593– 600.
[19] S. Sultan, G. Shorten, G. Iohom, Simulators for training in ultrasound guided procedures, Medical Ultrasonography 15 (2) (2013) 125–131. doi:10.11152/mu.2013. 2066.152.sfs1gs2.
[20] J. Ophir, I. Cespedes, B. Garra, H. Ponnekanti, Y. Huang, N. Maklad, Elastography: ultrasonic imaging of tissue strain and elastic modulus in vivo, European Journal of Ultrasound 3 (1) (1996) 49–70.
[21] P. Szwarc, J. Kawa, E. Pietka, White matter segmentation from mr images in subjects with brain tumours, in: Information Technologies in Biomedicine, Springer International Publishing, 2012, pp. 36–46. doi:10.1007/978-3-642-31196-3_4.
[22] N. Dalal, B. Triggs, Histograms of oriented gradients for human detection, in: 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 05), Institute of Electrical and Electronics Engineers (IEEE), 2005, pp. 886– 893. doi:10.1109/cvpr.2005.177.
[23] F. Guichard, J.-M. Morel, A note on two classical shock filters and their asymptotics, in: Scale-Space and Morphology in Computer Vision, Springer Nature, 2001, pp. 75–84. doi:10.1007/3-540-47778-0_7.

[24] S. Osher, L. I. Rudin, Feature-oriented image enhancement using shock filters, SIAM Journal on Numerical Analysis 27 (4) (1990) 919–940. doi:10.1137/0727053.

[25] J. Canny, A computational approach to edge detection, IEEE Transactions on Pattern Analysis and Machine Intelligence PAMI-8 (6) (1986) 679–698. doi:10.1109/tpami.1986.4767851.

[26] G. Stockman, L. G. Shapiro, Computer Vision, 1st Edition, Prentice Hall PTR, Upper Saddle River, NJ, USA, 2001.

[27] N. Petkov, P. Kruizinga, Computational models of visual neurons specialised in the detection of periodic and aperiodic oriented visual stimuli: bar and grating cells, Biological Cybernetics 76 (2) (1997) 83–96. doi:10.1007/s004220050323.
[28] C. Harris, M. Stephens, A combined corner and edge detector, in: In Proc. of Fourth Alvey Vision Conference, 1988, pp. 147–151.
[29] B. D. Lucas, T. Kanade, An iterative image registration technique with an application to stereo vision, in: Proceedings of the 7th International Joint Conference on Artificial Intelligence - Volume 2, IJCAI’81, Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, 1981, pp. 674–679. URL http://dl.acm.org/citation.cfm?id=1623264.1623280
[30] S. D. Thota, K. S. Vemulapalli, K. Chintalapati, P. S. S. Gudipudi, Comparison between the optical flow computational techniques, International Journal of Engineering Trends and Technology 4 (10).
[31] J.-Y. Bouguet, Pyramidal implementation of the Lucas Kanade feature tracker, Intel Corporation, Microprocessor Research Labs, 2000.
[32] B. Pycinski, J. Czajkowska, P. Badura, J. Juszczyk, E. Pietka, Time-of-flight camera, optical tracker and computed tomography in pairwise data registration, PLoS ONE 11 (7) (2016) e0159493. doi:10.1371/journal.pone.0159493.
[33] B. Pyciński, Estimation of pointer calibration error in optical tracking system, in: M. Gzik, E. Tkacz, Z. Paszenda, E. Piętka (Eds.), Innovations in Biomedical Engineering, Springer International Publishing, 2017, pp. 228–239. doi:10.1007/978-3-319-47154-9_27.
[34] G. Carbajal, A. Lasso, A. Gómez, G. Fichtinger, Improving N-wire phantom-based freehand ultrasound calibration, Int J Comput Assist Radiol Surg 8 (6) (2013) 1063–1072. doi:10.1007/s11548-013-0904-9.

[35] J. Juszczyk, J. Czajkowska, B. Pycinski, E. Pietka, ToF-data-based modelling of skin surface deformation, in: Advances in Intelligent Systems and Computing, Vol. 472, Springer Nature, 2016, pp. 235–244. doi:10.1007/978-3-319-39904-1_21.

[36] A. Suleiman, V. Sze, An energy-efficient hardware implementation of HOG-based object detection at 1080HD 60 fps with multi-scale support, J. Signal Process. Syst. 84 (3) (2016) 325–337. doi:10.1007/s11265-015-1080-7.

[37] S. A. L. A. Baghdadi, Enhancement of fast pedestrian detection using HOG features, International Journal of Management and Applied Science (2015) 18–22.
[38] J. Ilonen, J. K. Kamarainen, H. K. Kälviäinen, Efficient Computation of Gabor Features, Tech. rep., Lappeenranta University of Technology, Department of Information Technology (2005).
[39] E. Marinetto, A. Uneri, T. D. Silva, S. Reaungamornrat, W. Zbijewski, A. Sisniega, S. Vogt, G. Kleinszig, J. Pascau, J. Siewerdsen, Integration of free-hand 3D ultrasound and mobile C-arm cone-beam CT: Feasibility and characterization for real-time guidance of needle insertion, Computerized Medical Imaging and Graphics 58 (2017) 13–22. doi:10.1016/j.compmedimag.2017.03.003.
[40] A. Lasso, T. Heffter, A. Rankin, C. Pinter, T. Ungi, G. Fichtinger, PLUS: Opensource toolkit for ultrasound-guided intervention systems, IEEE Transactions on Biomedical Engineering 61 (10) (2014) 2527–2537. doi:10.1109/tbme.2014. 2322864. URL https://doi.org/10.1109/tbme.2014.2322864
Highlights

• The paper presents a novel algorithm for biopsy needle tracking in ultrasound images.
• The HoG feature extraction technique, the Hough transform, the Gabor filter and a single point tracking algorithm are employed in the analysis.
• The performance is not restricted to the needle appearance in the image.
• The evaluation is based on combining a stereoscopic navigation system with the tracking algorithm.