Optics Communications 283 (2010) 3475–3480
Feature tracking for projection registration in laboratory-scale reflective tomography laser radar imaging
Xiaofeng Jin ⁎, Jianfeng Sun, Yi Yan, Yu Zhou, Liren Liu
CAS Key Laboratory of Space Laser Communication and Testing Technology, Shanghai Institute of Optics and Fine Mechanics, P.O. Box 800-211, Shanghai 201800, China
Article info
Article history: Received 10 November 2009; Received in revised form 28 April 2010; Accepted 29 April 2010
Keywords: Reflective tomography; Laser radar imaging; Feature tracking; Projection registration
Abstract
In a range-resolved reflective tomography laser radar imaging system, all projections must be aligned to the center of target rotation for accurate image reconstruction. Because the location of the axis of rotation is unknown and the target translates randomly, the center of target rotation in each projection cannot be measured accurately, and images reconstructed from unregistered range-resolved projections are dislocated. Here, we apply a feature tracking method to align the projection data for tomographic reconstruction in laboratory-scale laser radar imaging. The target was placed on a spin table with an unknown and fixed axis, and the oscillatory translation of the target was simulated with a second-order inertial filter to mimic real satellite vibrations. The experimental simulation results demonstrate that this method effectively improves the quality of the reconstructed images. The application of the method to actual long-range radar imaging is discussed and the differences from the laboratory setup are compared.
© 2010 Elsevier B.V. All rights reserved.
1. Introduction

Space object identification of satellites and other targets of interest is a critical surveillance mission for many countries. Over the past years, laser sensing and imaging technologies have developed rapidly because of their potential applications. Reflective tomography is one of the most effective high-resolution imaging methods. In reflective tomography, laser radar systems can be designed to provide remote sensing of range-resolved, Doppler-resolved, and range-Doppler-resolved signatures. Direct-detection range-resolved reflective tomography can form an image of a target that is angularly unresolved, provided the data are range resolved. The technique is especially applicable to a target that is not rotating fast enough for its Doppler spectrum to be resolved by the detector, and the projection data can be collected with either an incoherent or a coherent system. Because angular resolution is unavailable in range-resolved reflective tomography, multiple viewing directions are necessary to reconstruct an image of the target cross section. The returned signal is reflected off the illuminated outer surface of an opaque target, so only information about the exterior of the target can be obtained, and the image reconstructed with reflective tomography is an outline view of the target cross section. Range-resolved reflective tomography imaging laser radar was first introduced by Parker et al. [1] and Knight et al. [2] at the Massachusetts Institute of Technology. Several years later, Charles L.
⁎ Corresponding author. E-mail address: [email protected] (X. Jin).
doi:10.1016/j.optcom.2010.04.096
Matson [3–6] of the U.S. Air Force began exploring the use of the HI-CLASS coherent laser radar system to obtain reflective tomography images. In range-resolved reflective tomography, because the location of the axis of rotation is unknown and the target translates randomly, the range to the target cannot be measured accurately enough to align the intensity projections to an appropriate center of rotation, which introduces serious artifacts into images reconstructed from the misaligned projections. To overcome this problem, Ford and Matson [7] proposed an iterative phase-retrieval technique that Fourier transforms back and forth between the object and Fourier domains, satisfying the constraints in one domain before returning to the other. It requires on the order of 100 iterations and an interpolation from polar to Cartesian coordinates, which entails a large amount of computation, and additional constraints are needed to apply this phase-retrieval technique to actual laser radar imaging. In this paper, we introduce a method that uses feature tracking to determine the center of rotation in each projection. The relatively stable motion of the target over adjacent views is exploited, and the sinusoidal traces that features on the target surface leave in the projections are tracked. This method improves image reconstruction quality without iteration or interpolation. Compared with the laboratory data, applying the method to an actual radar requires different setups, including the pulse energy and the transmitting and receiving apertures. When targets lie in different diffraction regions, the corresponding wavefronts produce different resolutions in the projections received by the detector; the details are provided in Section 5. The rest of this paper is organized as follows. In Section 2, we present a brief review of reflective tomography. The feature tracking method for projection registration, including feature extraction and tracking, is
contained in Section 3. The experimental simulation is described and the imaging results obtained with the feature tracking method are shown in Section 4. The application to an actual radar and the differences from the laboratory data are provided in Section 5. A conclusion follows in Section 6.

2. Review of reflective tomography

First, a brief review of filtered back-projection [8,9], a standard tomography method, is provided for reflective tomographic reconstruction. Let f(x, y) denote the image to be reconstructed in reflective tomography, and let L_{r,φ} denote the line r = x cos φ + y sin φ (see Fig. 1). Then

p(r, φ) = ∫_{L_{r,φ}} f(x, y) ds,    (1)
where s represents distance along the line L_{r,φ}, p(r, φ) is the projection of the target f(x, y) at angle φ, and r denotes the spatial variable along the integration path in the φ direction. Using back-projection, the estimated image g(x, y) is given by

g(x, y) = Σ_{i=1}^{m} p(x cos φ_i + y sin φ_i, φ_i) Δφ,    (2)
where φ_i is the angle of the i-th projection, Δφ is the angular sampling separation, and m is the total number of projections. Filtered back-projection was introduced to modify the projections and reduce star-like artifacts; the reconstructed image g_FB(x, y) is then given by

g_FB(x, y) = Σ_{i=1}^{m} F₁⁻¹[ c̃ · F₁( p(r, φ_i) ) ] Δφ,    (3)

where F₁ and F₁⁻¹ denote the one-dimensional Fourier transform and inverse Fourier transform operators, r represents the spatial variable, and c̃ is the filter function, the product of a window function and the magnitude of the spatial frequency [2,10,12,14]:

c̃ = |k| w(k),    (4)
where k represents the spatial frequency variable and w(k) is a window function. Other tomographic reconstruction algorithms, such as iterative algorithms [2,11] and interpolation methods [13], can be used to produce similar image results from the projection data. For the experimental data in this paper, we use filtered back-projection exclusively; a short numerical sketch follows.
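As a concrete illustration of Eqs. (2)–(4), the following is a minimal Python/NumPy sketch of filtered back-projection. The array layout, the assumption that each projection is already registered so that the rotation center falls on the central range bin, the choice of one pixel per range bin, and the Hamming window for w(k) are all illustrative choices and not part of the original experiment.

```python
import numpy as np

def filtered_back_projection(proj, phis, grid_size):
    """Minimal filtered back-projection, Eqs. (2)-(4).

    proj      : (num_angles, num_bins) range-resolved projections p(r, phi),
                assumed registered so the rotation center is at bin num_bins//2.
    phis      : (num_angles,) projection angles in radians.
    grid_size : side length in pixels of the reconstructed image g(x, y);
                one pixel is taken equal to one range bin.
    """
    num_angles, num_bins = proj.shape
    k = np.fft.fftfreq(num_bins)                     # spatial frequency samples
    window = np.fft.ifftshift(np.hamming(num_bins))  # w(k): one possible window, peak at k = 0
    c_tilde = np.abs(k) * window                     # filter c~ = |k| w(k), Eq. (4)

    # Filter every projection in the Fourier domain, Eq. (3)
    proj_filt = np.real(np.fft.ifft(np.fft.fft(proj, axis=1) * c_tilde, axis=1))

    # Back-project the filtered projections onto the (x, y) grid, Eq. (2)
    half = grid_size // 2
    x, y = np.meshgrid(np.arange(-half, half), np.arange(-half, half))
    d_phi = 2.0 * np.pi / num_angles                 # angular sampling separation (full circle assumed)
    image = np.zeros((grid_size, grid_size))
    for p_row, phi in zip(proj_filt, phis):
        r = x * np.cos(phi) + y * np.sin(phi)        # r = x cos(phi) + y sin(phi)
        bins = np.clip(np.round(r).astype(int) + num_bins // 2, 0, num_bins - 1)
        image += p_row[bins] * d_phi
    return image
```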
3. Projection registration

In this section, the criteria for feature extraction are first described in Section 3.1, and the mathematical basis of projection registration with the feature tracking method is introduced in Section 3.2.

3.1. Feature extraction

In reflective tomography, a feature should be easy to track reliably from one projection to the next. The precondition of the feature tracking method is therefore that some surfaces of the target have a clearly different reflection coefficient from the other surfaces. Valleys and peaks in the projections correspond to these weaker or stronger scattering regions (see Fig. 2). In this sense, valleys or peaks in the projections can be chosen as features, but not all valleys or peaks are appropriate. The selection criteria are: (a) the full width at half maximum (FWHM) of a feature valley or peak should be limited to one or two sampling separations; slowly varying valleys and peaks are not appropriate; (b) the valley or peak should persist in the projections over adjacent rotation angles so that the corresponding diameter R and angle θ_th can be computed; (c) the valley (peak) value should be clearly lower (higher) than the adjacent values in the projection; valleys and peaks that are small relative to the noise are not appropriate. Note that because the target is opaque, the projection of a single feature can only be observed over a maximum angular extent of 180°, so in theory at least two features are necessary for accurate image reconstruction over the full angular range. Occasionally random noise may be extracted as a feature by mistake; this probability can be reduced by averaging repeated projection measurements taken at the same angle. A peak/valley selection sketch is given below.
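A minimal sketch of criteria (a) and (c), assuming each projection is a 1-D NumPy array and using scipy.signal.find_peaks; the threshold of three times the noise level is an illustrative choice, not a value from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def extract_features(projection, noise_level, max_fwhm_bins=2):
    """Candidate feature selection for one range-resolved projection.

    Criterion (a): width at half height of at most max_fwhm_bins samples
                   (find_peaks measures width at half prominence by default).
    Criterion (c): the peak (valley) must stand clearly above (below) its
                   surroundings; here 'clearly' means a prominence of at
                   least three times the noise level (illustrative threshold).
    """
    peaks, _ = find_peaks(projection,
                          width=(None, max_fwhm_bins),
                          prominence=3.0 * noise_level)
    valleys, _ = find_peaks(-projection,               # valleys are peaks of -p
                            width=(None, max_fwhm_bins),
                            prominence=3.0 * noise_level)
    return peaks, valleys
```

Criterion (b), persistence over adjacent rotation angles, is enforced later when a candidate is tracked from projection to projection, and averaging repeated shots at the same angle before running such a routine reduces the chance that a noise spike is selected.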
3.2. Feature tracking method
From Fig. 2, define feature A as a stronger or weaker scattering region with a clearly different reflection coefficient. Whatever the motion state of the target, the distance R between feature A on the target surface and the center of rotation is constant (see Fig. 3). The dashed circle is the trace of feature A relative to the center as the target rotates through one cycle, and θ_i is the corresponding sampling and back-projection angle.
Fig. 1. Reflective tomography schematic diagram.
Fig. 2. Feature extraction (peak or valley) schematic diagram.
From Fig. 3 we can see that

ΔS_i = R sin(θ_i).    (5)

Fig. 3. The diameter R and distance delay ΔS_i schematic diagram.

The corresponding distance delay ΔS_i between feature A and the center of rotation in the projections follows a sine or cosine variation [8], depending on the definition of the initial angle θ_i and the direction of the incident radiation; in this paper the sine form is used exclusively. Fig. 4 shows the corresponding sinusoidal trace of a feature in the projections when a peak is chosen as the feature.

Fig. 4. Projection traces of feature A in reflective projections.

Because of the inertia of the motion, the target can be assumed to be in a relatively stable state over adjacent angles, so the diameter R and back-projection angle θ_th of feature A can be inferred from the projection data of the feature. The concrete procedure is as follows. Suppose Δt₁ and Δt₂ are the successive time delays of feature A between reflective projections at adjacent angles φ₁, φ₂, φ₃, where φ₂ = φ₁ + Δφ and φ₃ = φ₁ + 2Δφ; ΔL₁ and ΔL₂ are the distance delays corresponding to the time delays Δt₁ and Δt₂; Δφ is the adjacent angular increment; and R is the distance between feature A and the center of target rotation. From Fig. 5,

ΔL₂ / sin α₂ = 2R sin(Δφ/2),    (6)

ΔL₁ / cos α₁ = 2R sin(Δφ/2),    (7)

ΔL₁ = cΔt₁ / 2,    (8)

ΔL₂ = cΔt₂ / 2,    (9)

β₂ = (180° − Δφ)/2 − α₂,    (10)

β₁ = 90° − β₂,    (11)

α₁ = (180° − Δφ)/2 − β₁,    (12)

α₃ = 90° − α₁,    (13)

θ_th = α₄ = (180° − Δφ)/2 − α₃.    (14)

From Eqs. (6)–(14),

tan α₂ = Δt₂ sin Δφ / (Δt₁ − Δt₂ cos Δφ).    (15)

From Eqs. (6) and (15), R and the corresponding back-projection angle θ_th are given by

R = cΔt₂ / { 4 sin(Δφ/2) sin[ arctan( Δt₂ sin Δφ / (Δt₁ − Δt₂ cos Δφ) ) ] },    (16)
Fig. 5. Feature tracking to determine the diameter R and angle θ_th. (a) Global view; (b) magnified view of the adjacent positions.
θ_th = π/2 − 3Δφ/2 − arctan[ Δt₂ sin Δφ / (Δt₁ − Δt₂ cos Δφ) ].    (17)
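As a worked illustration of Eqs. (15)–(17), the short Python sketch below recovers R and θ_th from two successive feature time delays. The function name and the use of arctan2 (rather than arctan, to keep the correct quadrant) are our own choices, not from the paper.

```python
import numpy as np

C = 2.998e8  # speed of light, m/s

def feature_radius_and_angle(dt1, dt2, dphi):
    """Recover R and theta_th from two successive time delays of one feature.

    dt1, dt2 : time delays (s) of the feature between the projections at
               phi1 -> phi2 and phi2 -> phi3, where phi2 = phi1 + dphi.
    dphi     : angular step between adjacent projections, in radians.
    """
    # Eq. (15); arctan2 keeps the correct quadrant of alpha2
    alpha2 = np.arctan2(dt2 * np.sin(dphi), dt1 - dt2 * np.cos(dphi))
    R = C * dt2 / (4.0 * np.sin(dphi / 2.0) * np.sin(alpha2))      # Eq. (16)
    theta_th = np.pi / 2.0 - 1.5 * dphi - alpha2                   # Eq. (17)
    return R, theta_th
```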
Based on the adjacent angles φ₁, φ₂, φ₃ and the angle θ_th, all angles φ_i whose reflective projections contain feature A are assigned

θ_i = φ_i − φ₁ + θ_th,  if ΔL₁ > ΔL₂;
θ_i = φ_i − φ₃ + θ_th,  if ΔL₁ < ΔL₂,    i = 1, 2, …, N.    (18)
After obtaining different estimates of R for the same feature from different sets of adjacent angles, the variance D of R is used as a feedback signal to adjust the offsets of feature A in the corresponding projections, since R is constant for the same feature A:

D = E(R²) − (E(R))²,    (19)
where E denotes the expectation value. Provided the variance of R is suitably small, the corresponding distance delay ΔS_i and time delay Δτ_i between feature A and the center of target rotation in the projections are given by

ΔS_i = R sin(θ_i) = cΔt₂ sin(θ_i) / { 4 sin(Δφ/2) sin[ arctan( Δt₂ sin Δφ / (Δt₁ − Δt₂ cos Δφ) ) ] },    (20)

Δτ_i = 2ΔS_i / c = (1/2) Δt₂ sin(θ_i) / { sin(Δφ/2) sin[ arctan( Δt₂ sin Δφ / (Δt₁ − Δt₂ cos Δφ) ) ] }.    (21)
The ordering of feature A and the center of rotation in a projection is determined by the sign of the distance delay ΔS_i: as in Fig. 3, when ΔS_i is positive the center of rotation lies a distance |ΔS_i| behind feature A in the projection, and otherwise the situation is reversed. Feature A corresponds to a peak or valley in the projection, which is easy to identify. Using the calculated time delay Δτ_i between feature A and the center of target rotation in each projection, we can align all projections that contain feature A to the center.
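The following sketch applies Eqs. (18) and (20) to shift each projection so that the estimated center of rotation lands on the central range bin. Only the ΔL₁ > ΔL₂ branch of Eq. (18) is shown; the array layout, the feature-bin input, and the circular shift are illustrative simplifications, and a practical implementation would also check the variance D of the R estimates (Eq. (19)) before accepting the alignment.

```python
import numpy as np

def align_projections(proj, feature_bins, phis, phi1, R, theta_th, bin_width):
    """Shift projections containing the tracked feature so that the estimated
    center of rotation falls on the central range bin.

    proj         : (num_views, num_bins) misaligned projections.
    feature_bins : range-bin index of feature A (its peak or valley) in each view.
    phis         : angles (rad) of the views in which feature A is visible.
    phi1         : first of the adjacent angles used for tracking (rad).
    R, theta_th  : feature radius and back-projection angle, Eqs. (16)-(17).
    bin_width    : range extent of one bin (7.5e-3 m in the experiment).
    """
    num_bins = proj.shape[1]
    center_bin = num_bins // 2                       # where the rotation center should sit
    aligned = np.empty_like(proj)
    for i, (phi, f_bin) in enumerate(zip(phis, feature_bins)):
        theta_i = phi - phi1 + theta_th              # Eq. (18), branch dL1 > dL2
        dS_i = R * np.sin(theta_i)                   # Eq. (20)
        # positive dS_i: the rotation center lies |dS_i| behind feature A in range
        center_estimate = f_bin + dS_i / bin_width
        shift = int(round(center_bin - center_estimate))
        aligned[i] = np.roll(proj[i], shift)         # circular shift; padding also works
    return aligned
```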
4. Laboratory-scale simulation

The experimental setup used to measure the range-resolved data is shown in Fig. 6. A cone was illuminated by 8 ps, 1064 nm pulses from a Q-switched laser [15]. For convenience in collecting projection data, the repetition rate of the laser was set to 1 Hz during the experiment; in a real-world application, the pulse repetition rate should be chosen according to the angular rate of the target and the number of angular samples. The return energy was distributed in time in proportion to the range extent of the cone. The monostatic waveform was detected and recorded every 50 ps by a Tektronix TDS7000B-series communications signal analyzer for a total time of 25 ns; accounting for the round trip off the target, the corresponding range resolution of the system was 7.5 mm (cΔt/2 with Δt = 50 ps). An adjustable FC collimator was used to couple the returned radiation into a fiber, and a defocusing adjustment guaranteed that the radiation returned from the whole target could be collected. The transmitting and receiving apertures were mounted close together so that the transmitting and receiving paths share nearly the same optical path, which ensures that the target surface points illuminated by the laser are always in the field of view of the optical receiver. The distance between the detector and the target was set to 9.1 m because of the space constraints of the laboratory. The target was a cone (60 cm axial height × 20 cm base diameter) with two stronger scattering regions A and B on its surface. The cross section of the reference screen was constant during the experiment and was used to correct for pulse-to-pulse laser amplitude fluctuations. The target was rotated about an unknown and fixed axis perpendicular to the cone body axis and to the direction of the incident radiation. The projections were sampled in 5° steps from 0° to 360°. A random translation was then applied to each aligned projection. In actual Doppler-time-intensity (DTI) detection, the relative motion of the target manifests itself as an oscillatory curve [16].
Fig. 6. Experimental apparatus used to measure range-resolved projections of a cone.
We assume a slow oscillatory motion between adjacent range-resolved views. The high-frequency components of the raw random numbers are not appropriate for the translation because of the continuity of the target motion, so a second-order inertial filter was applied to adjust the random amounts to accord with realistic satellite vibrations. The process is shown in Fig. 7. Finally, the standard projections were translated
Fig. 8. Misaligned projections.
by moving each projection away from the center of alignment by the filtered random amount (1–15 pixels); see Fig. 8. A numerical sketch of this translation process is given at the end of this section. We can determine the diameter R_A and angle θ_th,A of feature A from adjacent projection data using the feature tracking method of Section 3, and all projections that contain feature A can then be aligned to the center of rotation. Repeating the process for feature B, all projections are aligned to the center. Because of the finite extent of the target, the angular sampling separation, or a large variance D, misalignments of a few pixels remained in the projections after registration. Fig. 9 shows the projection alignment after center registration; the center is at bin 230 (1 bin = 7.5 mm). Using filtered back-projection, the images reconstructed without and with registration are shown in Fig. 10. Although some pixel-level misalignment remained after registration, the cross section of the cone is clearly recognizable in the image reconstructed from the registered projection data. The discontinuous parts of the target outline in the reconstruction are due to the relatively weakly reflecting regions, which are overwhelmed by the other surfaces in the multi-view back-projection process. The streak artifacts around the target are probably introduced by the coarse range and angular resolution; we threshold the image to minimize these artifacts, as can be seen in Fig. 10(d). The size of the reconstructed image matches the real target well. We also collected data for another set of random projection translations to validate the adaptability of the tracking method, and the final reconstructed images were consistent with the former result.
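The sketch below illustrates the translation simulation described above, assuming a second-order Butterworth low-pass as a stand-in for the paper's second-order inertia filter; the cutoff, offset range, and random-number generator are illustrative choices.

```python
import numpy as np
from scipy.signal import butter, lfilter

def simulate_misalignment(proj, max_offset=15, cutoff=0.05, seed=0):
    """Translate ideally aligned projections by slowly varying random offsets.

    Raw uniform random offsets are smoothed by a second-order low-pass filter
    so that the simulated translation changes slowly from view to view, as a
    slow satellite vibration would.
    """
    rng = np.random.default_rng(seed)
    raw = rng.uniform(-max_offset, max_offset, size=proj.shape[0])  # cf. Fig. 7(a)
    b, a = butter(2, cutoff)                  # 2nd-order low-pass, cf. Fig. 7(b)
    offsets = lfilter(b, a, raw)              # smoothed offsets, cf. Fig. 7(c)
    misaligned = np.empty_like(proj)
    for i, off in enumerate(np.round(offsets).astype(int)):
        misaligned[i] = np.roll(proj[i], off)
    return misaligned, offsets
```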
Fig. 7. (a) Random numbers; (b) second-order inertial filter; (c) final random numbers, after applying the inertial filter, used to translate the projections.

Fig. 9. Projection registration using the feature tracking method.

Fig. 10. Images reconstructed from 72 projections. (a) and (b) reconstructions from misaligned projections; (c) reconstruction with projection registration using the feature tracking method; (d) image (c) after threshold processing.

5. Application to the actual radar

Compared with actual radar data, the experimental simulation in Section 4 is a proportionally scaled-down model. The applicability of the tracking method depends on the resolution of the data collected by the detection system: different range resolutions affect the detection and location of the features in the projections. When targets lie in different diffraction regions, the corresponding wavefront curvature produces different resolutions in the projections received by the detector. In the experimental simulation above, a spherical wavefront was used because of the limited laboratory space; for satellite or airplane imaging with an actual radar, a parallel wavefront may be assumed because of the long range. The diffraction limit also varies with the range between the target and the detector, so to attain the same effect the actual transmitting and receiving apertures must be computed carefully. In that case an FC collimator with a small aperture is not applicable, and a telescope system with a large effective aperture should be considered. From diffraction-limit theory, the smallest spatial resolution Δd of the receiver system is

Δd = 1.22 λ L / D,    (22)

where D is the effective receiver aperture, L is the distance between the target and the receiver, and λ is the laser wavelength. Let β be the ratio of the size of the actual target to that of the experimental target. Requiring the same resolution in proportion to the size of the target,

(1.22 λ / D₁) L₁ = β (1.22 λ / D₂) L₂.    (23)

With the actual effective aperture D₁, the optimal experimental effective aperture D₂, the experimental distance L₂, and the proportion β, the actual distance that matches the laboratory-scale detection is

L₁ = β (D₁ / D₂) L₂.    (24)

Suppose a 60-cm circular telescope aperture is used as the actual receiver and the target scale in the actual detection is 15 m; then, compared with the laboratory data, this tracking method should give comparable results when the target is placed at an actual distance of about 18 km. In addition, for long-range detection the transmitted pulse energy is attenuated by atmospheric transmission, so the pulse energy must be chosen to produce observable returns from hundreds of kilometers away. A long-wave infrared atmospheric window can be used to reduce the effect of atmospheric turbulence. Moreover, the tomographic angles are known in the experimental simulation, and views of the object are available over a full 360°; in a real application not all angles can be observed, and the limited number of returns introduces artifacts into the image reconstruction. These artifacts, however, originate in the data themselves and will exist with any projection registration method.
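A small sketch of the scaling rule of Eq. (24). The 60-cm actual aperture, the 15-m target scale, and the 9.1-m laboratory range are taken from the text; the value of β and, in particular, the effective aperture of the laboratory collimator D₂ are assumptions, since the paper does not state D₂, and the value used below is simply one that would be consistent with the quoted ~18 km matched distance.

```python
def matched_actual_distance(D1, D2, L2, beta):
    """Eq. (24): actual range L1 at which a receiver of aperture D1 resolves a
    beta-times-larger target as well, in proportion to target size, as the
    laboratory receiver of aperture D2 does at range L2."""
    return beta * (D1 / D2) * L2

# Illustrative use. D1 = 0.60 m and L2 = 9.1 m come from the text; beta = 75
# assumes the 20-cm base diameter is the reference dimension of the 15-m target,
# and D2 = 0.023 m is an assumed laboratory collimator aperture chosen only so
# that the result reproduces the quoted ~18 km matched distance.
L1 = matched_actual_distance(D1=0.60, D2=0.023, L2=9.1, beta=75.0)
print(f"matched actual distance: {L1 / 1000.0:.1f} km")   # ~17.8 km
```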
6. Summary and conclusions

In conclusion, we have demonstrated projection registration in range-resolved reflective tomography image reconstruction using a feature tracking method, addressing the unknown location of the axis of rotation and the random target translation. At the limited distance of 9.1 m and with a range resolution of 7.5 mm in the projections, the method was shown to work well on simulated misaligned projection data, aligning all projections to the center of rotation, and the quality of the images reconstructed with the filtered back-projection algorithm was effectively improved. Because the target is opaque, several features are necessary for accurate image reconstruction over the full angular extent in actual radar data. We have shown that the method is promising at the laboratory scale and have discussed the differences between the laboratory data and the actual application. However, more research is needed to ensure that it will work for reflective projections with real-world uncertainties. We are in the process of collecting data on airplanes, to which the feature tracking method will be applied. Future research also includes the development of projection registration based on feature tracking for targets with more complicated structures.

Acknowledgements

This work was supported by the National Natural Science Foundation of China under Grant No. 0803371-X00 and the Shanghai Natural Science Foundation of China under Grant No. 09ZR1435300.

References

[1] J.K. Parker, E.B. Craig, D.I. Klick, S.R. Kulkarni, R.M. Marino, J.R. Senning, B.K. Tussey, Applied Optics 27 (1988) 2642.
[2] F.K. Knight, D. Klick, D.P. Ryan-Howard, J.R. Theriault, B.K. Tussey, A.M. Beckman, Applied Optics 28 (1989) 2196.
[3] E.P. Magee, C.L. Matson, D.H. Stone, Proc. SPIE 2302, Image Reconstruction and Restoration, 1994, p. 95.
[4] C.L. Matson, E.P. Magee, D.E. Holland, Optical Engineering 34 (1995) 2811.
[5] J.B. Lasche, C.L. Matson, S.D. Ford, W.L. Thweatt, K.B. Rowland, V.N. Benham, Proc. SPIE 3815 (1999) 178.
[6] C.L. Matson, D.E. Mosley, Applied Optics 40 (2001) 2290.
[7] S.D. Ford, C.L. Matson, Proc. SPIE Conference on Digital Image Recovery and Synthesis IV, 1999, p. 189.
[8] R.M. Marino, R.N. Capes, W.E. Keicher, S.R. Kulkarni, J.K. Parker, L.W. Swezey, Proc. SPIE 999, Laser Radar III, 1988, p. 248.
[9] A.C. Kak, M. Slaney, Principles of Computerized Tomographic Imaging, IEEE Press, New York, 1987, pp. 297–322.
[10] G.T. Herman, Image reconstruction from projections, Real-Time Imaging 1 (1995) 3.
[11] Y. Das, W.M. Boerner, IEEE Transactions on Antennas and Propagation AP-26 (1978) 274.
[12] Zhang Dong, Dong Xiufen, Acta Acoustica 20 (1995) 271.
[13] H. Stark, J.W. Woods, I. Paul, IEEE Transactions on Acoustics, Speech, and Signal Processing ASSP-29 (1981) 237.
[14] Y.S. Kwoh, I.S. Reed, T.K. Truong, IEEE Transactions on Nuclear Science NS-24 (1977) 1990.
[15] Jianfeng Sun, Xiaofeng Jin, Liren Liu, Proc. SPIE 7419 (2009).
[16] K.I. Schultz, S. Fisher, Ground-based laser radar measurements of satellite vibrations, Applied Optics 31 (1992) 7690.