Displacement measurement for color images by joint transform correlator

Tao Liu a,b, Peng Ge a, Qi Li a,*, Huajun Feng a, Zhihai Xu a

a State Key Lab of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310027, China
b College of Optical and Electronic Technology, China JiLiang University, Hangzhou 310018, China
* Corresponding author. E-mail address: [email protected] (Q. Li).

Article history: Received 19 June 2011; Accepted 15 November 2011

Keywords: Image processing; Displacement measurement; Color image; Joint transform correlator

Abstract

The joint transform correlator (JTC) can be used for motion detection in gray-scale images. Here, motion measurement for color images based on the JTC is proposed. The color images are decomposed into their red, green and blue channels, and new monochromatic images are formed by a special spatial arrangement of the three channels. The displacement of the color images is then calculated with a JTC operating on the new monochromatic target and reference images. Experimental results are presented. The technique provides fast and precise displacement detection for color images.

1. Introduction

The joint transform correlator (JTC) has been widely used in pattern recognition and motion tracking since Weaver and Goodman proposed it in 1966 [1]. Janschek et al. proposed using a JTC to measure the shift between the imaging device and the principal axis in order to restore blurred images in real time [2]; they used the JTC to find the correlation peaks related to the displacement vector. Yi et al. proposed a hybrid opto-digital JTC using one optical transform and digital up-sampling, with which the root-mean-square error (RMSE) of the motion measurement could be kept below 0.05 pixels [3]. We previously proposed image displacement measurement using a phase-encoded-target JTC [4]. However, these studies deal only with monochromatic images; even when the measured images are colored, they simply convert them to gray-scale images. In fact, most visual signals are chromatic, and not only the shapes but also the colors of the input images are indispensable characteristics for pattern recognition and image registration. Extending the JTC to displacement measurement for color inputs is therefore a necessary task.

In this paper we propose a displacement measurement method for color images based on the joint transform correlator. The color reference and color target images are decomposed into RGB channels, new reference and target images are formed by a special spatial arrangement, and the two are displayed side by side on an input plane. The input plane is gray scale, so real-time operation is possible with amplitude spatial light modulators (SLMs).

The correlation output is obtained after two optical Fourier transforms, and the displacement between the color reference and the color target images is obtained by measuring the positions of the cross-correlation peaks. Owing to the speed of optical computation, it is estimated that at least thousands of correlations per second can be reached [5].

2. Methods

For a color reference image, we decompose it into the components r_R, r_G and r_B, which represent the red, green and blue channels respectively. The new monochromatic reference image is formed by specially arranging the three components on the left part of an input SLM, as shown in Fig. 1:

r(x, y) = r_R(x − a, y − b) + r_G(x − a, y) + r_B(x − a, y + b)    (1)

Here a and b in Eq. (1) are constants. The color target image has a displacement (x_i, y_i) relative to the reference image, so the new monochromatic target image is displayed on the right part of the input SLM, as shown in Fig. 1, and is expressed as

t(x, y) = t_R(x + a + x_i, y − b + y_i) + t_G(x + a + x_i, y + y_i) + t_B(x + a + x_i, y + b + y_i)    (2)

where t_R, t_G and t_B are the red, green and blue components of the color target image. The joint input image f(x, y), containing the new monochromatic target and reference images, is displayed on the SLM and expressed as

f(x, y) = r(x, y) + t(x, y)
= [r_R(x − a, y − b) + r_G(x − a, y) + r_B(x − a, y + b)] + [t_R(x + a + x_i, y − b + y_i) + t_G(x + a + x_i, y + y_i) + t_B(x + a + x_i, y + b + y_i)]    (3)
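As a purely digital illustration of the arrangement in Eqs. (1)–(3), the following Python/NumPy sketch assembles the monochromatic joint input from the RGB channels of a reference and a target image. The canvas size, the offsets a and b, and the sub-image size are illustrative assumptions, not values taken from the paper, and the sub-images must be small enough to fit the chosen offsets.

import numpy as np

def make_joint_input(ref_rgb, tgt_rgb, a=160, b=96, canvas=(512, 512)):
    # Build the monochromatic joint input of Eq. (3): the three channels of the
    # reference are stacked vertically (spacing b) on the left half of the canvas,
    # the three channels of the target on the right half, as in Fig. 1.
    # a, b and the canvas size are illustrative choices, not values from the paper.
    # Reference and target are assumed to have the same height and width.
    H, W = canvas
    h, w = ref_rgb.shape[:2]
    f = np.zeros((H, W), dtype=np.float64)
    cy, cx = H // 2, W // 2

    def paste(channel, dx, dy):
        # add one channel with its centre at (cx + dx, cy + dy)
        y0, x0 = cy + dy - h // 2, cx + dx - w // 2
        f[y0:y0 + h, x0:x0 + w] += channel

    for ch, dy in zip(range(3), (-b, 0, b)):      # R, G, B rows of the reference, Eq. (1)
        paste(ref_rgb[..., ch].astype(np.float64), -a, dy)
    for ch, dy in zip(range(3), (-b, 0, b)):      # R, G, B rows of the target, Eq. (2)
        paste(tgt_rgb[..., ch].astype(np.float64), +a, dy)
    return f

With, for example, 128 × 128 sub-images, the default offsets keep all six channels inside the 512 × 512 canvas, which matches the resolution of the amplitude SLM used later in the experiments.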


Fig. 1. Arrangement of the new monochromatic reference and target images displayed on the input SLM.

The Fourier spectrum of the joint input image is

F(u, v) = FT2{f(x, y)}
= FT2[r_R(x − a, y − b) + r_G(x − a, y) + r_B(x − a, y + b)] + FT2[t_R(x + a + x_i, y − b + y_i) + t_G(x + a + x_i, y + y_i) + t_B(x + a + x_i, y + b + y_i)]
= R_R(u, v) exp(−iua − ivb) + R_G(u, v) exp(−iua) + R_B(u, v) exp(−iua + ivb) + T_R(u, v) exp[iu(a + x_i) − iv(b − y_i)] + T_G(u, v) exp[iu(a + x_i) + ivy_i] + T_B(u, v) exp[iu(a + x_i) + iv(b + y_i)]    (4)

where FT2 denotes the two-dimensional Fourier transform, R_R, R_G, R_B and T_R, T_G, T_B are the Fourier transforms of the corresponding channels of r(x, y) and t(x, y), and φ_RR, φ_RG, φ_RB, φ_TR, φ_TG, φ_TB denote their phases. The joint power spectrum (JPS) is the squared magnitude of the spectrum:

JPS(u, v) = |F(u, v)|²
= |R_R(u, v)|² + |R_G(u, v)|² + |R_B(u, v)|² + |T_R(u, v)|² + |T_G(u, v)|² + |T_B(u, v)|²
+ 2|R_R(u, v)||R_G(u, v)| cos[φ_RR(u, v) − φ_RG(u, v) − vb]
+ 2|R_R(u, v)||R_B(u, v)| cos[φ_RR(u, v) − φ_RB(u, v) − 2vb]
+ 2|R_R(u, v)||T_R(u, v)| cos[φ_RR(u, v) − φ_TR(u, v) − 2ua − ux_i − vy_i]
+ 2|R_R(u, v)||T_G(u, v)| cos[φ_RR(u, v) − φ_TG(u, v) − 2ua − ux_i − vb − vy_i]
+ 2|R_R(u, v)||T_B(u, v)| cos[φ_RR(u, v) − φ_TB(u, v) − 2ua − ux_i − 2vb − vy_i]
+ 2|R_G(u, v)||R_B(u, v)| cos[φ_RG(u, v) − φ_RB(u, v) − vb]
+ 2|R_G(u, v)||T_R(u, v)| cos[φ_RG(u, v) − φ_TR(u, v) − 2ua − ux_i + vb − vy_i]
+ 2|R_G(u, v)||T_G(u, v)| cos[φ_RG(u, v) − φ_TG(u, v) − 2ua − ux_i − vy_i]
+ 2|R_G(u, v)||T_B(u, v)| cos[φ_RG(u, v) − φ_TB(u, v) − 2ua − ux_i − vb − vy_i]
+ 2|R_B(u, v)||T_R(u, v)| cos[φ_RB(u, v) − φ_TR(u, v) − 2ua − ux_i + 2vb − vy_i]
+ 2|R_B(u, v)||T_G(u, v)| cos[φ_RB(u, v) − φ_TG(u, v) − 2ua − ux_i + vb − vy_i]
+ 2|R_B(u, v)||T_B(u, v)| cos[φ_RB(u, v) − φ_TB(u, v) − 2ua − ux_i − vy_i]
+ 2|T_R(u, v)||T_G(u, v)| cos[φ_TR(u, v) − φ_TG(u, v) − vb]
+ 2|T_R(u, v)||T_B(u, v)| cos[φ_TR(u, v) − φ_TB(u, v) − 2vb]
+ 2|T_G(u, v)||T_B(u, v)| cos[φ_TG(u, v) − φ_TB(u, v) − vb]    (5)

The correlation output is obtained by applying an inverse Fourier transform to Eq. (5); 15 peaks appear in the correlation output plane. The first six terms of Eq. (5) produce a strong autocorrelation at the centre of the output plane, while the cross-correlation terms between the same channels carry the displacement information (x_i, y_i) and are expressed as

C(u, v) = 2|R_R(u, v)||T_R(u, v)| cos[φ_RR(u, v) − φ_TR(u, v) − 2ua − ux_i − vy_i]
+ 2|R_G(u, v)||T_G(u, v)| cos[φ_RG(u, v) − φ_TG(u, v) − 2ua − ux_i − vy_i]
+ 2|R_B(u, v)||T_B(u, v)| cos[φ_RB(u, v) − φ_TB(u, v) − 2ua − ux_i − vy_i]    (6)

Applying an inverse Fourier transform to Eq. (6) gives

o(x, y) = r_R(x, y) ⊗ t_R(x, y) ⊗ δ(x ± (2a + x_i), y ± y_i)
+ r_G(x, y) ⊗ t_G(x, y) ⊗ δ(x ± (2a + x_i), y ± y_i)
+ r_B(x, y) ⊗ t_B(x, y) ⊗ δ(x ± (2a + x_i), y ± y_i)    (7)

where δ denotes the Dirac delta function. The cross-correlation peaks that contain the displacement information are located at (2a + x_i, y_i) and (−2a − x_i, −y_i), so the displacement (x_i, y_i) is obtained by finding this pair of peaks.
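Digitally, the optical chain of Eqs. (4)–(7) amounts to a Fourier transform, a squared-magnitude (JPS) step and an inverse transform, followed by a peak search. The sketch below is a minimal emulation under stated assumptions: it reuses the joint input from the helper above, the guard size used to suppress the central autocorrelation is arbitrary, and the recovered signs depend on the FFT convention, so in practice one would calibrate with a known shift. Because the three same-channel terms of Eq. (6) superpose at (2a + x_i, y_i), the strongest off-centre peak is normally the one of interest.

import numpy as np

def measure_displacement(f, a, guard=8):
    # Digital emulation of Eqs. (4)-(7): Fourier transform, joint power
    # spectrum, inverse transform, then locate a cross-correlation peak.
    F = np.fft.fft2(f)                                   # Eq. (4)
    jps = np.abs(F) ** 2                                 # Eq. (5)
    corr = np.abs(np.fft.fftshift(np.fft.ifft2(jps)))    # correlation plane, Eq. (7)

    H, W = corr.shape
    cy, cx = H // 2, W // 2
    corr[cy - guard:cy + guard, cx - guard:cx + guard] = 0   # suppress central autocorrelation

    # The pair of peaks carrying (x_i, y_i) sits at +/-(2a + x_i, y_i);
    # search one half-plane only and convert the peak position to a shift.
    half = corr[:, cx + guard:]
    py, px = np.unravel_index(np.argmax(half), half.shape)
    xi = (px + guard) - 2 * a      # horizontal offset beyond the 2a separation
    yi = py - cy                   # signs may flip with the FFT convention
    return xi, yi

A simple check is to shift a copy of the reference by a known integer amount, build the joint input with the helper above, and verify that the returned values match the applied shift up to the sign convention.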

3. Simulation and experiment results

A simulation result is given first. The color reference image is shown in Fig. 2(a) and the color target image in Fig. 2(b); both have a resolution of 256 × 256 pixels, and the real displacement between them is (−9.32, 9.43) pixels. The joint input image, containing the three channels of part of the reference and target images, is shown in Fig. 3(a); 2D and 3D plots of the correlation output are shown in Fig. 3(b) and (c). The position of the cross-correlation peak gives a measured displacement of (−9.21, 9.31) pixels, so the absolute errors are (0.11, 0.12) pixels.

Fig. 2. (a) The color reference image. (b) The color target image.

Fig. 3. Color image displacement measurement using JTC. (a) The monochromatic input image. (b) 2D plot of the correlation output. (c) 3D plot of the correlation output.

Experiments were carried out to verify the proposal; the experimental set-up is shown in Fig. 4. The new monochromatic input image is displayed on SLM1, and CCD1 captures its joint power spectrum (JPS) at the back focal plane of Fourier-transform (FT) lens 1. The JPS is then displayed on SLM2 through Computer 2, and after another Fourier transformation CCD2 captures the correlation output at the back focal plane of FT lens 2. The laser wavelength is 650 nm and the focal length of the FT lenses is 300 mm. The SLM is an amplitude-modulation device made by BNS (Model A512), with a resolution of 512 × 512 pixels and a maximum frame rate of 1024 Hz. A Mikrotron MC1360 camera, which delivers up to 506 frames per second at its full resolution of 1280 × 1024 pixels and up to 120,000 frames per second at reduced resolution, is used as CCD1 and CCD2. Finally, digital processing is applied to the correlation output to obtain the displacement with sub-pixel precision.
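The sub-pixel refinement applied to the correlation output is not detailed in the paper; a common choice, shown here only as an assumed illustration, is a three-point parabolic fit around the integer peak along each axis.

def subpixel_peak(corr, py, px):
    # Refine an integer peak location (py, px) in the correlation plane `corr`
    # with a three-point parabolic fit along each axis. This is an assumed
    # post-processing step; the paper's own sub-pixel routine is not described.
    def vertex(fm1, f0, fp1):
        d = fm1 - 2.0 * f0 + fp1
        return 0.0 if d == 0 else 0.5 * (fm1 - fp1) / d
    dy = vertex(corr[py - 1, px], corr[py, px], corr[py + 1, px])
    dx = vertex(corr[py, px - 1], corr[py, px], corr[py, px + 1])
    return py + dy, px + dx

Applied to the peak found in the correlation plane, this refines the integer estimate to a fractional one, which is in line with the sub-pixel errors reported below.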

Fig. 4. Experimental set-up.

Fig. 5. Experimental result. (a) Input image. (b) Spectrum of (a). (c) Correlation output. (d) Correlation output captured when laser 2 is tuned down.

The input image is shown in Fig. 5(a) and its spectrum in Fig. 5(b). The correlation output is shown in Fig. 5(c), in which 15 peaks can be seen in the output plane. When the intensity of laser 2 is tuned down, the cross-correlation peaks are much easier to extract, as shown in Fig. 5(d). Performance on a video sequence is shown in Fig. 6(a) and (b), which plot the detection error on the x axis and the y axis respectively. The mean absolute errors are 0.18 and 0.14 pixels, showing that the technique detects the displacement between color images precisely.

Fig. 6. Performance over a sequence. (a) Errors on x axis. (b) Errors on y axis.

4. Conclusion

We have proposed a displacement measurement method for color images. Simulation and experimental results show that the proposed method detects displacement precisely and at a very high frame rate; up to 1000 correlations per second are achieved with the present optical set-up.

Acknowledgements

This research was supported by the National Natural Science Foundation of China (Grant No. 61178064), the National Basic Research Program (973) of China (Grant No. 2009CB724004) and the Fund for Creative Research of Aerospace Science and Technology.

References
[1] C.S. Weaver, J.W. Goodman, A technique for optically convolving two functions, Appl. Opt. 5 (7) (1966) 1248–1249.
[2] K. Janschek, V. Tchernykh, S. Dyblenko, Performance analysis of optomechatronic image stabilization for a compact space camera, Control Eng. Pract. 15 (2007) 333–347.
[3] H.W. Yi, H. Zhao, Y.C. Li, et al., Improved digital processing method used for image motion measurement based on hybrid opto-digital joint transform correlator, Chin. Opt. Lett. 8 (10) (2010) 989–992.
[4] P. Ge, Q. Li, H.J. Feng, Z.H. Xu, Displacement measurement using phase-encoded target joint transform correlator, Opt. Eng. 50 (3) (2011) 03820101–03820111.
[5] K. Janschek, V. Tchernykh, S. Dyblenko, G. Flandin, B. Harnisch, Compensation of focal plane image motion perturbations with optical correlator in feedback loop, in: Proceedings of SPIE – The International Society for Optical Engineering, Maspalomas, Spain, 2004, pp. 280–288.