High dynamic range imaging for fringe projection profilometry with single-shot raw data of the color camera

Yongkai Yin a, Zewei Cai b, Hao Jiang b, Xiangfeng Meng a, Jiangtao Xi c, Xiang Peng b,*

a Shandong Provincial Key Laboratory of Laser Technology and Application, Department of Optics, School of Information Science and Engineering, Shandong University, Jinan 250100, China
b Key Laboratory of Optoelectronic Devices and Systems of Ministry of Education and Guangdong Province, College of Optoelectronic Engineering, Shenzhen University, Shenzhen 518060, China
c School of Electrical, Computer and Telecommunications Engineering, University of Wollongong, Wollongong, NSW 2522, Australia

* Corresponding author. E-mail addresses: [email protected] (Y. Yin), [email protected] (X. Peng).
Article history: Received 26 January 2016; Received in revised form 3 June 2016; Accepted 27 August 2016

Abstract
Obtaining satisfactory 3D imaging results for shiny surfaces with fringe projection profilometry (FPP) is challenging, because the wide variation of surface reflectance on a shiny surface leads to bad exposure, which calls for a high dynamic range imaging (HDRI) technique. HDRI with monochromatic illumination and single-shot raw data of a color camera is presented in this paper. From the single-shot raw data, four monochrome sub-images corresponding to the R, G, G and B channels are separated. After the attenuation ratios between the R&G and G&B channels are calibrated, an image with higher dynamic range can be synthesized from the four sub-images, which helps to avoid the impact of bad exposure and improves the accuracy of the phase calculation. Experiments demonstrate the validity of the proposed method for shiny surfaces.
Keywords: Fringe projection profilometry; High dynamic range imaging (HDRI); Shiny surface; High reflectance; Single-shot; Attenuation ratio calibration
1. Introduction

Optical three-dimensional (3D) imaging, an important technique for recording and analyzing the real world, has developed rapidly under the push of growing demands and technological advances. It has been widely employed in various fields, including industrial manufacturing, cultural heritage, medical diagnosis and entertainment [1]. As a method with the advantages of high accuracy, high data density and flexible setup, fringe projection profilometry (FPP) has attracted a lot of attention for a long period [2]. From the working principle of FPP, it can be seen that a rough surface with approximately Lambertian reflectance is the most suitable target for 3D imaging with FPP. However, with the expanding application of FPP, the observation targets become more and more complex, which introduces quite different surface reflectance. Depending on the light reflectance, the surfaces of opaque objects can generally be divided into three categories: rough surfaces, shiny surfaces and specular surfaces [3]. As shown in Fig. 1, a shiny surface has reflection characteristics between those of a rough surface and a specular surface, presenting hybrid reflectance that contains partial diffuse reflection and partial
specular reflection. It is well known that, for a given incident light intensity, diffuse reflection leads to globally low reflected light intensity, while specular reflection leads to high reflected light intensity in specific directions. In this case, imaging of a shiny surface suffers from bad exposure, including under-exposure (corresponding to diffuse reflection areas) and over-exposure (corresponding to specular reflection areas), which has a negative impact on the accuracy of information acquisition. Therefore, it is hard to obtain highly accurate 3D reconstruction results for shiny surfaces with traditional FPP. However, as most objects made of metal or ceramic naturally have shiny surfaces and account for a large proportion of common industrial and civilian products, the disadvantages of traditional FPP for shiny surfaces greatly limit its application in the fields of industrial inspection and cultural heritage. The main attempts to realize 3D imaging of shiny surfaces with FPP focus on achieving high dynamic range imaging (HDRI) by modifying the illumination and acquisition. Kowarschik et al. [4] change the intensity of the structured illumination to eliminate bad exposure; meanwhile, to compensate for the influence of specular reflections or shadowed areas, up to 15 light projection directions are used. In order to avoid over- and under-exposure in the image, Koninckx et al. [5] adjust the local intensity ranges in the projected patterns based on a crude estimate of the scene geometry and reflectance characteristics. Ri et al. [6] define a phase reliability evaluation value using the Fourier transform (PREV/FT), which is
applied to merge data with different exposure times and acquisition directions. Zhang et al. [7] analyze the characteristics of the fringe images used in phase-shifting and then produce the final fringe images pixel by pixel from multi-exposure images, choosing for each pixel the brightest but unsaturated value among the exposures. Liu et al. [8] first set the minimal and maximal precisely measurable gray levels by experiment, then calculate a mask image sequence, and finally generate each pixel of the fringe images by selecting, from one set of fringe images, the brightest corresponding pixel within the set range. Jiang et al. [9] fuse raw fringe patterns acquired with different camera exposure times and illumination intensities to generate a synthetic fringe image; the fusion is achieved by choosing the pixels of the raw fringes with the highest fringe modulation intensity. In summary, some approaches change the projection direction with the help of an additional mechanical device or projector, which increases the complexity of the system. Some approaches reduce the maximal intensity of the projector to avoid over-exposure, which leads to lower fringe quality. All of these approaches rely on multiple exposures with different system configurations, at the expense of low time efficiency. Therefore, it is worthwhile to present a method that can generate a high dynamic range image within a relatively short time, which is the motivation of this work.

HDRI with single-shot raw data of the color camera is presented in this paper. A fringe pattern is projected with monochromatic illumination and the deformed fringe image is captured in single-shot fashion using a color camera. From the single-shot raw data, four monochrome sub-images corresponding to the R, G, G and B channels can be separated. The sub-images have different intensity attenuations because the quantum efficiencies of the R, G and B channels are quite different at a specific wavelength. After the attenuation ratios between the R&G and G&B channels are calibrated, a higher dynamic range (HDR) image can be synthesized from the four sub-images. The remainder of the paper is organized as follows: Section 2 briefly reviews the principle of FPP and explains the necessity of HDRI. Section 3 presents the details of the proposed HDRI with single-shot raw data. Section 4 shows a group of experimental results which demonstrate the validity of the proposed approach. Section 5 concludes the paper.

Fig. 1. The reflectance of shiny surface.

2. FPP for shiny surface

2.1. Principle of FPP

The typical work process of FPP is schematically shown in Fig. 2. Standard fringe patterns are generated and projected onto the surface of the object, and the deformed fringe images modulated by the 3D shape of the surface are captured by the camera. Then the wrapped phase map φw, modulo 2π, can be calculated and unwrapped into the continuous phase map φu, with which the 3D range image X can be reconstructed. Generally, there is a certain mapping from φu to X, expressed as X = h_θ(φu). The system parameters θ are determined by the system model and configuration and can be estimated through system calibration. Once the system is calibrated, the result of 3D reconstruction depends only on the phase information φu. Thus phase recovery is the key step of FPP. One of the most commonly used methods to calculate the wrapped phase is the well-known phase-shifting algorithm, which can be formulated as follows:

$$\varphi_w = \arctan\left[\sum_{k=1}^{N} I_k^c \sin\!\left(\frac{2\pi k}{N}\right) \Big/ \sum_{k=1}^{N} I_k^c \cos\!\left(\frac{2\pi k}{N}\right)\right] \qquad (1)$$

where N is the total number of phase-shifting steps and I_k^c is the k-th phase-shifted fringe image. If there is any abnormal factor when I_k^c is captured, extra error is introduced into the phase information, which then reduces the accuracy of the final 3D reconstruction.
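To make the phase-shifting step concrete, the following minimal NumPy sketch evaluates Eq. (1) for an N-step fringe sequence. The function name is illustrative, and the use of arctan2 (rather than a plain arctan of the ratio) is a choice made for this sketch to obtain the full wrapped range; it is not part of the original formulation.

```python
import numpy as np

def wrapped_phase(fringes):
    """Minimal sketch of the N-step phase-shifting formula of Eq. (1).

    fringes: array of shape (N, H, W) holding the N phase-shifted
    fringe images I_k^c, k = 1..N. Returns the wrapped phase map.
    """
    N = fringes.shape[0]
    k = np.arange(1, N + 1).reshape(-1, 1, 1)                # shift index k, broadcast over the image
    num = np.sum(fringes * np.sin(2.0 * np.pi * k / N), axis=0)
    den = np.sum(fringes * np.cos(2.0 * np.pi * k / N), axis=0)
    # arctan2 resolves the quadrant and handles den == 0, extending the
    # plain arctan of Eq. (1) to the full (-pi, pi] wrapped range.
    return np.arctan2(num, den)
```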
2.2. Captured fringe image from shiny surface

In general, the captured fringe image I_k^c can be written as:

$$I_k^c = s\left(r\, I_k^p + I_a\right), \quad k = 1, 2, \ldots, N \qquad (2)$$

where s is the camera sensitivity, r is the surface reflectance, I_k^p is the standard fringe pattern projected onto the surface, and I_a is the ambient light. For a given camera, s is a constant. When the system works in the dark, the ambient light can be ignored, i.e. I_a ≈ 0. The standard fringe pattern is generated in the computer, so I_k^p is known in advance. Practically, the only uncontrollable factor is the surface reflectance r, especially for a shiny surface. As shown in Fig. 1, the reflectance of a shiny surface is relatively small in general, while in some directions it becomes very large. Thus the intensity of the captured fringe image reflected from a shiny surface may vary over a large range. Acquiring such a fringe image with traditional methods leads to bad exposure and then introduces errors into the calculated phase map. Therefore, HDRI should be employed to avoid bad exposure and improve the accuracy of the phase calculation.
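As a hedged illustration of Eq. (2), the short sketch below simulates how a widely varying reflectance r drives the captured fringe of an 8-bit camera into under- or over-exposure. The sensitivity, reflectance values and saturation level are illustrative assumptions, not measured parameters of the system described later.

```python
import numpy as np

# Illustrative sketch of Eq. (2): I_k^c = s * (r * I_k^p + I_a), with sensor clipping.
s, I_a = 1.0, 0.0                                   # camera sensitivity, ambient light (dark room)
x = np.arange(1024)
I_p = 127.5 * (1 + np.cos(2 * np.pi * x / 64))      # projected sinusoidal fringe, range 0..255

for r in (0.05, 0.4, 3.0):                          # diffuse, moderate and near-specular reflectance
    I_c = np.clip(s * (r * I_p + I_a), 0, 255)      # assumed 8-bit camera saturates at 255
    print(f"r = {r:4.2f}: captured range {I_c.min():6.1f} to {I_c.max():6.1f}, "
          f"contrast {I_c.max() - I_c.min():6.1f}")
```

For small r the fringe contrast collapses toward the noise floor (under-exposure), while for large r the sinusoid is clipped at saturation (over-exposure); both cases corrupt the phase computed with Eq. (1).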
3. HDRI with color camera

Traditional HDRI for FPP of shiny surfaces mainly relies on multiple exposures with different system configurations. Either the illumination or the acquisition settings, or both, are modified between exposures to achieve different responses to the intensity of the fringe image, and the final HDR image is generated by selection from, or combination of, the multiple images. However, due to the multiple exposures, all of these approaches come at the expense of low time efficiency. To overcome this disadvantage, HDRI with single-shot raw data of the color camera is proposed.

3.1. Principle

In a single-chip color camera, the Bayer filter mosaic shown in Fig. 3(a) is commonly used to separate pixels into R, G and B channels. Due to the wavelength selectivity of the Bayer filter, the three channels have different quantum efficiency curves versus wavelength. Therefore, with monochromatic illumination, e.g. blue light, the quantum efficiencies of the R, G and B channels satisfy e_b > e_g > e_r. In this case, the Bayer filter can be regarded as a pixel-varying neutral density filter [10], whose brightness represents its intensity attenuation, as shown in Fig. 3(b). A pixel with high attenuation, such as e_r, can accept very high luminance without saturation, while a pixel with low attenuation, such as e_b, is more sensitive to low luminance, as shown in Fig. 3(c).
Fig. 2. Schematic work process of FPP: projection of Ip (standard fringe patterns) onto the object, acquisition of Ic (deformed fringe images), phase calculation (φw: wrapped phase), phase unwrapping (φu: unwrapped phase), and 3D reconstruction with system calibration (X: range image).
Fig. 3. HDRI with color camera. (a) Bayer filter, (b) Corresponding neutral density filter, (c) Sensitivity of different attenuation.
When the fringe pattern is projected with monochromatic illumination and the deformed fringe image is captured in single-shot fashion using a color camera, four monochrome sub-images corresponding to the R, G, G and B channels can be separated from the raw data. Then the attenuation ratios between the R&G and G&B channels, denoted r_{r/g} and r_{g/b}, can be calibrated. Finally, the HDR image can be synthesized from the four sub-images and the calibrated attenuation ratios.
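The sub-image separation can be sketched as follows. An RGGB mosaic layout is assumed here for illustration; the actual layout depends on the camera, so the four slices may need to be permuted.

```python
import numpy as np

def split_bayer(raw):
    """Split single-shot Bayer raw data into four monochrome sub-images.

    Assumes an RGGB mosaic: R at (0, 0), G at (0, 1) and (1, 0), B at (1, 1).
    raw: 2-D array of the undemosaiced sensor data.
    Returns (I_r, I_g1, I_g2, I_b), each at half the raw resolution.
    """
    I_r  = raw[0::2, 0::2].astype(np.float64)   # red-filtered pixels
    I_g1 = raw[0::2, 1::2].astype(np.float64)   # first green-filtered pixels
    I_g2 = raw[1::2, 0::2].astype(np.float64)   # second green-filtered pixels
    I_b  = raw[1::2, 1::2].astype(np.float64)   # blue-filtered pixels
    return I_r, I_g1, I_g2, I_b
```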
3.2. Technical details

In practice, neither the quantum efficiency curves of the camera nor the illumination wavelength of the projector is exactly known. Thus the attenuation ratios r_{r/g} and r_{g/b}, which are determined by the quantum efficiencies at the illumination wavelength, need to be estimated beforehand. The process of estimating these ratios is called attenuation ratio calibration. If the scene radiance is assumed to vary smoothly, adjacent pixels can be considered to receive the same radiance, so their relative intensity values are only influenced by their attenuations. Meanwhile, owing to the mosaic arrangement of the Bayer filter, the single-shot raw data contains all three color channels, i.e., any 2×2 block of adjacent pixels always contains the three different attenuations corresponding to e_r, e_g and e_b. Based on the above premises, the attenuation ratios can be calibrated with the following process (a code sketch is given at the end of this subsection):

1. Separate the raw data into four sub-images corresponding to the R, G, G and B channels.
2. Discard pixels with nearly saturated or very low intensity values using appropriate thresholds.
3. Detect the commonly reserved pixels between sub-images R&G and between sub-images G&B.
4. Estimate the attenuation ratios r_{r/g} and r_{g/b} from the intensity values of the common pixels.

After attenuation ratio calibration, the intensities I_i of the different sub-images can be unified to the same attenuation level to generate the HDR image:

$$I_{HDR} = \sum_i w_i \frac{I_i}{r_i}, \quad i = r, g_1, g_2, b \qquad (3)$$

where r_i is the attenuation ratio of each I_i relative to I_b:

$$r_b = 1, \quad r_{g_1} = r_{g_2} = r_{g/b}, \quad r_r = r_{r/g}\, r_{g/b} \qquad (4)$$

and w_i are weights that help to keep the intensity fusion smooth [11]:

$$w_i = \begin{cases} 2\,(I_i - I_{min})/(I_{max} - I_{min}), & I_i \le 0.5\,(I_{max} + I_{min}) \\ 2\,(I_{max} - I_i)/(I_{max} - I_{min}), & I_i > 0.5\,(I_{max} + I_{min}) \end{cases}, \quad i = r, g_1, g_2, b \qquad (5)$$

It is reasonable that intensities close to I_{max} or I_{min} are assigned low weights, as they usually suffer from bad exposure and thus have low reliability.
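A minimal sketch of calibration steps 2 to 4 and of the fusion in Eqs. (3)-(5) is given below, assuming the four sub-images have already been separated. The exposure thresholds, the 12-bit full scale, and the use of a median ratio (instead of the kernel-smoothing histogram fit described in Section 4) are assumptions of this sketch, not choices stated in the paper.

```python
import numpy as np

def calibrate_ratio(I_num, I_den, lo=0.1, hi=0.9, full_scale=4095.0):
    """Steps 2-4: keep commonly well-exposed pixels of two sub-images and
    robustly estimate their intensity ratio I_num / I_den. A median is used
    here for brevity; Section 4 fits the ratio histogram with a
    kernel-smoothing distribution instead."""
    common = ((I_num > lo * full_scale) & (I_num < hi * full_scale) &
              (I_den > lo * full_scale) & (I_den < hi * full_scale))
    return np.median(I_num[common] / I_den[common])

def fuse_hdr(I_r, I_g1, I_g2, I_b, r_rg, r_gb, full_scale=4095.0):
    """Synthesize the HDR image from the four sub-images with Eqs. (3)-(5)."""
    subs   = (I_r, I_g1, I_g2, I_b)
    ratios = (r_rg * r_gb, r_gb, r_gb, 1.0)        # Eq. (4): attenuation relative to B
    I_min, I_max = 0.0, full_scale
    I_mid = 0.5 * (I_max + I_min)
    I_hdr = np.zeros_like(I_b, dtype=np.float64)
    for I_i, r_i in zip(subs, ratios):
        w = np.where(I_i <= I_mid,                 # Eq. (5): tent-shaped weights
                     2.0 * (I_i - I_min) / (I_max - I_min),
                     2.0 * (I_max - I_i) / (I_max - I_min))
        # Eq. (3): unify each sub-image to the B attenuation level and accumulate.
        # Normalizing by the sum of weights is a common variant in multi-exposure
        # fusion, but it is not written in Eq. (3) and is omitted here.
        I_hdr += w * I_i / r_i
    return I_hdr
```

With the ratios reported in Section 4 (about 0.195 for R&G and 0.283 for G&B), the fusion would be called as fuse_hdr(I_r, I_g1, I_g2, I_b, 0.195, 0.283).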
4. Experiments
To verify the proposed method, a piece of machined aluminum is taken as the shiny surface and illuminated with a blue fringe pattern from a Dell M110 projector; the raw data is acquired with a Nikon D200 digital single lens reflex (SLR) camera. The captured single-shot raw data under uniform illumination is shown in Fig. 4.
Fig. 4. Single-shot raw data and an enlarged view of partial image (marked with a green square). (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
Fig. 5. Separated sub-images corresponding to R, G, G, B channels respectively. (a) R channel, (b) G1 channel, (c) G2 channel, (d) B channel.
Fig. 6. Masks which demonstrate the commonly reserved pixels between sub-images after discarding pixels with nearly saturated or very low intensity values. (a) Sub-images R&G, (b) Sub-images G&B.
Fig. 7. Histograms of the attenuation ratio and the fitted nonparametric kernel-smoothing distribution (the red curve): (a) attenuation ratio between R&G, (b) attenuation ratio between G&B. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
Fig. 8. Results of generating the HDR image. (a) The generated HDR image; to display it more clearly, the dynamic range of the intensity has been compressed with tone mapping. (b), (c) The real intensity distributions (intensity versus pixel position) along two sample lines in (a). (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
In Fig. 4 there is significant over-exposure due to the high reflectance, and the mosaic arrangement of the different attenuations can be clearly seen in the enlarged view. The four sub-images separated from the raw data of the deformed fringe image within the region of interest are shown in Fig. 5. For illumination with blue light, the R channel has the largest attenuation while the B channel has the smallest, as illustrated in Fig. 5(a) and (d); the two G channels have almost the same attenuation, as shown in Fig. 5(b) and (c). In order to calibrate the attenuation ratios more reliably, pixels with nearly saturated (I > 0.9 Imax) or very low (I < 0.1 Imax) intensity values are discarded. The commonly reserved pixels between sub-images R&G and G&B are then detected and shown as the masks in Fig. 6. From the intensity values of the common pixels, the histograms of the attenuation ratios between R&G and G&B are fitted with a nonparametric kernel-smoothing distribution, which gives the optimal attenuation ratios (0.195 for R&G, 0.283 for G&B), as shown in Fig. 7. The final HDR image generated with Eq. (3) is shown in Fig. 8(a), where the image has been processed with a tone mapping technique to make it suitable for display. Furthermore, the real intensity distributions along two sample lines, marked with green lines in Fig. 8(a), are plotted in Fig. 8(b) and (c). It can be seen that the HDR image has good fringe contrast throughout the whole view, regardless of whether there is extreme reflectance.
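The histogram-based ratio estimation can be sketched as follows. SciPy's gaussian_kde is used here as an assumed stand-in for the nonparametric kernel-smoothing distribution mentioned above, and the grid size is arbitrary; the paper does not specify the fitting tool.

```python
import numpy as np
from scipy.stats import gaussian_kde

def ratio_mode(ratios, grid_pts=512):
    """Estimate the most probable attenuation ratio from the per-pixel ratio
    samples of the common (well-exposed) pixels by fitting a Gaussian kernel
    density and taking its peak."""
    ratios = np.asarray(ratios, dtype=np.float64)
    kde = gaussian_kde(ratios)                      # kernel-smoothed density of the ratio samples
    grid = np.linspace(ratios.min(), ratios.max(), grid_pts)
    return grid[np.argmax(kde(grid))]               # location of the density peak
```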
Fig. 9. The local fitting errors of the phase maps recovered from the R, G1, G2, B and HDR fringe images, plotted against the average intensity (low to high) of the selected regions.
To further demonstrate the effectiveness of the proposed approach, a fringe pattern sequence consisting of 5-step phase-shifting patterns and Gray code patterns is employed to reconstruct the surface of the aluminum workpiece. For a calibrated FPP system, the accuracy of 3D reconstruction depends only on that of the phase map. Therefore, the performance of the proposed HDRI algorithm can be evaluated by comparing the accuracy of the phase maps recovered from different fringe sequences. As the surface of the aluminum workpiece is flat, the phase map within a selected region can be fitted with a bivariate polynomial, and the local fitting error is a good indicator of the accuracy of the phase map. From the captured raw data, four sequences of fringe images corresponding to the R, G, G and B channels are extracted, and from these the sequence of HDR fringe images is generated. Five phase maps are then recovered from the five fringe sequences corresponding to the R, G, G, B and HDR images. In order to make an overall evaluation, 20 regions of 50×50 pixels are randomly selected throughout the whole phase map and sorted according to the average intensity within each region in the HDR fringe image. For each phase map, the fitting error is calculated in the same 20 regions and arranged in the same order of ascending average intensity to obtain a broken line. The five lines corresponding to the local fitting errors of the phase maps from the R, G, G, B and HDR fringe images are shown in Fig. 9. In order to clearly show the difference between adjacent lines, the display range of the fitting error is limited to [0, 0.1], so some large values of line B exceed the display area. It is obvious that under-exposure (low average intensity), which relates to the left end of line R, and over-exposure (high average intensity), which relates to the right ends of lines G and B, lead to larger fitting errors, while the HDR line almost always has the smallest fitting error throughout the full range of average intensity. This indicates that the proposed HDRI algorithm can help to avoid the impact of bad exposure and improve the accuracy of phase calculation for shiny surfaces. The reconstructed 3D surface of the aluminum workpiece obtained with the proposed HDRI algorithm is shown in Fig. 10. There is no significantly rugged area over the whole surface, which means the negative impact of the shiny surface has been well controlled.
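For reference, the local fitting-error evaluation described above can be sketched as follows. The polynomial degree and the use of the RMS residual as the error metric are assumptions of this sketch, since the paper does not state them explicitly.

```python
import numpy as np

def local_fit_error(phase_patch, deg=2):
    """Fit a bivariate polynomial of total degree `deg` to an unwrapped-phase
    patch (e.g. 50x50 pixels) by least squares and return the RMS residual,
    used as the local fitting error of the phase map."""
    h, w = phase_patch.shape
    y, x = np.mgrid[0:h, 0:w]
    x, y, z = x.ravel(), y.ravel(), phase_patch.ravel()
    # Design matrix with all monomials x^p * y^q such that p + q <= deg
    cols = [x**p * y**q for p in range(deg + 1) for q in range(deg + 1 - p)]
    A = np.stack(cols, axis=1).astype(np.float64)
    coeff, *_ = np.linalg.lstsq(A, z, rcond=None)
    residual = z - A @ coeff
    return np.sqrt(np.mean(residual**2))
```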
5. Conclusion

HDRI with single-shot raw data of the color camera is proposed for FPP of shiny surfaces. The proposed approach requires only a single exposure, which greatly reduces the time cost. The shiny surface is illuminated with a monochromatic fringe pattern and the deformed fringe image is captured in single-shot fashion using a color camera. Sub-images corresponding to the R, G, G and B channels are separated from the single-shot raw data, and the attenuation ratios between the R&G and G&B channels are calibrated, with which the final HDR image can be generated. With the synthetic HDR images, the accuracy of the phase calculation is improved and thus the negative impact of the shiny surface is controlled. The validity of the proposed approach is demonstrated with a group of experimental results.
Acknowledgement

The financial support from the National Natural Science Foundation of China (NSFC) under grants 61405122, 61275014 and 61377017, and from the Scientific and Technological Project of Shenzhen Government (JCYJ20140509172609158, JCYJ20140828163633999), is gratefully acknowledged. This work is also supported by the Fundamental Research Funds of Shandong University and the Key Laboratory of Optoelectronic Devices and Systems of Ministry of Education and Guangdong Province (GD201608).
Fig. 10. The reconstructed 3D surface of the aluminum workpiece with the proposed HDRI algorithm.
References

[1] Sansoni G, Trebeschi M, Docchio F. State-of-the-art and applications of 3D imaging sensors in industry, cultural heritage, medicine, and criminal investigation. Sensors 2009;9(1):568–601.
[2] Gorthi SS, Rastogi P. Fringe projection techniques: whither we are? Opt Lasers Eng 2010;48(2):133–40.
[3] Nayar SK, Ikeuchi K, Kanade T. Surface reflection: physical and geometrical perspectives. IEEE Trans Pattern Anal Mach Intell 1991;13(7):611–34.
[4] Kowarschik R, Kuhmstedt P, Gerber J, Schreiber W, Notni G. Adaptive optical three-dimensional measurement with structured light. Opt Eng 2000;39(1):150–8.
[5] Koninckx TP, Peers P, Dutre P, Van Gool L. Scene-adapted structured light. In: Proceedings of IEEE conference on computer vision and pattern recognition (CVPR 2005); 2005. p. 611–18.
[6] Ri S, Fujigaki M, Morimoto Y. Phase reliability evaluation in phase-shifting method using Fourier transform for shape measurement. Opt Eng 2005;44(8):083601.
[7] Zhang S, Yau S-T. High dynamic range scanning technique. Opt Eng 2009;48(3):033604.
[8] Liu G-H, Liu X-Y, Feng Q-Y. 3D shape measurement of objects with high dynamic range of surface reflectivity. Appl Opt 2011;50(23):4557–65.
[9] Jiang H, Zhao H, Li X. High dynamic range fringe acquisition: a novel 3-D scanning technique for high-reflective surfaces. Opt Lasers Eng 2012;50(10):1484–93.
[10] Nayar SK, Mitsunaga T. High dynamic range imaging: spatially varying pixel exposures. In: Proceedings of IEEE conference on computer vision and pattern recognition (CVPR 2000), vol. 1; 2000. p. 472–9.
[11] Reinhard E, Heidrich W, Debevec P, Pattanaik S, Ward G, Myszkowski K. High dynamic range imaging: acquisition, display, and image-based lighting. 2nd ed. San Francisco: Morgan Kaufmann; 2010.