Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques


Optics and Lasers in Engineering 50 (2012) 1097–1106


Z.H. Zhang
School of Mechanical Engineering, Hebei University of Technology, Tianjin 300130, China


Abstract

Available online 26 January 2012

Full-field fringe projection techniques have been widely studied in academia and applied in industry because of their non-contact operation, fast full-field acquisition, high accuracy and automatic data processing. A phase map is calculated from one or multiple fringe pattern images captured on the measured object surface; the corresponding approaches are known as single-shot and multiple-shot 3D measurement methods. Although multiple-shot methods can give highly accurate data when measuring static objects, they can be degraded by disturbances, such as vibration and environmental noise, occurring between image captures. Single-shot methods, by contrast, are insensitive to vibrational noise because only one image is captured. Therefore, various single-shot methods have been actively researched recently with the advent of new imaging and projecting devices. This paper reviews single-shot 3D shape measurement techniques that project and capture one fringe pattern image on the object surface, together with the algorithms for demodulating the wrapped phase from that single captured image. Challenging problems and future research directions are discussed to advance single-shot 3D shape measurement techniques. © 2012 Elsevier Ltd. All rights reserved.

Keywords: Single-shot measurement; Fringe analysis; Phase measurement; Optical metrology; Three-dimensional shape measurement

1. Introduction

Full-field fringe projection techniques have been widely studied in academia and applied in industry because of their non-contact operation, full-field acquisition, high accuracy, and automatic, fast data processing [1–3]. The fringe patterns can be sinusoidal [4], binary or Ronchi [5,6], triangular [7,8], trapezoidal [9], sawtooth [10], and so on. Among them, sinusoidal and Ronchi fringe patterns are the most studied because sub-fringe phase modulo 2π can be obtained from one or multiple captured fringe pattern images; the corresponding approaches are called single-shot and multiple-shot 3D measurement methods, respectively. Although multiple-shot methods give highly accurate measurement data for static objects by capturing three or more fringe pattern images, they can be degraded by disturbances, such as vibration and movement, occurring between image captures. Some researchers have tried to solve this problem by using high-speed CCD cameras and digital micromirror device (DMD)-based projection systems (for example, the DLP Discovery 4100 from Texas Instruments) to rapidly acquire three phase-shifted fringe pattern images [11]. Although the wrapped phase map of a moving object can be measured by the three-step phase-shift algorithm, the relative movement between successively captured

E-mail addresses: [email protected], [email protected].
doi:10.1016/j.optlaseng.2012.01.007

images shifts the phase by an unknown random amount, so the method gives inaccurate phase (shape) data. Therefore, in order to measure moving objects and to be robust against vibration, a good solution is to capture only one gray or color image to calculate the phase map, instead of multiple fringe pattern images. Single-shot methods are insensitive to vibrational noise since only one image needs to be captured. Until now, various single-shot methods have been actively studied, and they can generally be classified into discrete and continuous coding methods [12]. Single-shot discrete coding patterns present a digital profile having the same intensity value over the region represented by the same codeword, so the size of this region determines the density of the measured surface data. In most cases a discrete code occupies several pixels, so the resolution is low and the obtained 3D shape data are inaccurate for objects with complex surfaces. Although some discrete coding methods can rapidly measure an object surface from a single-shot image [13–17], they are not within the scope of this review. Single-shot continuous coding patterns present a smooth profile where every pixel has a unique codeword in intensity or color, so that the obtained data form a dense surface reconstruction of the measured object shape. Among single-shot continuous coding methods, phase calculation-based techniques are widely utilized because of the high accuracy of the measured phase map. Many algorithms have been studied to calculate phase data, for


example, the multiple-step phase-shift algorithm [18], the Fourier transform algorithm [19,20], the windowed Fourier transform algorithm [21–24], the wavelet transform algorithm [25–29], the local model fitting algorithm [30–32], the spatial phase-shift algorithm [33–35], the frequency-guided sequential demodulation algorithm [36–38], and the regularized phase tracking algorithm [39,40], to name a few. These algorithms calculate the wrapped phase map, modulo 2π, which must be unwrapped to obtain 3D shape data. In principle, phase unwrapping is a straightforward process for a smooth object surface: phase differences greater than π between spatially neighboring pixels are identified and compensated by multiples of 2π. However, actual deformed fringe patterns may contain noise, large slopes and/or discontinuities, and these difficulties have led to many developments in phase unwrapping methods. With the advent of color CCD cameras and color projectors, especially DMD-based projection devices, the red, green and blue channels of color images have been used as carriers to code fringe patterns [41–50]. In each color channel, the extracted fringe pattern can be regarded as a normal gray one, which means that a composite color RGB image contains three fringe patterns. However, in order to cover the whole visible spectrum, most color CCD cameras and projectors have overlapping spectral responses, so crosstalk between color channels is unavoidable. Another challenging issue in color fringe projection is chromatic aberration, which comes from the optical lenses of the imaging and projection system. Most recently, color fringe projection techniques have been utilized to measure colorful objects [51,52]. This paper reviews existing single-shot fringe projection techniques for measuring the 3D shape of object surfaces based on phase calculation. The following section briefly introduces the general procedure of calculating 3D shape data from a single-shot fringe pattern image.
Then, following the stages of capturing and processing a fringe pattern image, the different strategies to generate one gray or color fringe pattern image and the algorithms to calculate the wrapped phase map from the single captured image are elaborated in Sections 3 and 4, respectively. Some experimental results are demonstrated in Section 5 by projecting one composite RGB fringe pattern image onto a human hand and demodulating the wrapped phase data via the wavelet transform algorithm. Section 6 discusses the challenging problems and future research directions for advancing single-shot 3D measurement techniques. Finally, concluding remarks are given in Section 7.
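As background for the phase unwrapping step mentioned above, the elementary operation of detecting and compensating 2π jumps in a one-dimensional phase signal can be sketched as follows (a minimal illustration assuming NumPy; the function name is illustrative and the result matches numpy.unwrap for smooth signals):

```python
import numpy as np

def unwrap_1d(wrapped):
    """Unwrap a 1D wrapped-phase signal by detecting jumps larger than
    pi between neighbouring samples and restoring the missing multiples
    of 2*pi (the basic operation behind numpy.unwrap)."""
    wrapped = np.asarray(wrapped, dtype=float)
    diff = np.diff(wrapped)
    # wrap each neighbouring difference back into (-pi, pi]
    corrected = (diff + np.pi) % (2 * np.pi) - np.pi
    return np.concatenate(([wrapped[0]], wrapped[0] + np.cumsum(corrected)))
```

This only succeeds when the true phase changes by less than π between neighbouring pixels, which is exactly why noise, large slopes and discontinuities make practical unwrapping difficult.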

2. Procedure of single-shot 3D shape measurement

Phase calculation-based fringe projection profilometry mainly includes the following parts: fringe pattern generation and projection, acquisition of the deformed fringe pattern on the measured object surface, wrapped and unwrapped phase calculation, 3D calibration, and absolute phase to 3D shape conversion. The procedure of 3D shape measurement from a single-shot image is presented by the flowchart in Fig. 1. One or multiple sinusoidal fringe patterns can be formed using a Ronchi ruling, a grating, a laser interferometer or a computer-generated fringe pattern image, and projected onto the measured object surface by a projecting device. The fringe patterns are deformed according to the measured object shape, and an imaging device captures one deformed fringe pattern image. Normally, for single-shot full-field metrology, area CCD or CMOS cameras are used to acquire the deformed fringe patterns. Demodulating the phase information from the captured image yields the wrapped phase data, which need to be unwrapped for shape calculation. The obtained phase data correspond to the shape of the measured objects, and their relationship needs to be

Fig. 1. Flowchart of 3D shape measurement by a single-shot acquisition: generate an image containing one or multiple fringe patterns → project it onto the object surface with a projecting device → capture a single gray or color image with the deformed fringe pattern(s) using an area imaging device → demodulate the wrapped phase map from the captured image → unwrap the phase to obtain the phase map corresponding to the object shape → apply 3D calibration to obtain 3D shape data.

established by a procedure called 3D calibration. This paper reviews the existing fringe generation and projection methods and the techniques for demodulating wrapped phase data from one gray or color image. Phase unwrapping and 3D calibration techniques are beyond the scope of this paper; interested readers can refer to Refs. [53–59].
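The first stage of this procedure, generating a computer-generated sinusoidal fringe pattern image, can be sketched as follows (a minimal illustration assuming NumPy and an 8-bit projector image; the function name and parameters are illustrative):

```python
import numpy as np

def make_fringe_image(height, width, num_fringes, phase_shift=0.0):
    """Generate an 8-bit vertical sinusoidal fringe pattern image with
    `num_fringes` full periods across the width:
        I(x, y) = A + B*cos(2*pi*num_fringes*x/width + phase_shift),
    here with A = B = 127.5 to use the full 0..255 gray range."""
    x = np.arange(width)
    profile = 127.5 + 127.5 * np.cos(
        2 * np.pi * num_fringes * x / width + phase_shift)
    return np.tile(profile, (height, 1)).astype(np.uint8)
```

Such an image would then be sent to the projecting device, and the camera captures its deformation on the object surface.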

3. Generation of a single fringe pattern image

As described in Section 2, fringe patterns can be generated in different ways. Fringe patterns generated by software and then projected by digital projecting devices have been the most widely used in recent years. These devices have the advantages of flexibility, easy control and high contrast, especially with the advent of new projecting devices such as the digital light processing (DLP) Pico Projector, the DLP Light Commander Kit and the DLP Discovery Kit series. Single or multiple fringe patterns can be modulated into a single gray or color image. In order to easily calculate the phase map, especially the unwrapped phase map, fringe pattern coding schemes for gray and color images have been widely studied. In general, the strategies for fringe pattern generation are classified into gray-image and color-image methods. For a color image, the red, green and blue channels serve as carriers to code fringe patterns, and each fringe pattern in one color channel can be regarded as a gray fringe pattern image.

3.1. Gray image

Using a gray image means using only one channel, corresponding to one of the three red, green and blue channels of a color image. The gray image may contain one or multiple fringe patterns.

3.1.1. One fringe pattern

This is the simplest single-shot fringe pattern method, since the gray image contains just one fringe pattern for phase calculation. In the 1980s, Takeda introduced a one-shot sinusoidal fringe pattern to measure smooth object surfaces by the Fourier

transform algorithm [19]. Afterwards, many researchers applied one-fringe-pattern projection in practical applications [20], for example, vibration monitoring [60] and dynamic 3D shape measurement [61].

3.1.2. Multiple fringe patterns

This method modulates multiple fringe patterns into one gray image. These fringe patterns have a constant phase shift between them for wrapped phase calculation, or certain fringe numbers for phase unwrapping. One approach was developed based on concepts from communications theory, where the individual fringe patterns are spatially modulated by different carrier frequencies along the orthogonal direction and summed to generate a composite pattern image [62], as illustrated in Fig. 2(a). By combining multiple sinusoids with different two-component spatial carrier frequencies into one fringe pattern map, Takeda et al. proposed a single-shot acquisition technique to measure the 3D shape of objects having discontinuities and spatially isolated surfaces [63], as demonstrated in Fig. 2(b). Sansoni and Redaelli presented a one-shot method that directly overlaps two sinusoidal gratings with different frequencies into one fringe pattern image [64]. Later, Sansoni et al. proposed another method to calculate the wrapped phase information from a single Ronchi fringe pattern, but this method needs very complicated post-processing and is sensitive to noise [65]. Since two or more fringe patterns are accumulated in one image [62–65], the modulation of each fringe pattern has a small gray-level range and the phase resolution is poor. Therefore, these methods cannot be applied in high-accuracy measurement fields. Another novel single-shot fringe projection technique uses a micro-polarizer pixelated camera [66]. A phase mask having wire grids at angles of 0°, 45°, 90° and 135° was assembled into the CCD camera, so that phase shifts of 90° among four adjacent pixels can be realized.
However, this approach sacrifices spatial resolution because four phase-shifted fringe patterns are acquired by a single CCD array. Later, Rodriguez-Zurita et al. extended the technique to five, seven and nine interferograms, which means that the one-shot image contains five, seven or nine fringe patterns, respectively [67]. However, this technique needs to project polarized light, for example, laser light. A similar technique has been applied to unpolarized white light to realize single-shot measurement: by building up an accurate pixel-to-pixel correspondence between a CCD camera and a DMD to control each pixel independently, Ri et al. proposed a single-shot 3D shape measurement method that codes multiple fringe patterns into one image [68,69].

3.2. Color image

A color image contains red, green and blue channels, which are three carriers to code fringe patterns. The fringe pattern extracted from one color channel can be regarded as a gray image, so one color image carries three times the information of one gray image. According to the number of coded sinusoidal fringe patterns, this technique can be further classified into one-fringe-pattern and multiple-fringe-pattern methods.

3.2.1. One fringe pattern

Sitnik [35] presented a four-dimensional (4D, i.e. three-dimensional shape varying in time) shape measurement system based on one captured image. The projected pattern is composed of one sinusoidal intensity fringe and one color-encoded stripe. The sinusoidal intensity fringe is used to calculate the phase information, while the color-encoded stripe determines the absolute sinusoidal fringe order.

Fig. 2. Composite pattern images formed by (a) modulating traditional fringe patterns along the orthogonal direction and (b) multiplexing two fringe patterns.

Chen et al. and Su described a composite fringe pattern projection method that combines encoded color strips and sinusoidal intensity fringes into the same fringe pattern image for single-shot 3D shape acquisition [48,49], as shown in Fig. 3. The color strips give the absolute order of each sinusoidal fringe, whilst the sinusoidal fringes give sub-wavelength phase information. Since the strips and sinusoidal fringes overlap at each pixel, the captured color strips and the fringe modulation depth have low intensity. Therefore, it is difficult to identify the edges of the color strips and to obtain accurate phase information. In order to measure dynamic and spatially isolated objects, Su presented another coded composite fringe pattern method that overlaps sinusoidal fringes, binary stripes and color grids at each pixel position in one color image [47]. However, the use of binary


Fig. 3. Color fringe pattern images: (a) composite pattern containing color-encoded strips and sinusoidal intensity fringes and (b) color-encoded fringe pattern containing a set of color-encoded strips and sinusoidal fringes. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

stripes and color grids yields even lower intensity levels, and hence the edges become even more difficult to identify.

3.2.2. Multiple fringe patterns

Two or more sinusoidal fringe patterns are coded into the primary color channels of an RGB image. Wust et al. and Huang et al. presented methods that code three sinusoidal fringe patterns with a 2π/3 phase shift between them into the red, green and blue channels of a color image [41,44,70], as demonstrated in Fig. 4(a). The wrapped phase information can then be calculated by the standard three-step phase-shifting algorithm from one captured composite color fringe pattern image. Zhang et al. proposed a novel approach that codes three sinusoidal fringe patterns having the optimum three-fringe numbers into the three channels of a color image [50], as illustrated in Fig. 4(b). The projected optimum three-fringe numbers are N, N−1 and N−√N, which resolves fringe order ambiguity: the beat obtained between N and N−1 is a single fringe over the full field of view, and the reliability of the obtained fringe order is maximized because the fringe order calculation is performed

Fig. 4. Composite color fringe pattern images: (a) three sinusoidal fringe patterns with a 2π/3 phase shift between them coded into the red, green and blue channels, (b) three sinusoidal fringe patterns having the optimum fringe numbers of 42, 48 and 49 coded into the red, green and blue channels, respectively, and (c) two color channels encoded as sine and cosine fringe images, and the third channel encoded as a stair image. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

through a geometric series of beat fringes. After extracting the fringe patterns from the three color channels, three wrapped phase maps are calculated by the FT algorithm. The absolute phase, and then the 3D shape data, are obtained by applying the optimum three-fringe number selection method to the three wrapped phase maps. Meanwhile, color information is extracted from the zero spectrum of the three channels.


This approach can obtain the shape and color information of an object simultaneously from a single composite RGB fringe pattern image. Karpinsky et al. proposed a method that modulates sinusoidal and cosinusoidal fringe patterns into the red and green channels, while encoding the blue channel as a stair function for phase unwrapping [71], as shown in Fig. 4(c). The wrapped phase information is calculated from the sinusoidal and cosinusoidal fringe patterns. The stair function determines the fringe order, so that the unwrapped phase data can be correctly obtained even when the measured objects have isolated surfaces.
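The sine, cosine and stair encoding just described can be sketched for a single scan line as follows (a minimal, noise-free illustration assuming NumPy; the function names are illustrative, the channels are scaled to [0, 1], and pixels falling exactly on a period boundary would need an extra consistency check in practice):

```python
import numpy as np

def encode_rgb_line(width, num_fringes):
    """Encode a sine channel, a cosine channel and a stair (fringe
    order) channel into one RGB scan line, all scaled to [0, 1]."""
    x = np.arange(width)
    phase = 2 * np.pi * num_fringes * x / width
    red = 0.5 + 0.5 * np.sin(phase)
    green = 0.5 + 0.5 * np.cos(phase)
    blue = np.floor(phase / (2 * np.pi)) / num_fringes  # stair in [0, 1)
    return np.stack([red, green, blue], axis=-1)

def decode_rgb_line(rgb, num_fringes):
    """Recover the unwrapped phase: wrapped phase from the sine and
    cosine channels, fringe order from the stair channel."""
    s = 2 * rgb[..., 0] - 1                          # sine term
    c = 2 * rgb[..., 1] - 1                          # cosine term
    wrapped = np.mod(np.arctan2(s, c), 2 * np.pi)    # in [0, 2*pi)
    order = np.round(rgb[..., 2] * num_fringes)      # stair -> fringe order
    return wrapped + 2 * np.pi * order
```

Because each pixel carries its own sine, cosine and order values, the phase can be recovered independently per pixel, which is what allows isolated surfaces to be handled.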

4. Wrapped phase demodulation

As stated in Section 1, there are many phase calculation methods. Since one gray or color image may contain one or multiple fringe patterns, all the mentioned methods can be used to calculate the wrapped phase information from a single-shot image. Generally speaking, there is no ideal fringe analysis technique that can cover the very wide variety of practical applications. Therefore, many wrapped phase demodulation methods have been studied.

4.1. Multiple-step phase shift

The multiple-step phase-shift algorithm is a widely used technique to calculate phase information from three or more fringe patterns [18]. The algorithm needs to know the amount of phase shift, or requires equal phase shifts, among the fringe patterns [72,73]. The multiple-step phase-shift algorithm can give accurate phase information, so many commercial products adopt it for measuring static objects. For dynamic or moving objects, however, successively capturing multiple images causes large phase errors. In order to apply this algorithm in a single-shot fringe projection technique, the red, green and blue channels of a color image have been used to code three fringe patterns [41,44,70]. From one captured color image, three fringe patterns can be extracted and each regarded as a normal fringe pattern for phase calculation. The standard three-step phase-shift algorithm then calculates the wrapped phase data.
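The three-step calculation described above can be sketched as follows (assuming NumPy, phase shifts of -2π/3, 0 and +2π/3 among the three patterns, and ignoring the channel crosstalk that must be compensated in practice; the function name is illustrative):

```python
import numpy as np

def three_step_phase(i1, i2, i3):
    """Wrapped phase from three fringe patterns with phase shifts of
    -2*pi/3, 0 and +2*pi/3, i.e. I_k = A + B*cos(phi + delta_k).
    The standard identity gives
        tan(phi) = sqrt(3)*(I1 - I3) / (2*I2 - I1 - I3)."""
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
```

For the single-shot color variant, the three inputs would be the red, green and blue channels of the captured image after crosstalk compensation.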

4.2. Fourier transform

The Fourier transform (FT) algorithm is the most used phase calculation method for one fringe pattern. The principle of FT profilometry is to extract the spectrum of the first harmonic of the fringe pattern and then use the inverse FT to reconstruct the phase information, so the major problem is the separation of the first-harmonic term from the spectrum of the background illumination of the fringe pattern. However, FT is a global operation that is usually suited to the analysis of stationary signals, because it has a poor capacity for localizing signal properties. The first harmonic is often disturbed by local noise and discontinuous points. Furthermore, large residual phase errors usually appear at pixels near boundaries or discontinuities because of Gibbs effects. Describing a non-stationary signal in the frequency domain by means of a simple FT therefore yields results with a large degree of error: errors arise when the first harmonic is superposed by higher harmonics, making complete extraction of the first harmonic impossible, and the FT loses all information about the spatial localization of a given frequency component. To separate the fundamental spectrum from the other spectra, one must limit the measurable slope of the phase caused by the


height modulation of the object as [19]

|∂φ/∂x|_max < 2πf0/3        (1)

where φ represents the local phase and f0 is the fundamental frequency of the observed spatial carrier-fringe pattern. FT profilometry has been extensively studied in order to overcome this limitation [74–76]. Chen et al. introduced a method to eliminate the zero spectrum in FT profilometry by using a window function, so that the measuring range can be extended to nearly three times that of conventional FT [74]. Vander et al. presented a technique using two complementary interferograms that improves the spatial resolution of the reconstructed image by a factor of 2 compared with ordinary FT [75].

4.3. Windowed Fourier transform

The deformed fringe patterns on a measured object tend to resemble non-stationary signals, and they usually cover a wide range of frequencies. The windowed Fourier transform (WFT) is a modified version of the Fourier transform. WFT has been used for the demodulation of fringe patterns and provides a time–frequency representation with better signal localization [21–24]. The technique uses a fixed window length and therefore gives a fixed resolution at all scales; however, fringe patterns may contain rapid phase changes, and WFT is not robust enough to cope with them. Some improvements have been presented to automatically select the window size in WFT [24,77]. Zhong et al. presented a multi-scale WFT (MWFT) method based on an instantaneous frequency gradient to calculate the phase information [77]; the local stationary length of the signal automatically controls the window width. Fernandez et al. proposed an automatic window size selection method for WFT using adapted mother wavelets [24]. One obvious disadvantage of WFT is its high computation cost.

4.4. Wavelet transform

The wavelet transform (WT) has good performance in multiresolution analysis and localization in the time/space–frequency domain. Because it is well localized in both the time–space and the frequency domains, WT can represent a signal in a robust way.
Therefore, WT can be used to analyze non-stationary signals, such as the carrier-fringe patterns projected onto a complicated object surface in 3D shape measurement. Unlike FT, the basis functions of the WT are not unique. There are two approaches for obtaining the phase of a fringe pattern by WT, namely the phase estimation method and the phase gradient method. Phase estimation extracts the phase modulo 2π, which needs a phase unwrapping algorithm to remove the 2π jumps, while the phase gradient method calculates the instantaneous frequencies, which are integrated to obtain the unwrapped phase directly. Gdeisat et al. have shown that the phase estimation method outperforms the phase gradient method for both noisy and noise-free fringe patterns [29]. Zhong et al. introduced a WT method to extract the phase distribution from one fringe pattern [25]; in another paper, it was shown theoretically that the phase of the WT on the ridge is equal to the phase of the optical fringe pattern [78]. Most recently, the 2D WT has been widely studied to obtain robust phase information. Gdeisat et al. presented a 2D continuous wavelet transform (CWT) algorithm to demodulate fringe patterns, and the experimental results show that the 2D CWT has better noise immunity than the 1D CWT algorithm [28,79]. To cope with the phase determination ambiguity issue, Ma et al. presented a phase determination rule based on phase distribution continuity, followed by a frequency-guided scheme to obtain the correct phase distribution after a conventional 2D CWT analysis [80]. Due to the use of scale and translation factors, WT needs more time to process


fringe patterns. Huang et al. compared the FT, WFT and WT algorithms for extracting wrapped phase data from a single fringe pattern image [81].
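The basic FT demodulation of Section 4.2 can be sketched for a one-dimensional carrier-fringe signal as follows (a minimal illustration assuming NumPy; the band-pass width and function name are illustrative choices, and a 2D implementation with a proper window function would be needed in practice):

```python
import numpy as np

def ft_demodulate(fringe, carrier_freq):
    """Fourier-transform demodulation of a 1D carrier-fringe signal:
    subtract the mean to suppress the zero order, keep a band around
    the first harmonic at `carrier_freq` (in cycles per record),
    inverse-transform, and take the angle of the resulting analytic
    signal.  Returns the wrapped phase including the carrier term."""
    n = fringe.size
    spectrum = np.fft.fft(fringe - fringe.mean())
    freqs = np.fft.fftfreq(n, d=1.0 / n)   # in cycles per record
    # band-pass filter centred on the first harmonic (positive side only)
    mask = np.abs(freqs - carrier_freq) < carrier_freq / 2.0
    analytic = np.fft.ifft(spectrum * mask)
    return np.angle(analytic)
```

The width of the pass band is exactly where the slope limit of Eq. (1) enters: if the phase slope pushes the first harmonic outside the band, the extraction fails.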

4.5. Regularized phase tracking

Servin et al. presented a two-dimensional regularized phase-tracking (RPT) technique that is capable of demodulating a single fringe pattern with either open or closed fringes [39,40]. The RPT system outputs the detected phase continuously, so no further unwrapping of the detected phase is needed. The method is powerful for demodulating a single closed-fringe pattern. However, the main drawbacks of RPT are its sensitivity to the fringe pattern modulation and its computation time. Some modifications have been presented to improve its performance. Legarda-Saenz et al. suggested an improvement that adds a term modeling the fringe-pattern modulation [40]; with this new term, the RPT technique can be used to demodulate non-normalized fringe patterns. Later, Legarda-Saenz et al. proposed another improvement that includes a rough estimate of the fringe-pattern modulation and a linearization of the fringe-pattern model, so that the processing time is greatly reduced [82].

4.6. Spatial phase shift

The spatial phase-shifting (SPS) algorithm is a spatial-domain processing technique that directly applies the temporal phase-shifting algorithm to retrieve phase data from a single fringe pattern. Because multiple phase-shifted fringe patterns are contained in one image, the spatial resolution is low. Chan et al. presented an SPS method that subdivides a linearized fringe pattern into three component images, from which the phase is calculated by a three-step phase-shift algorithm [33]. This method requires that the resolution of the fringe image be high, that the phase corresponding to the depth vary slowly, and that the spatial carrier frequency be high. The wavelength of the spatial carrier wave also needs to be known beforehand in order to correctly subdivide a fringe pattern into three fringe pattern images. One limitation of this SPS method is that the resolution of the computed phase distribution in the direction perpendicular to the fringes is reduced by a factor of the spatial carrier wavelength; another is that the systematic errors increase with large phase gradients. Sitnik [35] applied an adaptive spatial carrier phase-shifting (ASCPS) algorithm to calculate the wrapped phase map from one sinusoidal fringe pattern: the local period of the fringes is roughly estimated by the FT technique, and multiple fringe patterns with a constant phase shift between them are then extracted from the sinusoidal fringe pattern image using the obtained local period.
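The spatial phase-shifting idea can be sketched in one dimension as follows (a minimal illustration assuming NumPy, a known carrier period divisible by three, and an object phase that varies slowly over one period; the function name is illustrative):

```python
import numpy as np

def spatial_phase_shift(fringe, period):
    """Spatial phase-shifting sketch for a 1D fringe signal with a
    known carrier period (in pixels): samples taken one third of a
    period apart act as three phase-shifted intensities, provided the
    object phase varies slowly over one period."""
    step = int(round(period / 3.0))
    i1 = fringe[:-2 * step]          # phase approx. phi - 2*pi/3
    i2 = fringe[step:-step]          # phase approx. phi
    i3 = fringe[2 * step:]           # phase approx. phi + 2*pi/3
    # standard three-step formula; result refers to the i2 sample positions
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
```

The slow-variation assumption is precisely the limitation noted above: systematic errors grow where the phase gradient is large within one carrier period.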

5. Experimental results

As an example, a single composite RGB fringe pattern image was generated and projected onto a human hand to obtain the absolute phase information. Three fringe patterns having the optimum fringe numbers of 49, 48 and 42 were coded into the blue, green and red channels of a color image, as shown in Fig. 4(b). The generated color image was projected via a DLP projector onto the human hand, and the deformed fringe patterns were simultaneously captured by a color CCD camera, as illustrated in Fig. 5. After compensating for the crosstalk and chromatic aberration between color channels [45,84], three fringe patterns were extracted from the captured color image, as shown in Fig. 6. Applying the wavelet transform (WT) algorithm to the three fringe patterns yielded three wrapped phase maps, as demonstrated in Fig. 7. Fig. 8 illustrates the unwrapped phase map obtained by using the optimum three-fringe number selection method [45,85–87].

6. Future research directions

Although a great amount of research has been done on demodulating phase data from a single-shot image, many challenging problems still need to be studied further. The main points include: fast phase data processing; measurement of complicated objects having discontinuities and/or large slopes; measurement of small objects at micro- and nano-scale; the nonlinear response of imaging and projecting devices; and chromatic aberration and crosstalk between color channels when using a color fringe pattern image.

6.1. Computation time

The transform-based algorithms need a long time to calculate phase information, so the captured fringe pattern image is post-processed in most cases. However, some applications, for example 3D on-line inspection, need the captured data to be processed in real time. Zhang et al. proposed a graphics processing unit (GPU)-assisted method to process the fringe pattern image; the GPU is used to reduce the computation time [88]. Gao et al. introduced a fast parallel WFT-based library using a GPU to process fringe pattern images, and the processing time is greatly

4.7. Local model fitting

The local model fitting (LMF) method is based on the assumption that the measured surface is locally flat [30,31]. The phase of each pixel is estimated from the pixel values in its vicinity by least-squares techniques. The method can measure objects having discontinuities; however, the measurement accuracy is degraded if the local surface is curved. Later, the authors presented an iteratively reweighted LMF method that weights the contribution of the local pixels according to the degree to which the local-flatness assumption is satisfied [32]. Most recently, a further improvement was proposed to automatically determine the shape and size of the local regions from the single fringe pattern image alone [83].
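The local model fitting idea can be sketched in one dimension as follows (a simplified illustration assuming NumPy and a known local carrier frequency; in the published LMF method the fitting is two-dimensional and more general, so this is only a sketch of the underlying least-squares principle):

```python
import numpy as np

def lmf_phase(fringe, carrier_freq, half_window=5):
    """Local-model-fitting sketch for a 1D fringe signal: inside a
    small window around each pixel, model
        I(x0 + dx) = a + p*cos(w*dx) + q*sin(w*dx),  w = 2*pi*carrier_freq,
    which is linear in (a, p, q) when the local phase is constant,
    and recover phi = atan2(-q, p) by least squares."""
    w = 2 * np.pi * carrier_freq
    dx = np.arange(-half_window, half_window + 1)
    design = np.stack(
        [np.ones_like(dx, float), np.cos(w * dx), np.sin(w * dx)], axis=1)
    pinv = np.linalg.pinv(design)          # 3 x (2*half_window + 1)
    phase = np.empty(fringe.size - 2 * half_window)
    for k in range(phase.size):
        a, p, q = pinv @ fringe[k:k + 2 * half_window + 1]
        phase[k] = np.arctan2(-q, p)       # p = B*cos(phi), q = -B*sin(phi)
    return phase
```

The "locally flat" assumption appears here as the locally constant phase inside each window; a curved local surface violates it and degrades the estimate, as noted above.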

Fig. 5. Captured composite RGB image of a human hand with the optimum fringe numbers of 42, 48 and 49 in the red, green and blue channels, respectively. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)


Fig. 6. Extracted fringe patterns from the captured composite color image: (a) red channel, (b) green channel, and (c) blue channel.

Fig. 7. Wrapped phase maps calculated by the WT algorithm: (a) red channel, (b) green channel, and (c) blue channel.

reduced [89]. Most recently, Gao et al. reviewed the parallel computing methods in optical measurement [90].
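To make the computational load of the transform-based algorithms concrete, the core of single-shot demodulation in the spirit of Fourier transform profilometry [19] can be sketched as follows. This is a minimal illustration on synthetic data; the carrier frequency, filter half-width and image size are assumptions chosen for the example, not values from the cited works.

```python
import numpy as np

def ftp_wrapped_phase(fringe, carrier_col, half_width):
    """Single-shot wrapped phase via Fourier-transform profilometry:
    keep only the +1 spectral lobe around the carrier, inverse-transform,
    and take the angle of the resulting complex fringe signal."""
    F = np.fft.fft(fringe, axis=1)                 # row-wise 1D FFT
    H = np.zeros_like(F)
    lo, hi = carrier_col - half_width, carrier_col + half_width
    H[:, lo:hi + 1] = F[:, lo:hi + 1]              # band-pass the +1 lobe
    analytic = np.fft.ifft(H, axis=1)              # complex fringe signal
    return np.angle(analytic)                      # wrapped phase in (-pi, pi]

# Synthetic single-shot fringe: a 16-cycle carrier plus a smooth phase bump.
N = 256
x = np.arange(N) / N
y = np.arange(N) / N
phi = 3.0 * np.exp(-((x[None, :] - 0.5) ** 2 + (y[:, None] - 0.5) ** 2) / 0.05)
carrier_cycles = 16
fringe = 128 + 100 * np.cos(2 * np.pi * carrier_cycles * x[None, :] + phi)

wrapped = ftp_wrapped_phase(fringe, carrier_col=carrier_cycles, half_width=8)
# Subtracting the known carrier and re-wrapping recovers phi (up to small
# filter-truncation errors near the image borders).
```

The band-pass step is also where such methods fail at large slopes or discontinuities: when the local fringe frequency is pushed outside the retained band, the recovered phase is corrupted, which is consistent with the limitations discussed in Section 6.2.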

6.2. Discontinuities

For phase calculation-based fringe projection methods, a challenging problem is to measure objects having discontinuities and/or isolated surfaces. Although the temporal phase unwrapping algorithm and its modifications can solve this problem, they need to capture multiple fringe pattern images having different frequencies. In order to measure objects having discontinuities with a single-shot acquisition, Burton et al. proposed a multichannel Fourier fringe analysis technique [91]. More than one fringe pattern is projected onto the measured object surface, and the spatial frequency bandwidths of the fringe patterns need to be completely separated in the frequency domain of the FT. Takeda et al. developed a single-shot measuring method that combines multiple sinusoids with different two-component spatial carrier frequencies into one fringe pattern to obtain the 3D shape of objects having large height discontinuities and/or surface isolations [63]. Because multiple fringe patterns are superimposed into one gray image, the modulation of each fringe pattern occupies only a small range of gray levels and the phase resolution is poor. Therefore, these methods cannot be applied in highly accurate measurement fields. One solution is to use the red, green and blue channels of a color image to code multiple fringe patterns [50]. Zhang et al. studied a single-shot method that codes three fringe patterns having the optimum three-fringe numbers into the red, green and blue channels. Three wrapped phase maps are calculated from the three color channels, and the absolute phase data are obtained by the optimum three-fringe number selection method. Because the phase of each pixel is calculated independently, this method can measure moving objects having discontinuities and/or isolated surfaces.

Fig. 8. The unwrapped phase map obtained from the single composite RGB image in Fig. 5.

6.3. Nonlinear response

CCD cameras and DLP projectors are mostly used in 3D imaging systems. The nonlinear gamma response distorts the captured fringe pattern away from an ideal sinusoidal waveform and introduces an additional phase error, decreasing the accuracy and resolution of the measurement. Therefore, the gamma nonlinearity needs to be corrected before measurement. Guo et al. proposed a gamma-correction technique that estimates the value of gamma from the normalized cumulative histogram of the fringe images [92]; their method corrects only the gamma of the projector. Pan et al. proposed an approach that suffers from inaccurate gamma calibration and had to employ an approximate, simplified first-order function [93]. Liu et al. described a detailed mathematical gamma model to determine the relationship between phase error and gamma [94]; gamma calibration is performed by computing the energy in the higher-order harmonics on a pixel-by-pixel basis. Li et al. proposed a thorough theoretical model of the gamma-distorted fringe image, derived from an optical perspective, to implement gamma correction [95]. Although these nonlinear correction methods improve the accuracy to a certain degree, further study is needed for highly accurate applications.
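The principle behind these corrections can be illustrated with a minimal sketch. This is the generic inverse-gamma idea only, not the specific method of any of the cited works, and the gamma value of 2.2 is an illustrative assumption:

```python
import numpy as np

# Ideal 8-cycle sinusoidal fringe, normalized to [0, 1], exactly periodic.
N = 512
x = np.arange(N) / N
ideal = 0.5 + 0.5 * np.cos(2 * np.pi * 8 * x)

gamma = 2.2                          # assumed projector gamma (illustrative)

# Without correction, the projector's power-law response bends the sinusoid
# and pumps energy into higher harmonics, which manifests as phase error.
distorted = ideal ** gamma
spec = np.abs(np.fft.rfft(distorted - distorted.mean()))
# spec[8] is the fundamental; spec[16] is the gamma-induced 2nd harmonic.

# Pre-compensation: project ideal ** (1/gamma), so that after the projector
# applies its gamma the emitted fringe is a true sinusoid again.
projected = (ideal ** (1.0 / gamma)) ** gamma
```

In practice gamma must first be estimated, e.g. from the fringe histogram [92] or from the harmonic energy [94]; with the pre-compensation above, the gamma-induced harmonic at twice the fringe frequency vanishes, so no additional phase error is introduced.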

6.4. Micro- and nano-objects measurement

With the development of Micro-Electro-Mechanical Systems (MEMS) components, their 3D shape measurement is becoming more and more important for fast assembly and automatic control. MEMS components may have millions of individual parts, for example the spacers in liquid crystal display (LCD) panels and the bumps on semiconductor wafers. How to measure their 3D shape quickly and accurately is a challenging problem. In 2009, Mohan and Rastogi organized a special issue, Recent Developments in Interferometry for Microsystems Metrology, in Optics and Lasers in Engineering [96]. The recent progress of micro-interferometric measurement techniques for microsystems metrology was discussed from several aspects of the miniaturization of mechanical, electro-mechanical, opto-mechanical, and micro-fluidic systems. Recently, Jang et al. presented a high-speed in-line inspection technique using two-wavelength phase-shifting interferometry to extend the measuring range up to 10 mm [97]. By capturing 10 images, the method can measure LCD spacers and/or semiconductor wafers at a speed of 30 Hz. The authors pointed out that the measuring speed could reach 500 Hz by using a high-speed CCD camera and a graphics processing unit (GPU).

6.5. Crosstalk

For most color CCD cameras and DLP projectors, the spectra of the red, green and blue channels are designed to overlap so that there are no color-blind areas. However, the boundaries between the color bands often lie at different wavelengths in CCD cameras than in DLP projectors. Hence, the information captured in the three color channels is not independent, which means the fringe patterns coded into the color channels cannot be used directly for phase calculation; they need to be compensated by software- or hardware-based methods [43–45,50]. The existing methods can reduce crosstalk to a certain extent; however, more effort is needed to remove crosstalk between color channels completely.

6.6. Chromatic aberration

Due to the optical properties of the imaging and projecting lenses, chromatic aberration between color channels is unavoidable when a color image is used. It changes the relative fringe spacing among the three color channels, so the obtained phase data are inaccurate without compensation. Zhang et al. presented a novel linear compensation method to eliminate lateral chromatic aberration in a color fringe projection system [84]. However, this method applies only to the optimum multiple-fringe selection method. In a general color fringe projection system, the effects of chromatic aberration on phase calculation have to be considered.

6.7. Colorful objects measurement

Some objects, such as artworks and antiques, have colorful surfaces. Measuring their 3D shape is in great demand in fields such as 3D games, sculpture, artwork and heritage. Even in a static state, it is difficult to measure the 3D shape of objects having colorful surfaces. Zhang et al. and Chen et al. developed methods to measure objects having colorful surfaces by using the red, green and blue channels of color images as carriers [46,50–52]. All these methods need to capture multiple color fringe pattern images. How to measure colorful objects with a single-shot acquisition remains a big challenge.

7. Conclusions

Single-shot methods based on phase calculation-based fringe projection techniques are becoming an active research direction because of the great demand from applications in 3D on-line inspection, multimedia, medical imaging, human–computer interaction (HCI) and the security industry. Single-shot methods are insensitive to vibrational noise because only one image is captured, so they can be applied to accurately measure moving objects. This paper reviews the recent progress of single-shot acquisition and of wrapped phase demodulation from a single-shot gray or color image containing one or multiple fringe patterns. The paper also points out future research directions to advance single-shot 3D measurement techniques toward more actual application fields.

One or multiple fringe patterns can be coded into not only a gray image but also a color image. The modulation of each fringe pattern occupies only a small range of gray levels when two or more fringe patterns are accumulated in a gray image. With the advent of color devices, for example color 3CCD cameras and color DLP projectors, color fringe projection methods are a promising research direction if chromatic aberration and crosstalk between color channels can be completely eliminated.

Each wrapped phase calculation technique has its own pros and cons. Although the transform-based methods can estimate phase data from one fringe pattern, the computational cost is high and the data are inaccurate at large slopes and/or discontinuities. The multiple-step phase-shift algorithm and the spatial phase-shift algorithm can give accurate wrapped phase data; however, the multiple fringe patterns obtained from a single image have low spatial resolution and/or nonsinusoidal shape, so the calculated phase data are inaccurate. For objects having complicated or special surfaces, such as discontinuities and/or large slopes, shininess and color, many big unsolved challenges remain in measuring their shape with a single-shot method.

Acknowledgments

The author would like to thank the National Natural Science Foundation of China (61171048), the Key Project of the Chinese Ministry of Education (no. 211016), the Specialized Research Fund for the Doctoral Program of Higher Education ("SRFDP") (no. 20111317120002), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, the Research Project supported by the Hebei Education Department (no. ZD2010121) and the Scientific Research Foundation for the Returned Overseas Scholars, Hebei Province.

References [1] Chen F, Brown GM, Song M. Overview of three-dimensional shape measurement using optical methods. Opt Eng 2000;39:10–22. [2] Blais F. Review of 20 years of range sensor development. J Electron Imaging 2004;13:231–40. [3] Gorthi SS, Rastogi P. Fringe projection techniques: whither we are? Opt Lasers Eng 2010;48:134–40. [4] Zhang ZH, Zhang D, Peng X, Hu XT. Performance analysis of a 3D full-field sensor based on fringe projection. Opt Lasers Eng 2004;42:341–53. [5] Su XY, Zhou WS, Bally GV, Vukicevic D. Automated phase-measuring profilometry using defocused projection of a Ronchi grating. Opt Commun 1992;94:561–73. [6] Lei S, Zhang S. Flexible 3-D shape measurement using projector defocusing. Opt Lett 2009;34:3080–2. [7] Jia PR, Kofman J, English CE. Two-step triangular-pattern phase-shifting method for three-dimensional object-shape measurement. Opt Eng 2007;46:083201. [8] Jia PR, Kofman J, English CE. Multiple-step triangular-pattern phase shifting and the influence of number of steps and pitch on measurement accuracy. Appl Opt 2007;46:3253–62. [9] Huang PS, Zhang S, Chiang FP. Trapezoidal phase-shifting method for threedimensional shape measurement. Opt Eng 2005;44:123601. [10] Chen LJ, Quan CG, Tay CJ, Fu Y. Shape measurement using one frame projected sawtooth fringe pattern. Opt Commun 2005;246:275–84. [11] Zhang S. Recent progresses on real-time 3D shape measurement using digital fringe projection techniques. Opt Lasers Eng 2010;48:149–58. [12] Salvi J, Fernandez S, Pribanic T, Llado X. A state of the art in structured light patterns for surface profilometry. Pattern Recognition 2010;43:2666–80. [13] Chen SY, Li YF, Guan Q, Xiao G. Real-time three-dimensional surface measurement by color encoded light projection. Appl Phys Lett 2006; 89:111108. [14] Chen SY, Li YF, Zhang JW. Vision processing for realtime 3-D data acquisition based on coded structured light. IEEE Trans Image Process 2008;17:167–76. [15] Wong AKC, Niu PY, He X. 
Fast acquisition of dense depth data by a new structured light scheme. Comput Vision Image Understand 2005;98:398–422. [16] Caspi D, Kiryati N, Shamir J. Range imaging with adaptive color structured light. IEEE Trans Pattern Anal 1998;20:470–80. [17] Koninckx TP, Gool LV. Real-time range acquisition by adaptive structured light. IEEE Trans Pattern Anal 2006;28:432–45. [18] Creath K. Phase measurement interferometry techniques. In: Wolf E, editor. Progress in optics, vol. XXVI. Amsterdam: North-Holland Publ.; 1988. p. 339–93. [19] Takeda M, Mutoh K. Fourier transform profilometry for the automatic measurement of 3D object shapes. Appl Opt 1983;22:3977–82. [20] Su X, Chen W. Fourier transform profilometry: a review. Opt Lasers Eng 2001;35:263–84. [21] Qian KM, Wang HX, Gao WJ. Windowed Fourier transform for fringe pattern analysis: theoretical analyses. Appl Opt 2008;47:5408–19. [22] Qian KM. Two-dimensional windowed Fourier transform for fringe pattern analysis: principles, applications and implementations. Opt Lasers Eng 2007;45:304–17. [23] Qian KM, Fu Y, Liu Q, Seah HS, Asundi A. Generalized three-dimensional windowed Fourier transform for fringe analysis. Opt Lett 2006;31:2121–3. [24] Fernandez S, Gdeisat MA, Salvi J, Burton D. Automatic window size selection in Windowed Fourier Transform for 3D reconstruction using adapted mother wavelets. Opt Commun 2011;284:2797–807. [25] Zhong JG, Weng JW. Spatial carrier-fringe pattern analysis by means of wavelet transform: wavelet transform profilometry. Appl Opt 2004;43: 4993–8. [26] Wang ZY, Ma HF. Advanced continuous wavelet transform algorithm for digital interferogram analysis and processing. Opt Eng 2006;45:045601. [27] Watkins LR. Phase recovery from fringe patterns using the continuous wavelet transform. Opt Lasers Eng 2007;45:298–303. [28] Abid AZ, Gdeisat MA, Burton D, Lalor MJ, Lilley F. Spatial fringe pattern analysis using the two-dimensional continuous wavelet transform employing a cost function. 
Appl Opt 2007;46:6120–6. [29] Gdeisat MA, Abid A, Burton DR, Lalor MJ, Lilley F, Moore C, et al. Spatial and temporal carrier fringe pattern demodulation using the one-dimensional continuous wavelet transform: recent progress, challenges, and suggested developments. Opt Lasers Eng 2009;47:1348–61. [30] Sugiyama M, Ogawa H, Kitagawa K, Suzuki K. Single-shot surface profiling by local model fitting. Appl Opt 2006;45:7999–8005. [31] Yokota T, Sugiyama M, Ogawa H, Kitagawa K, Suzuki K. Interpolated local model fitting method for accurate and fast single-shot surface profiling. Appl Opt 2009;48:3497–508. [32] Kurihara N, Sugiyama M, Ogawa H, Kitagawa K, Suzuki K. Iterativelyreweighted local model fitting method for adaptive and accurate single-shot surface profiling. Appl Opt 2010;49:4270–7. [33] Chan PH, Bryanston-Cross PJ, Parker SC. Spatial phase stepping method of fringe-pattern analysis. Opt Lasers Eng 1995;23:343–54.


[34] Xu JC, Xu Q, Peng HS. Spatial carrier phase-shifting algorithm based on leastsquares iteration. Appl Opt 2008;47:5446–53. [35] Sitnik R. Four-dimensional measurement by a single-frame structured light method. Appl Opt 2009;48:3344–54. [36] Qian KM, Soon SH. Sequential demodulation of a single fringe pattern guided by local frequencies. Opt Lett 2007;32:127–9. [37] Li K, Qian KM. Fast frequency-guided sequential demodulation of a single fringe pattern. Opt Lett 2010;35:3718–20. [38] Wang HX, Qian KM. Frequency guided methods for demodulation of a single fringe pattern. Opt Express 2009;17:15118–27. [39] Servin M, Marroquin JL, Cuevas FJ. Demodulation of a single interferogram by use of a two-dimensional regularized phase-tracking technique. Appl Opt 1997;36:4540–8. ¨ [40] Legarda-Sa´enz R, Osten W, Juptner W. Improvement of the regularized phase tracking technique for the processing of nonnormalized fringe patterns. Appl Opt 2002;41:5519–26. [41] Wust C, Capson DW. Surface profile measurement using color fringe projection. Mach Vision Appl 1991;4:193–203. [42] Pfortner A, Schwider J. Red-green-blue interferometer for the metrology of discontinuous structures. Appl Opt 2003;42:667–73. [43] Huang PS, Zhang C, Chiang FP. High-speed 3D shape measurement based on digital fringe projection. Opt Eng 2003;42:163–8. [44] Huang PS, Hu QY, Jin F, Chiang FP. Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring. Opt Eng 1999;38:1065–71. [45] Zhang ZH, Towers CE, Towers DP. Time efficient color fringe projection system for 3D shape and color using optimum 3-frequency selection. Opt Express 2006;14:6444–55. [46] Zhang ZH, Towers CE, Towers DP. Phase and color calculation in colour fringe projection. J Opt A Pure Appl Opt 2007;9:S81–6. [47] Su WH. Color-encoded fringe projection for 3D shape measurements. Opt Express 2007;15:13167–81. [48] Chen HJ, Zhang J, Lv DJ, Fang J. 
3-D shape measurement by composite pattern projection and hybrid processing. Opt Express 2007;15:12318–30. [49] Su WH. Projected fringe profilometry using the area-encoded algorithm for spatially isolated and dynamic objects. Opt Express 2008;16:2590–6. [50] Zhang ZH, Towers CE, Towers DP. Snapshot color fringe projection for absolute 3D metrology of video sequences. Appl Opt 2010;49:5947–53. [51] Zhang ZH, Towers CE, Towers DP. Robust colour and shape measurement of full colour artefacts by RGB fringe projection. Opt Eng 51, in press. [52] Chen LC, Nguyen XL, Zhang FH, Lin TY. High-speed Fourier transform profilometry for reconstructing objects having arbitrary surface colours. J Opt 2010;12:095502. [53] Zhang ZH, Ma HY, Guo T, Zhang SX, Chen JP. Simple, flexible calibration of phase calculation-based three-dimensional imaging system. Opt Lett 2011;36:1257–9. [54] Zhang ZH, Ma HY, Zhang SX, Guo T, Towers CE, Towers DP. Simple alibration of a phase-based 3D imaging system based on uneven fringe projection. Opt Lett 2011;36:627–9. [55] Vo M, Wang Z, Hoang T, Nguyen D. Flexible calibration technique for fringeprojection-based three-dimensional imaging. Opt Lett 2010;35:3192–4. [56] Huang L, Chua PSK, Asundi A. Least-squares calibration method for fringe projection profilometry considering camera lens distortion. Appl Opt 2010;49:1539–48. [57] Jia PR, Kofman J, English C. Comparison of linear and nonlinear calibration methods for phase-measuring profilometry. Opt Eng 2007;46:043601. [58] Ghiglia DC, Pritt MD. Two-dimensional phase unwrapping, theory algorithms, and software. New York: Wiley; 1998. [59] Su XY, Chen WJ. Reliability-guided phase unwrapping algorithm: a review. Opt Lasers Eng 2004;42:245–61. [60] Spagnolo GS, Paoletti D, Ambrosini D. Vibration monitoring by fiber optic fringe projection and Fourier transform analysis. Opt Commun 1997;139:17–23. [61] Su XY, Chen WJ, Zhang QC, Chao YP. Dynamic 3-D shape measurement method based on FTP. 
Opt Lasers Eng 2001;36:49–64. [62] Guan C, Hassebrook LG, Lau DL. Composite structured light pattern for threedimensional video. Opt Express 2003;11:406–17. [63] Takeda M, Gu Q, Kinoshita M, Takai H, Takahashi Y. Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations. Appl Opt 1997;36:5347–54. [64] Sansoni G, Redaelli E. A 3D vision system based on one-shot projection and phase demodulation for fast profilometry. Meas Sci Technol 2005;16:1109–18. [65] Sansoni G, Trebeschi M, Docchio F. Fast 3D profilometer based upon the projection of a single fringe pattern and absolute calibration. Meas Sci Technol 2006;17:1757–66. [66] Novak M, Millerd J, Brock N, North-Morris M, Hayes J, Wyant J. Analysis of a micropolarizer array-based simultaneous phase-shifting interferometer. Appl Opt 2005;44:6861–8. [67] Rodriguez-Zurita G, Toto-Arellano NI, Meneses-Fabian C, Va´zquez-Castillo JF. One-shot phase-shifting interferometry: five, seven, and nine interferograms. Opt Lett 2008;33:2788–90. [68] Ri S, Fujigaki M, Matui T, Morimoto Y. Accurate pixel-to-pixel correpondence adjustment in a digital micromirror device camera by using the phaseshifting moire method. Appl Opt 2007;47:6940–6.



[69] Ri S, Fujigaki M, Morimoto Y. Single-shot three-dimensional shape measurement method using a digital micromirror device camera by fringe projection. Opt Eng 2009;48:103605. [70] Pan JH, Huang PS, Chiang FP. Color phase-shifting technique for threedimensional shape measurement. Opt Eng 2006;45:013602. [71] Karpinsky N, Zhang S. Composite phase-shifting algorithm for three-dimensional shape compression. Opt Eng 2010;49:063604. [72] Carre P. Installation et utilisation du comparateur photoelectrique et interferentiel du Bureau International des Poids et Mesures. Metrologia 1966;2: 13–23. [73] Qian KM, Shu FJ, Wu XP. Determination of the best phase step of the Carre algorithm in phase shifting interferometry. Meas Sci Technol 2000;11: 1220–3. [74] Chen WJ, Su XY, Cao YP, Zhang QC, Xiang LQ. Method for eliminating zero spectrum in Fourier transform profilometry. Opt Lasers Eng 2005;43: 1267–76. [75] Vander R, Lipson SG, Leizerson I. Fourier fringe analysis with improved spatial resolution. Appl Opt 2003;42:6830–7. [76] Fu YJ, Jiang GY, Chen FY. A novel Fourier transform profilometry based on dual-frequency grating. Optik, in press, doi.org/10.1016/j.ijleo.2011.06.055. [77] Zhong JG, Zeng HP. Multiscale windowed Fourier transform for phase extraction of fringe patterns. Appl Opt 2007;46:2670–5. [78] Zhong J, Weng J. Phase retrieval of optical fringe patterns from the ridge of a wavelet transform. Opt Lett 2005;30:2560–3. [79] Gdeisat MA, Burton DR, Lalor MJ. Spatial carrier fringe pattern demodulation by use of a two-dimensional continuous wavelet transform. Appl Opt 2006; 45:8722–32. [80] Ma J, Wang ZY, Pan B, Hoang T, Vo M, Luu L. Two-dimensional continuous wavelet transform for phase determination of complex interferograms. Appl Opt 2011;50:2425–30. [81] Huang L, Qian KM, Pan B, Asundi AK. Comparison of Fourier transform,windowed Fourier transform, and wavelet transform methods for phase extraction from a single fringe pattern in fringe projection profilometry. 
Opt Lasers Eng 2010;48:141–8. [82] Legarda-Saenz R, Rivera M. Fast half-quadratic regularized phase tracking for nonnormalized fringe patterns. J Opt Soc Am A 2006;23:2724–31.

[83] Mori S, Sugiyama M, Ogawa H, Kitagawa K, Irie K. Automatic parameter optimization of the local model fitting method for single-shot surface profiling. Appl Opt 2011;50:3773–80. [84] Zhang ZH, Towers CE, Towers DP. Compensating lateral chromatic aberration of a colour fringe projection system for shape metrology. Opt Lasers Eng 2010;48:159–65. [85] Towers DP, Towers CE, Zhang ZH. Optical imaging of physical objects. International patent application number PCT/GB2007/003088. [86] Towers CE, Towers DP, Jones JDC. Optimum frequency selection in multifrequency interferometry. Opt Lett 2003;28:887–9. [87] Towers CE, Towers DP, Jones JDC. Absolute fringe order calculation using optimised multi-frequency selection in full-field porfilometry. Opt Lasers Eng 2005;43:788–800. [88] Zhang S, Royer D, Yau S-T. GPU-assisted high-resolution, real-time 3-D shape measurement. Opt Express 2006;14:9120–9. [89] Gao WJ, Huyen NT, Loi HS, Qian KM. Real-time 2D parallel windowed Fourier transform for fringe pattern analysis using Graphics Processing Unit. Opt Express 2009;17:23147–52. [90] Gao WJ, Qian KM. Parallel computing in experimental mechanics and optical measurement: a review. Opt Lasers Eng, 2012;50:608-17. [91] Burton DR, Lalor MJ. Multichannel Fourier fringe analysis as an aid to automatic phase unwrapping. Appl Opt 1994;33:2939–48. [92] Guo HW, He HT, Chen MY. Gamma correction for digital fringe projection profilometry. Appl Opt 2004;43:2906–14. [93] Pan B, Kemao LHQ, Asundi A. Phase error analysis and compensation for nonsinusoidal waveforms in phase-shifting digital fringe projection profilometry. Opt Lett 2009;34:416–8. [94] Liu K, Wang YC, Lau DL, Hao Q, Hassebrook LG. Gamma model and its analysis for phase measuring profilometry. J Opt Soc Am A 2010;27:553–62. [95] Li ZW, Li YF. Gamma-distorted fringe image modeling and accurate gamma correction for fast phase measuring profilometry. Opt Lett 2011;36:154–6. [96] Mohan NK, Rastogi PK. 
Recent developments in interferometry for microsystems metrology. Opt Lasers Eng 2009;47:199–280. [97] Jang R, Kang CS, Kim JA, Kim JW, Kim JE, Park HY. High-speed measurement of three-dimensional surface profiles up to 10 mm using two-wavelength phase-shifting interferometry utilizing an injection locking technique. Appl Opt 2011;50:1541–7.