Fringe projection profilometry based on complementary color-encoded fringe patterns


Feipeng Da*, Luyang Wang, Luyao Hu
School of Automation, Southeast University, Nanjing 210096, China

Article history: Received 23 December 2011; received in revised form 16 April 2012; accepted 20 April 2012; available online 8 May 2012.

Abstract

3-D measurement techniques based on color-encoded fringe pattern projection have recently been widely used in many fields of engineering. One problem is that the surface color of the measured object may interfere with the color of the projected fringe pattern. To solve this problem, a novel method based on complementary color-encoded fringe projection is proposed. Two color-encoded fringe patterns whose fringe colors are complementary are designed. In the first pattern, a sinusoidal fringe is embedded into the green color channel and is used to evaluate phases by the Fourier transform method. The fringe colors of the captured image are established with the help of the complementary color fringe pattern, exploiting the property of color complementation. Thus, the influence of the object's surface color can be eliminated, and decoding errors can be further reduced. Experimental results indicate that the proposed method is valid and can be applied to colorful measured objects.

Keywords: Optical inspection; Phase unwrapping; Color coding

1. Introduction

3-D shape measurement techniques have been widely researched and applied in many fields of science and engineering [1]. The existing methods can be divided into two main kinds. One is based on stereo vision, which needs a pair of images obtained from two cameras viewing from different angles. Matching the corresponding pairs of pixels in the two captured images plays an important part in this method; however, matching is difficult in practice and not suited to real-time measurement [2]. The other kind is based on structured light; known as active measurement techniques, these methods feature noncontact detection, fast speed, high accuracy, and extensive applications [3–5]. Among the early profilometry techniques, gray-scale patterns [6] were projected onto the object, so the gray intensity was the only available information. Combined with the phase-shifting method [7] or Fourier profilometry [8–11], a sequence of temporally coded gray patterns is projected to provide the information for phase unwrapping [12]. Gai and Da [13] proposed a phase-shifting method based on a stripe mark embedded into the projected phase-shifting fringes to provide the stripe order for phase unwrapping. However, such methods limit the measurement speed even though they ensure a high-accuracy result.

* Corresponding author. Tel.: +86 138 015 85 093; fax: +86 258 379 3000. E-mail address: [email protected] (F. Da).

http://dx.doi.org/10.1016/j.optlastec.2012.04.028

Compared with the above methods, color-encoded fringe projection profilometry has the advantage that the color images carry sufficient information, so phase unwrapping can be performed according to the color-encoded fringe patterns [14]. In [15], a temporal encoding method for color fringe patterns is proposed, but it requires several patterns. Su [16] presents an approach based on a color-encoded fringe pattern with an embedded sinusoidal fringe pattern to reconstruct the 3-D surface shape of an object from only a single frame, which helps improve the measurement speed. Koninckx and Van Gool [17] propose a "self-adaptive" system for real-time range acquisition that solves the correspondence problem between camera and projector using a combination of geometric coding, color coding, and tracking over time. Sitnik [18] presents a four-dimensional, single-frame shape measurement system that employs a fast Fourier transform and a new phase-unwrapping routine based on spatiotemporal information. Whether the measurement is single-frame or not, when color-encoded fringe projection profilometry is applied to non-white objects, the surface color of the tested object may interfere with the fringe color and affect the result of surface reconstruction. To address this problem, Zhang et al. [19] measure colorful objects by projecting separate red, green, and blue fringe patterns onto the tested objects; however, this requires three shots, and the results may be affected by color imbalance. Pan et al. [20] present a color-encoded binary fringe projection technique that uses a combination of the three primary colors to code the light patterns for 3-D shape measurement. In [21], a method


using a composite red, green, and blue (RGB) projected fringe pattern is proposed, together with an optimum multi-wavelength process.

In this paper, a novel method for the shape measurement of colorful objects based on complementary color-encoded fringe projection is proposed. Two color-encoded fringe patterns whose fringe colors are complementary are designed. Sinusoidal fringes are encoded into the first color-encoded fringe pattern, and the primary phases are evaluated by the Fourier transform method. Phase unwrapping is carried out using the color coding information, which is obtained with the help of the complementary color fringe pattern, so the effect of the surface color of the objects can be largely reduced. Because of the one-shot characteristic of the FT method, combining it with the color coding algorithm requires only one additional color pattern, enabling measurement with fast speed and low computation cost while improving system robustness.


2. Measurement principles

Fig. 1 shows the optical geometry of the measurement system. The system consists of a DLP projector (Hcp-610x), a digital camera (MegaPlus II ES4020) with a resolution of 2048 × 2048, a computer with a Matrox Solios image card, and the tested object. The measurement follows the procedure given in Fig. 2: project the complementary color-encoded fringes onto the object; capture the distorted fringe patterns; separate the sinusoidal fringes from the color-encoded sinusoidal fringe pattern and obtain the primary phase by the Fourier transform method; perform color segmentation of the color-encoded sinusoidal fringe pattern and of the complementary color-encoded fringe pattern; determine the fringe colors and borders of the color-encoded sinusoidal fringe pattern; decode the color-encoded information to obtain the fringe order; carry out phase unwrapping; and perform 3-D reconstruction. When the absolute phase is obtained, the surface heights of the tested object can be calculated [22].

Fig. 2. Flow diagram of 3-D measurement based on complementary color-encoded fringe projection. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

3. Complementary color-encoded fringe patterns

The complementary color-encoded fringe patterns consist of one color-encoded sinusoidal fringe pattern and one complementary color fringe pattern; the generated fringe patterns are presented in Fig. 3. Fig. 3(d) shows the color-encoded sinusoidal fringe pattern, generated by embedding the sinusoidal fringe of Fig. 3(c) into the green color channel of the color-encoded fringe pattern of Fig. 3(a). The sinusoidal fringe pattern is embedded into the green color channel because the camera's response in that channel is better than in the other two [23]; that is, the sinusoidal fringes can be easily restored from the green color channel. Fig. 3(b) shows the complementary color fringe pattern.

Fig. 3. Schematic diagram of complementary color-encoded fringe patterns. (a) Color-encoded fringe pattern; (b) complementary color fringe pattern; (c) sinusoidal fringe pattern; (d) color-encoded sinusoidal fringe pattern. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

The color-encoded fringe pattern is coded according to the gray code principle, i.e., the code words of any two adjacent fringes differ in only one code digit. Every four adjacent fringes form a group, and each sequence of fringe colors within a group appears only once in the whole fringe pattern. We choose white (red = 255, green = 255, blue = 255), yellow (red = 255, green = 255, blue = 0), green (red = 0, green = 255, blue = 0), and cyan (red = 0, green = 255, blue = 255) as the fringe colors, and describe them by the code words (1 1 1), (1 1 0), (0 1 0), and (0 1 1), respectively. Fig. 3(e) and (f) give the intensity distributions of the red and blue color channels of Fig. 3(a), respectively. Using Eq. (1), the sinusoidal fringe pattern is embedded into the green color channel, generating Fig. 3(d). The complementary color fringe pattern is then obtained by setting each fringe color to the complement of one of the four colors above; that is, the values of the red color channel for each pixel are established according to Eq. (2).

Fig. 1. Optical geometry of the measurement system based on fringe pattern projection.

$f(x) = 255\sin(\pi x / w)$   (1)

where f(x) represents the intensity of the green color channel embedded with the sinusoidal fringes, w is the width of each fringe, and x ranges from 0 to w.

$R_2(x,y) = \lvert 255 - R_1(x,y) \rvert = \begin{cases} 0, & R_1(x,y) = 255 \\ 255, & R_1(x,y) = 0 \end{cases}$   (2)

where R1(x,y) is the value of the red color channel at pixel (x,y) in the projected color-encoded sinusoidal pattern, and R2(x,y) is the value of the red color channel at pixel (x,y) in the projected complementary color pattern. The green and blue color channels follow the same rule as Eq. (2). In total, 32 color fringes are encoded in each of the two patterns. The color coding information contained in Fig. 3(a) serves as the reference for phase unwrapping, and the complementary fringe pattern helps to eliminate the color disturbance caused by the surface color of the tested object itself. Each fringe color in Fig. 3(a) is assigned a digital number, so a series of digital numbers denotes the different fringe colors, as listed in Table 1.

Table 1
Digital numbers assigned to the corresponding fringe color.

Fringe color      white   yellow   cyan   green
Digital number    1       2        3      4

Since the color combination of every four fringes in one group is unique, the fringe order can be found by decoding the sequence of digital numbers in the group. For example, the color sequence "yellow, white, green, cyan" is represented by the digital sequence "2143", and these fringes are the 3rd, 4th, 5th, and 6th fringes, respectively.
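To make the pattern construction concrete, the following sketch generates the two projected patterns from a given fringe-color sequence according to Eqs. (1) and (2). It is a minimal illustration, not the authors' implementation: the 32-fringe gray-coded sequence of Fig. 3(a) is abbreviated to a short placeholder, and all function and variable names are our own.

```python
import numpy as np

# The four fringe colors and their (R, G, B) values, following Table 1.
COLORS = {
    "white":  (255, 255, 255),
    "yellow": (255, 255, 0),
    "cyan":   (0, 255, 255),
    "green":  (0, 255, 0),
}

def make_patterns(color_sequence, fringe_width, height):
    """Build the color-encoded sinusoidal pattern and its complement."""
    width = len(color_sequence) * fringe_width

    # Eq. (1): one half-period of a sine per fringe, for the green channel.
    x = np.arange(width)
    green = 255.0 * np.sin(np.pi * (x % fringe_width) / fringe_width)

    pattern = np.zeros((height, width, 3), np.uint8)
    complement = np.zeros_like(pattern)
    for k, name in enumerate(color_sequence):
        r, g, b = COLORS[name]
        sl = slice(k * fringe_width, (k + 1) * fringe_width)
        pattern[:, sl, 0] = r
        pattern[:, sl, 2] = b
        # Eq. (2): the complementary pattern flips 0 <-> 255 per channel.
        complement[:, sl, 0] = 255 - r
        complement[:, sl, 1] = 255 - g   # always 0, since g = 255
        complement[:, sl, 2] = 255 - b
    # Embed the sinusoidal fringe into the green channel (every fringe
    # color has green = 255, so the sine replaces a constant 255).
    pattern[:, :, 1] = green.astype(np.uint8)
    return pattern, complement

# Illustrative 8-fringe excerpt; the paper uses 32 gray-coded fringes.
seq = ["white", "yellow", "green", "cyan", "yellow", "white", "green", "cyan"]
proj, proj_c = make_patterns(seq, fringe_width=32, height=1024)
```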

4. Phase calculation

The color-encoded sinusoidal fringe pattern and the complementary color fringe pattern are projected onto the tested object in sequence. Once the patterns are captured by the camera, two main steps, primary phase retrieval and phase unwrapping, are carried out during image processing to retrieve the absolute phases; the height of the tested object can then be calculated.

4.1. Primary phase extraction

Since the sinusoidal fringe pattern of Fig. 3(c) is embedded into the green color channel of the color-encoded sinusoidal fringe pattern, the Fourier transform method is employed to obtain the phase of the distorted sinusoidal fringes. The green color channel of the color-encoded sinusoidal fringe pattern projected onto the tested object surface can be represented as

$g(x,y) = a(x,y) + b(x,y)\cos[2\pi f_0 x + \phi(x,y)]$   (3)

where (x,y) are the image coordinates, a(x,y) is the background intensity, b(x,y) is the modulation amplitude, f0 is the frequency of the projected fringes, and φ(x,y) is the distorted phase of the sinusoidal fringes. The one-dimensional Fourier transform of Eq. (3) is

$G(f,y) = A(f,y) + Q(f - f_0, y) + Q^{*}(f + f_0, y)$   (4)

where G(f,y) and A(f,y) are the Fourier transforms of g(x,y) and a(x,y), respectively, Q(f − f0, y) is the Fourier transform of (1/2)b(x,y)exp[iφ(x,y)], and * denotes the conjugate operation. Since the geometric relationships among the projector, the CCD camera, and the objects are stable, the fringe frequency of the captured images can be obtained in the calibration procedure; this frequency approximates the fundamental frequency. The band-pass filter is centered at the predetermined fundamental frequency and extends to both sides of it. The inverse Fourier transform is then used to retrieve the phase distribution of the distorted sinusoidal fringes. Because of the arctangent used in the computation, the obtained phase is wrapped into the range −π to π; it must be unwrapped to calculate the surface heights of the object.

It has to be pointed out that the background and the noise may affect the fundamental spectrum during filtering, decreasing the accuracy of the spectrum extraction. To avoid this, the derivative of the phase must satisfy

$\left| \dfrac{\partial \phi}{\partial x} \right| < \dfrac{2\pi f_0}{3}$   (5)

From Fig. 1, the relationship between the phase and the height of the object can be described as

$h(x,y) = \dfrac{l\,\Delta\phi(x,y)}{\Delta\phi(x,y) - 2\pi f_0 d}$   (6)

where Δφ(x,y) is the difference between φ(x,y) and the phase distribution of the reference plane, which can be obtained before the measurement. Since l ≫ h(x,y), the corresponding condition on the height of the object is

$\left| \dfrac{\partial h}{\partial x} \right| < \dfrac{l}{3d}$   (7)

That is to say, when the FT method is applied, the measured object should satisfy the above conditions to ensure good measurement performance.

4.2. Phase unwrapping

In the proposed method, phase unwrapping is carried out with reference to the color coding information, so determining the fringe color, which stands for the fringe order, is a critical step. We segment the captured patterns, establish the color of each pixel in the captured images, and identify the fringe order by decoding. The segmentation is performed by transforming the red, green, and blue channels into binary intensities using the iterated threshold algorithm [24]. Since the surface color of the tested object may interfere with the fringe color, the color complement feature of the two projected patterns is employed to reduce this interference. Strictly speaking, because of non-uniform illumination, surface reflectivity, and surface color, the threshold differs both from pixel to pixel and from channel to channel; a single threshold for color segmentation is therefore not sufficient, while a multi-threshold algorithm is complex and costly. To solve this problem, complementary color-encoded fringe patterns are projected, and the fringe color of the captured color-encoded sinusoidal fringe pattern is determined by comparing the red, green, and blue values of every pair of corresponding pixels in the two segmented patterns. It is assumed that when fringe patterns of different colors are projected onto an object, the ordering of the red, green, and blue values of the two colors in the captured patterns is the same as in the projected patterns. Accordingly, the fringe color of the captured color-encoded sinusoidal fringe pattern is determined by Eq. (8), applied to each color channel:

$R'_1(x',y') = \begin{cases} 0, & R'_1(x',y') < R'_2(x',y') \\ 255, & R'_1(x',y') > R'_2(x',y') \end{cases}$   (8)

where R′1(x′,y′) and R′2(x′,y′) are the corresponding values of the red color channel at pixel (x′,y′) in the two captured patterns. The values of the green and blue color channels are determined following the same rule as Eq. (8).
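As an illustration of the primary-phase extraction of Section 4.1, the following sketch applies the one-dimensional FT method to the green channel row by row. It assumes the fundamental frequency f0 (in cycles per image width) is known from calibration; the function name, the `half_width` band-pass parameter, and the explicit carrier removal are our own choices, not taken from the paper.

```python
import numpy as np

def primary_phase(green_channel, f0, half_width):
    """Wrapped phase phi(x, y) from the green channel by the 1-D FT
    method. `f0` is the fundamental fringe frequency in cycles per
    image width (f0 > half_width assumed, so only the positive lobe
    is selected)."""
    rows, cols = green_channel.shape
    G = np.fft.fft(green_channel.astype(float), axis=1)  # Eq. (4), per row
    freqs = np.fft.fftfreq(cols, d=1.0 / cols)           # cycles per width

    # Band-pass filter centred on f0, keeping Q(f - f0, y) while
    # suppressing the background A(f, y) and the conjugate lobe.
    mask = (np.abs(freqs - f0) <= half_width).astype(float)
    q = np.fft.ifft(G * mask[None, :], axis=1)

    # q is (1/2) b(x,y) exp{i[2*pi*f0*x/cols + phi(x,y)]}; removing the
    # carrier and taking the angle yields the wrapped phase in (-pi, pi].
    # (Equivalently, the carrier cancels when the reference-plane phase
    # is subtracted to form delta-phi.)
    x = np.arange(cols)
    carrier = np.exp(-2j * np.pi * f0 * x / cols)
    return np.angle(q * carrier[None, :])
```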


One additional problem should be considered: in areas with high or low reflectivity, taking the red color channel as an example, R′1(x′,y′) = R′2(x′,y′) may occur. This situation is handled by modifying Eq. (8) into Eq. (9); that is, when R′1(x′,y′) = R′2(x′,y′), R′1(x′,y′) is assigned 255:

$R'_1(x',y') = \begin{cases} 0, & R'_1(x',y') < R'_2(x',y') \\ 255, & R'_1(x',y') \ge R'_2(x',y') \end{cases}$   (9)

The reason is that in the projected color-encoded sinusoidal fringe pattern the green color channel is sinusoidal, so the intensity values of pixels around the fringe edges are so low that they fall below the segmentation threshold, and the colors of these pixels may be misjudged as 0. Since the green-channel values of the fringe colors in the designed pattern are actually 255, the misjudged values can be corrected to 255 by Eq. (9). Once the color of each pixel has been accurately established in the final segmented color-encoded sinusoidal fringe pattern, the code words are established as well. The fringe order can then be found from the code words and used at the phase unwrapping stage.
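A sketch of this complementary color determination follows, under the assumption that both captured patterns have already been binarised channel-wise to {0, 255} by the iterated threshold algorithm; the function name is illustrative.

```python
import numpy as np

def determine_fringe_colors(seg1, seg2):
    """Resolve the fringe colors of the segmented color-encoded
    sinusoidal pattern (`seg1`) using the segmented complementary
    pattern (`seg2`), per Eq. (9). Both inputs are H x W x 3 arrays
    with values in {0, 255}."""
    # Eq. (9), channel-wise: 0 where seg1 < seg2, otherwise 255.
    # The >= branch also repairs pixels near fringe edges whose
    # sinusoidal green values fell below the segmentation threshold.
    return np.where(seg1 < seg2, 0, 255).astype(np.uint8)
```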

Phase unwrapping is then carried out with reference to the color-encoded fringes, whose fringe order has been distinguished in the previous step. The phase is unwrapped using

$\Phi(x,y) = \phi(x,y) + 2\pi(n-1)$   (10)

where n is the fringe order number and Φ(x,y) is the absolute phase.
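The decoding and unwrapping steps can be sketched as follows. The digit assignments follow Table 1 and the "2143" example of Section 3; the group-to-order lookup table must be built from the full 32-fringe sequence of Fig. 3(a), which is not listed in the text, so only the worked example entry is shown. All names are our own.

```python
import numpy as np

# Code words (R, G, B as 0/1) -> digital numbers, following Table 1.
DIGIT = {(1, 1, 1): 1, (1, 1, 0): 2, (0, 1, 1): 3, (0, 1, 0): 4}

# Each unique four-digit window identifies the order of its first
# fringe; e.g. "yellow, white, green, cyan" -> (2, 1, 4, 3) -> 3rd
# fringe. A single illustrative entry is given here.
GROUP_TO_ORDER = {(2, 1, 4, 3): 3}

def order_from_window(code_words):
    """Fringe order of the first fringe in a four-fringe window,
    given its code words as (R, G, B) bit-triples."""
    return GROUP_TO_ORDER[tuple(DIGIT[c] for c in code_words)]

def unwrap(phase, fringe_order):
    """Eq. (10): absolute phase from the wrapped phase and the
    per-pixel fringe order n (n = 1 for the first fringe)."""
    return phase + 2.0 * np.pi * (fringe_order - 1)
```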


5. Experiments and results

To verify the validity of the proposed method, experiments are carried out on two different objects with colorful surfaces, a carton and a model, using the measurement system described in Section 2. The objects to be measured are shown in Fig. 4. Both objects have colorful surfaces, and neither has large discontinuities of surface height that would affect the performance of the FT method.

Fig. 4. Model and the carton.

5.1. Calibration

Fig. 5 shows the coordinate systems of the measurement system. Ow and Oc represent the reference coordinate system and the camera coordinate system, OP is the center point of the projection lens, and Oc is the optical center. o1mn represents the image plane coordinate system. The plane OXY and the projection plane are parallel, the Y axis is parallel to the fringe pattern, and the Z axis passes through OP. The system model contains two parts: the relationship θ ↔ (Xc, Yc, Zc) and the relationship (m,n) ↔ (Xc, Yc, Zc), where θ is the phase distribution of the fringe pattern, (m,n) are the coordinates in the fringe pattern, and (Xc, Yc, Zc) is the corresponding 3D coordinate. The relationship θ ↔ (Xc, Yc, Zc) can be described as

$\theta = \dfrac{a_1 X_c + a_2 Y_c + a_3 Z_c + a_4}{a_5 X_c + a_6 Y_c + a_7 Z_c + a_8}$   (11)

where a1–a8 are the parameters of the system; they can be determined from a sufficient set of phase values and corresponding 3D coordinates. The relationship (m,n) ↔ (Xc, Yc, Zc) can be described as

$r \begin{pmatrix} m \\ n \\ 1 \end{pmatrix} = A_c \begin{pmatrix} X_c \\ Y_c \\ Z_c \end{pmatrix}$   (12)

where r is a scale factor and Ac is the camera parameter matrix

$A_c = \begin{pmatrix} f_c/\mu_m & -f_c\cot\theta/\mu_m & m_0 \\ 0 & f_c/(\mu_n\sin\theta) & n_0 \\ 0 & 0 & 1 \end{pmatrix}$   (13)

The parameters in Ac are undetermined coefficients of the camera; they can be determined by the camera calibration process [22]. Using Eqs. (11) and (12), the unwrapped phase map can be transformed into the corresponding 3D coordinate information of the object.

Fig. 5. Coordinate systems of the measurement system.
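Combining Eqs. (11) and (12) gives a closed-form back-projection: Eq. (12) constrains the 3D point to the camera ray r·Ac⁻¹(m, n, 1)ᵀ, and substituting this into Eq. (11) yields a linear equation in the scale factor r. The following sketch assumes the calibrated parameters a1–a8 and the camera matrix Ac are available; names are illustrative.

```python
import numpy as np

def reconstruct_point(m, n, theta, a, Ac_inv):
    """3D point from pixel (m, n) and its absolute phase `theta`,
    combining Eqs. (11) and (12). `a` is the array of calibrated
    system parameters a1..a8; `Ac_inv` is the inverse of the camera
    matrix Ac from the calibration procedure [22]."""
    # Eq. (12): (Xc, Yc, Zc) = r * v, with v = Ac^{-1} (m, n, 1)^T.
    v = Ac_inv @ np.array([m, n, 1.0])
    # Substituting into Eq. (11):
    #   theta * (a5..a7 . r*v + a8) = a1..a3 . r*v + a4
    # and solving for r:
    num = a[3] - theta * a[7]                    # a4 - theta * a8
    den = theta * (a[4:7] @ v) - (a[0:3] @ v)    # theta*(a5..a7.v) - a1..a3.v
    r = num / den
    return r * v
```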


5.2. Measurement

The color-encoded sinusoidal fringe pattern and the complementary color fringe pattern are projected onto the two tested objects in sequence. The captured fringe patterns are shown in Fig. 6, and their color segmentation results are displayed in Fig. 7.

Fig. 6. Images captured by CCD. (a) When primary color-encoded sinusoidal fringe is projected onto the carton; (b) when complementary color fringe is projected onto the carton; (c) when primary color-encoded sinusoidal fringe is projected onto the model; (d) when complementary color fringe is projected onto the model. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

Fig. 7. Color segmentation results of the captured images. (a) Result of primary color-encoded sinusoidal fringe of carton; (b) result of complementary color fringe of carton; (c) result of primary color-encoded sinusoidal fringe of model; (d) result of complementary color fringe of model. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)


Fig. 8. Improved color segmentation results of primary color-encoded sinusoidal fringe. (a) Result of carton with the assistance of complementary color fringe; (b) final result of carton after noise filter; (c) result of model with the assistance of complementary color fringe; (d) final result of model after noise filter. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

Fig. 9. Absolute phase distribution map and the reconstruction result. (a) Absolute phase map of carton; (b) absolute phase map of model; (c) reconstruction result of carton; (d) reconstruction result of model.


Fig. 10. Experimental results for the same two objects using the reference method proposed by W. H. Su. (a) and (b) are the observed images when the color-encoded fringe pattern is projected onto the carton and the model, respectively; (c) and (d) are the color segmentation results of the two objects using the same segmentation algorithm. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

Fig. 7(a) and (c) show the segmentation results of the captured color-encoded sinusoidal fringe patterns using the iterated threshold algorithm. The colors of the fringes covering the object surfaces are clearly misjudged because of the surface color of the tested objects. Moreover, since the green color channel records the sinusoidal fringes, the intensities of pixels near the fringe edges are very low and fall below the threshold, so the colors of these pixels are misjudged as well. On the other hand, the segmentation results of the complementary color fringe pattern, presented in Fig. 7(b) and (d), show that fringe-color misjudgment also occurs in the areas covering the tested objects, but the fringe edges are clear. The fringe color of the captured color-encoded sinusoidal fringe pattern is then further determined by comparing the colors of every pair of corresponding pixels in the two segmented color fringe patterns in the red, green, and blue channels according to Eqs. (8) and (9) given in Section 4.2, and the misjudged colors can be almost entirely corrected. The improved segmentation results are shown in Fig. 8(a) and (c). It can be clearly seen that most of the areas with misjudged colors have been eliminated, and the remaining parts can be further removed with a suitable filter. The final segmentation results without color misjudgment are presented in Fig. 8(b) and (d); the fringe order can then be recognized for phase unwrapping. Fig. 9 shows the absolute phase distribution maps and the 3-D point clouds of the reconstructed object surfaces, where the absolute phase values have been normalized to the range 0 to 1.

For comparison, the same objects are measured using the typical method proposed in [16], which is based on color fringe projection and is also combined with the FT method. The experimental results are illustrated in Fig. 10. From Fig. 10(c) and (d) it can be seen that the surface color of the measured objects greatly interferes with the fringe color, so the phase unwrapping process based on the color coding information cannot be carried out. It has to be pointed out that if most of the object surface has high or low reflectivity, the fringe color becomes indiscernible with the proposed method; this is in fact a common problem for most existing methods based on color fringe pattern projection. The experimental results demonstrate that the proposed method is feasible for measuring colorful objects.

6. Conclusions

The proposed method for the shape measurement of colorful objects focuses on solving the problem of color interference caused by the surface color of the tested objects. The approach can be applied to colorful objects at the cost of projecting one additional complementary color fringe pattern, which helps to determine the fringe color. The fringe order can be easily identified from the color-encoded fringes, and for the sinusoidal fringe pattern embedded into the color-encoded fringe pattern, the primary phase can be obtained quickly and precisely. Compared with conventional methods based on color-encoded fringe projection, decoding errors can be largely reduced. Experimental results indicate that colorful objects can be reconstructed by the proposed method with high robustness.

Acknowledgments

This research is supported by the National Natural Science Foundation of China (Nos. 60775025 and 51175081) and the Natural Science Foundation of Jiangsu Province (No. BK2010058).


References

[1] Mohan NK, Rastogi PK. Recent developments in interferometry for microsystems metrology. Optics and Lasers in Engineering 2009;47:199–202.
[2] Zhang S. Recent progress on real-time 3D shape measurement using digital fringe projection technique. Optics and Lasers in Engineering 2010;48:149–58.
[3] Su WH. Projected fringe profilometry using the area-encoded algorithm for spatially isolated and dynamic objects. Optics Express 2008;16:2590–6.
[4] Saldner HO, Huntley JM. Temporal phase unwrapping: application to surface profiling of discontinuous objects. Applied Optics 1997;36:2770–5.
[5] Saldner HO, Huntley JM. Temporal phase-unwrapping algorithm for automated interferogram analysis. Applied Optics 1993;32:3047–52.
[6] Sansoni G, Corini S, Lazzari S, Rodella R, Docchio F. Three-dimensional imaging based on Gray-code light projection: characterization of the measuring algorithm and development of a measuring system for industrial applications. Applied Optics 1997;36:4463.
[7] Zhang S, Yau ST. High-resolution, real-time 3-D absolute coordinate measurement based on a phase-shifting method. Optics Express 2006;14:2644–9.
[8] Takeda M, Mutoh K. Fourier transform profilometry for the automatic measurement of 3-D object shapes. Applied Optics 1983;22:3977–82.
[9] Su XY, Lian X. Phase unwrapping algorithm based on fringe frequency analysis in Fourier-transform profilometry. Optical Engineering 2001;40(4):637–43.
[10] Guo H, Huang PS. Absolute phase technique for the Fourier transform method. Optical Engineering 2009;48(4):043609.
[11] Takeda M, Gu Q, Kinoshita M, Takai H, Takahashi Y. Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations. Applied Optics 1997;36:5347–54.
[12] Sansoni G, Carocci M, Rodella R. 3-D vision based on the combination of Gray code and phase shift light projection. Applied Optics 1999;38:6565–73.
[13] Gai SY, Da FP. A novel phase-shifting method based on stripe mark. Optics and Lasers in Engineering 2010;48:205–11.
[14] Dong L, Tian JD. High-resolution dynamic three-dimensional profilometry based on a combination of stereovision and color-encoded digital fringe projection. Proceedings of SPIE 2010;7855:78550J.
[15] Chen HJ, Zhang J, Fang J. Surface height retrieval based on fringe shifting of color-encoded structured light pattern. Optics Letters 2008;33:1801–3.
[16] Su WH. Color-encoded fringe projection for 3-D shape measurements. Optics Express 2007;15:13167–81.
[17] Koninckx TP, Van Gool L. Real-time range acquisition by adaptive structured light. IEEE Transactions on Pattern Analysis and Machine Intelligence 2006;28(3):432–45.
[18] Sitnik R. Four-dimensional measurement by a single-frame structured light method. Applied Optics 2009;48:3344–54.
[19] Zhang ZH, Towers CE, Towers DP. Shape and colour measurement of colorful objects by fringe projection. Proceedings of SPIE 2008;7063:70630N.
[20] Pan JH, Huang PS, Chiang FP. Color-encoded binary fringe projection technique for 3-D shape measurement. Optical Engineering 2005;2:020363.
[21] Zhang ZH, Towers CE, Towers DP. Snapshot color fringe projection for absolute three-dimensional metrology of video sequences. Applied Optics 2010;49(31):5947–53.
[22] Da FP, Gai SY. Flexible three-dimensional measurement technique based on a digital light processing projector. Applied Optics 2008;47:377–85.
[23] Caspi D, Kiryati N, Shamir J. Range imaging with adaptive color structured light. IEEE Transactions on Pattern Analysis and Machine Intelligence 1998;20(5):470–80.
[24] Jain AK, Yu B. Text location in image and video frames. Pattern Recognition 1998;31(12):2055–76.