3D-shape of objects with straight line-motion by simultaneous projection of color coded patterns

Optics Communications 414 (2018) 185–190

Jorge L. Flores a,*, Gaston A. Ayubi b, J. Matías Di Martino b, Oscar E. Castillo a, Jose A. Ferrari b

a Departamento de Electrónica, Universidad de Guadalajara, Av. Revolución 1500, C.P. 44840 Guadalajara, Mexico
b Instituto de Física, Facultad de Ingeniería (UdelaR), J. Herrera y Reissig 565, Montevideo, Uruguay

Keywords: Instrumentation; Measurement; Metrology; Fringe analysis; Phase retrieval

Abstract

In this work, we propose a novel technique to retrieve the 3D shape of dynamic objects by the simultaneous projection of a fringe pattern and a homogeneous light pattern, both coded in two of the color channels of an RGB image. The fringe pattern (red channel) is used to retrieve the phase with phase-shift algorithms with arbitrary phase steps, while the homogeneous pattern (blue channel) is used to match pixels of the test object in consecutive images acquired at different positions, and thus to determine the speed of the object. The proposed method overcomes the standard requirement of projecting fringes of two different frequencies, one to extract object information and the other to retrieve the phase. Validation experiments are presented.

© 2018 Elsevier B.V. All rights reserved.

1. Introduction

There are many 2D (and 3D) applications for optical metrology systems in industrial manufacturing [1–3]. Machine vision systems are now widely used in manufacturing and quality control, for example for sorting parts, verifying hole locations and dimensions, and checking overall shape and fit according to design requirements [2], to name just a few tasks that can be handled in two dimensions. On the other hand, with recent technological advances in digital light projection and digital imaging, 3D optical metrology based on digital fringe projection and phase-shifting (PS) methods has improved considerably. In the last decade, one of the major research challenges has been to improve the ability to work on surfaces with wide reflection variations, to automate profile reconstruction, and to perform high-speed 3D shape measurements [2,4]. One of these applications is the 3D shape measurement of an object with straight-line motion. In general, phase-shift algorithms require projecting/acquiring at least three phase-shifted fringe patterns, plus additional information to estimate the lateral displacement of the object. Several approaches have been proposed along these lines; e.g., Peng and coworkers [5,6] suggested the use of orthogonal two-frequency fringe patterns, in which the phase-shifting direction of the high-frequency fringes is along the moving direction while the low-frequency fringes are perpendicular to it, or vice versa [6]. But the main

idea is the same: one kind of pattern is used for phase retrieval, and the other is used to estimate the lateral displacement of the moving object by pixel-matching methods. There are other proposals to estimate the lateral displacement, e.g., using markers for pixel matching [7] or other strategies [1,8]. For phase retrieval from fringe projection, one must acquire a set of fringe patterns, which are described by

I_k(x, y) = a(x, y) + b(x, y) cos[2πf x + φ(x, y) + δ_k],  (1)

where (x, y) are Cartesian coordinates, a(x, y) is the background illumination, b(x, y) is the amplitude modulation, f is the spatial frequency of the fringes on the reference plane, φ(x, y) is a phase related to the profile of the measured object, and δ_k is the phase shift. In 3D profiling of static objects the phase step δ is usually known a priori, but when the test object is moving one often has to estimate this quantity a posteriori. In general, for N-step phase-shifting algorithms the phase can be retrieved from N intensity patterns as the arctangent of the ratio between two linear combinations of the values I_k(x, y), for example [9–11]

φ(x, y) = arctan{ Σ_{k=1}^{N} I_k(x, y) sin δ_k / Σ_{k=1}^{N} I_k(x, y) cos δ_k }.  (2)

Expression (2) presupposes that the phase steps δ_k are known and evenly spaced in the interval [0, 2π), e.g., δ_k = (k − 1) 2π/N. In fact, Hoang

* Corresponding author.

E-mail address: [email protected] (J.L. Flores).
https://doi.org/10.1016/j.optcom.2017.12.087
Received 18 September 2017; Received in revised form 27 December 2017; Accepted 30 December 2017; Available online 6 February 2018.
0030-4018/© 2018 Elsevier B.V. All rights reserved.

Fig. 1. Experimental setup.

into the red (R) and blue (B) channels of a color image, respectively. This color image is then projected by an LCD or DLP projector onto the surface of the moving object under measurement, while an image sequence is simultaneously acquired by a color camera. Two sets of gray-level images of the moving object are then extracted from the acquired sequence (recovered from the red and blue channels). The detected red channel contains the modulated fringe patterns, which are used to estimate the depth of the object under test by means of PS algorithms. The blue channel consists of a series of photographs of the object as it moves along the x-direction, which are used for tracking the object frame by frame. A detailed description of both the phase-recovery process and the object tracking is illustrated in Fig. 2 and developed below. In the experimental arrangement the camera is always static, so the first problem is to find the position of the object in each frame. The target for the correlation filter is a photograph of the object (blue-channel information), given by a window centered on the object and extracted from the first frame. In the remaining photographs, the target (i.e., the test object) is tracked during its movement by applying a correlation filter over a search window in the subsequent frames; the location of the maximum value in the correlation output indicates the new position of the target in a given frame. The object position is thus updated, and a new search is performed around that location in the following frame. To create an optimal tracker, the target and the remaining frames are filtered to enhance features that characterize the shape of the object; this enhancement is obtained with a Sobel filter for edge detection. The correlation for tracking the target in an image sequence is given by

et al. [12] suggested that Eq. (2) can be used in a (p + 2)-step uniform phase-shifting scheme to retrieve the phase accurately from fringe patterns with nonlinear harmonics up to the p-th order. In dynamic 3D profiling, however, this requirement is often difficult to meet exactly because the effective phase steps are determined by the velocity of the linear travel stage, which is not necessarily uniform; even if N deformed patterns are captured at a constant rate, the phase steps will not be evenly spaced. On the other hand, the algorithms used to determine the phase steps from the experimentally obtained intensity patterns present some errors (see, e.g., [13,14]). Thus, the actual (measured) phase steps will neither coincide with their nominal values for a given N-sample algorithm nor be evenly spaced. Considering that the phase-shift values are not evenly spaced and/or present small errors, Eq. (2) is not the best method for phase retrieval, although it has been proposed in [5,7]. Something similar happens with the procedure proposed in [1,6,8] to retrieve the phase, which is based on the well-known five-sample Stoilov algorithm [15]. It has the peculiarity of being a tunable algorithm, i.e., it does not require knowledge of the phase steps, but it is highly sensitive to phase-shift errors. To eliminate this inconvenience, Yang Li et al. [16] proposed a three-dimensional online measurement method based on a five-step algorithm with unequal phase shifts. However, we found that this algorithm presents some inconsistencies, which will be described below. In this work, we propose a novel method for 3D profiling of moving objects by the simultaneous projection of a fringe pattern and a homogeneous light pattern, encoded in two of the color channels of an RGB image.
The fringe pattern (red channel) is used to retrieve the phase, while the homogeneous pattern (blue channel) is used to match pixels of the test object in consecutive images. Since one of the problems is that the intensity fringe patterns may not be equally spaced, we propose the use of a PS algorithm with N arbitrarily spaced phase steps for phase retrieval [10,11]. In particular, we propose the generalized phase-shifting algorithm for N steps (N-sample GPSA) described in [11], which has shown good minimization of the phase noise caused by phase-shift miscalibration and Gaussian noise.
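For uniform, known steps, the estimator of Eq. (2) reduces to a few lines of code. The sketch below is ours, not the paper's implementation; note it assumes the sign convention I_k = a + b cos(φ − δ_k), so with the +δ_k convention of Eq. (1) the recovered phase changes sign.

```python
import numpy as np

def phase_from_steps(frames, deltas):
    """Eq. (2): wrapped phase from N phase-shifted fringe patterns.
    Assumes known, evenly spaced steps and the convention
    I_k = a + b*cos(phi - delta_k)."""
    frames = np.asarray(frames, dtype=float)   # shape (N, H, W)
    deltas = np.asarray(deltas, dtype=float)   # shape (N,)
    num = np.tensordot(np.sin(deltas), frames, axes=1)  # sum_k I_k sin(d_k)
    den = np.tensordot(np.cos(deltas), frames, axes=1)  # sum_k I_k cos(d_k)
    return np.arctan2(num, den)                # wrapped to (-pi, pi]
```

For evenly spaced steps the background term a cancels exactly, which is why the formula needs at least three frames.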

g(x, y) = I_{B,k}(x, y) ⊗ h(x, y),  (3)

where ⊗ denotes correlation, and h(x, y) and I_{B,k}(x, y) are, respectively, the correlation filter associated with the target and the filtered k-th frame of a given sequence, extracted from the B channel. The correlation operation can be implemented in the Fourier domain [17]. From the correlation output g(x, y) one can determine the position of the object in the k-th frame, i.e., the object shift δ. For phase retrieval, a static sinusoidal fringe pattern is projected onto the test object, which ideally moves along the x-direction with a given constant velocity v. Thus, the camera acquires a stationary signal modulated by the height of the object. Therefore, the intensity in the R channel of the N color images, acquired as a sequence (video), can be expressed as

2. Description of the method

The setup for online 3D-shape measurement of moving objects is shown in Fig. 1. It consists of a projector, used to project an RGB software-generated fringe pattern, and a high-resolution (CCD) camera, employed to capture a video in which the fringe pattern is modulated by the object's surface. The test object lies on a platform moving along the x-axis with approximately constant velocity. Moreover, the platform can be replaced by a conveyor belt; therefore, the proposed method can be used in industrial applications. First, one sinusoidal fringe pattern (with fringes along the x-direction and arbitrary origin) and a uniform background (gray level 255) are encoded

I_k(x, y) = a(x, y) + b(x, y) cos[2πf x + φ(x − kδ, y)],  (4)

where φ(x − kδ, y) is the phase change related to the profile of the test object, δ is the object shift for a given frame, and k = 1, 2, …, N. The displacement of the object between consecutive frames is given by δ = vΔt, where Δt is the time between consecutive frames, i.e., the inverse of the frame rate f_s. Under this assumption, we can build a set


Fig. 2. Schematic diagram for phase retrieval, where the input color image is separated into the red channel, used for shape retrieval, and the blue channel, used by the correlation and tracking algorithms. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
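As a rough illustration of the tracking stage of Fig. 2, the sketch below combines Sobel edge enhancement with a Fourier-domain cross-correlation in the spirit of Eq. (3). It is a minimal, assumption-laden version (plain circular correlation of zero-padded, mean-removed edge maps), not the specific composite-filter design of Ref. [17].

```python
import numpy as np
from scipy import ndimage

def edge_enhance(img):
    # Sobel gradient magnitude: target and frames are edge-filtered
    # before correlating, as the paper does for shape characterization
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    return np.hypot(gx, gy)

def track(template, frame):
    """Locate the object template inside a frame by circular FFT
    cross-correlation; returns the (row, col) of the correlation peak."""
    t = edge_enhance(template)
    t = t - t.mean()                      # remove DC so flat areas don't bias
    f = edge_enhance(frame)
    tp = np.zeros_like(f)                 # zero-pad template to frame size
    tp[: t.shape[0], : t.shape[1]] = t
    g = np.real(np.fft.ifft2(np.fft.fft2(f) * np.conj(np.fft.fft2(tp))))
    return np.unravel_index(int(np.argmax(g)), g.shape)
```

The returned peak location gives the top-left corner of the best match; updating the search window around it frame by frame gives the tracker described above.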

of phase-shifted sinusoidal patterns taken from the video frames and, by segmentation, extract a window centered at a specific point of the object previously determined by the tracking algorithm. This operation of segmentation and extraction of a window centered on the moving object is equivalent to a change of coordinates x → x + kδ, i.e., a translation of the origin of coordinates in Eq. (4):

pattern with a sinusoidal intensity profile and a pitch of 32 pixels; in the blue (B) channel we encoded a uniform background, i.e., a gray level of 255; and the green (G) channel was set to zero to avoid cross-talk problems. The image was then projected by a commercial DLP projector (model PJD7820, ViewSonic) with 1920 × 1280 pixels, and an RGB video was acquired at a rate of 4 frames/s using an 8-bit single-CCD camera (model DCU224C, Thorlabs) with 1280 × 1024 pixels, under a viewing angle of θ ≈ 16°, see Fig. 1. In the experiment the test object moves at a velocity of 0.7 cm/s. From the video we extracted N frames, and each one was split into two sequences of images: one with the fringe patterns (red channel) and the other with images of the object under test (blue channel). The latter set of photographs is used to estimate the lateral displacement of the object using the pixel-matching method described previously. Once the object position in each image has been located, the next step is the determination of a window to isolate the modulated region from the rest of the scene (fringe pattern). Fig. 3(a) shows one frame extracted from the acquired video, where the R channel is the sinusoidal fringe pattern modulated by the object under test (in gray levels) and the B channel corresponds to the image of the object (gray levels), see Fig. 3(b) and (d), respectively. Fig. 3(c) shows the image in the green channel, in which one can see a low-contrast fringe pattern due to the cross-talk intrinsic to the CCD camera. In Fig. 4 we show five fringe patterns extracted from the acquired video (see movie1). In these images the object is located at the same position, while the fringes of consecutive patterns are phase-shifted. As we do not know a priori the actual phase-shift values between consecutive frames, we can use the algorithm proposed in [13] or [14] to estimate them. From the acquired video we extracted at least 15 frames, like those shown in Fig. 4.
The segmentation of each image from the video took 0.11 s using Matlab on a CPU (AMD A10-5750M @ 2.5 GHz and 12 GB RAM). The processing speed of the tracking and segmentation procedures could be improved considerably by using technologies such as the compute unified device architecture (CUDA) for the graphics processing unit (GPU), which allows parallel computing and drastically reduces the computational cost. Then, from a set of fringe patterns and using the Guo and Zhang algorithm [14], we estimated the phase steps, whose values are δ_k = [0, 1.280, 2.553, 4.032, …]. It can be seen that the phase steps

I_k(x + kδ, y) = a(x + kδ, y) + b(x + kδ, y) cos[2πf x + 2πf kδ + φ(x, y)],  (5)

so that φ(x, y) is now fixed for all the extracted frames and the fringe patterns are shifted by kδ, i.e., I_k(x + kδ, y). Thus, consecutive frames are shifted by multiples of δ; in other words, we have extracted a set of (equivalent) phase-shifted deformed patterns. Assuming that a(x, y) and b(x, y) are spatially uniform across the image, we can rewrite Eq. (5) as

I_k(x + kδ, y) = a(x, y) + b(x, y) cos[2πf x + φ(x, y) + 2πf kδ],  (6)

where (x, y) lies in a window centered on the test object and f δ = vΔt/T, where T is the pitch of the fringe pattern projected on the reference plane. According to Eq. (6), for phase retrieval one can use any phase-shift (PS) algorithm for N evenly spaced steps, e.g., Stoilov's algorithm [15] as proposed in [1,6,8]. For that, we would need accurate control of the motion of the linear travel stage and a previous calibration between the acquisition time and the velocity of the travel platform in order to obtain a uniform phase shift. Otherwise, the patterns captured by the CCD will not be equally spaced, and in the presence of such a phase-shift miscalibration a PS algorithm with uniform phase steps may fail. To overcome not only the problems due to phase-shift miscalibration but also to reduce the effects of noise, we propose the use of the generalized algorithm (GPSA) with N arbitrarily spaced phase steps described in [11].

3. Experimental results

To verify the effectiveness of the proposed method, the shape of a semi-sphere with 99.669 mm diameter and 54.61 mm height, mounted on a reference plate, is measured. For the experiments we generated one color image: in the red (R) channel we encoded a fringe


Fig. 5. Reconstructed 3D-shape profile of the plaster mask in shades of gray: (a)–(c) from 5, 6, and 13 fringe patterns using the N-sample GPSA described in Appendix A, respectively; (d) from 5 patterns using Eq. (18) of Ref. [10].

Fig. 3. (a) A color frame extracted from the video; (b)–(d) gray-level images corresponding to the R, G and B channels: (b) the deformed fringe pattern, (c) a low-contrast fringe pattern due to the cross-talk between the G and B channels, and (d) a photograph of the object.
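With the experimental values quoted above (v = 0.7 cm/s, 4 frames/s), the per-frame object shift δ = vΔt of Eq. (4) and the equivalent phase step 2πf δ of Eq. (6) follow directly. The pitch T on the reference plane is not stated in the paper, so the value below is purely hypothetical for illustration.

```python
import math

v = 0.7    # object speed, cm/s (from the experiment)
fs = 4.0   # acquisition rate, frames/s (from the experiment)
T = 0.5    # fringe pitch on the reference plane, cm -- HYPOTHETICAL value

delta = v / fs                            # per-frame object shift, delta = v*dt
phase_step = 2.0 * math.pi * delta / T    # equivalent phase step 2*pi*f*delta, f = 1/T
```

With these numbers δ = 0.175 cm per frame; whether the resulting phase step is usable depends entirely on the actual pitch on the reference plane.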

In order to perform a comparison, we implemented the five-phase-step algorithm described by Eq. (18) in [10]. Fig. 5(d) depicts the result given by this five-phase-step algorithm. Note: the version by Y. Li et al. of the five-sample algorithm with arbitrary phase steps contains a typographic error and has also been erroneously deduced; for this reason we did not compare our algorithm with that proposed by Li [16]. Fig. 6 shows the cross sections of the height distributions depicted in Fig. 5(a)–(d). For the phase-to-height conversion we used the method proposed in [19]. In Fig. 6 one can clearly see the difference between the five-sample algorithm, which is robust only to phase-shift miscalibration (cyan line), and the N-sample GPSA [11]: blue line for N = 13, green line for N = 6, and even the case N = 5 (red line). The black line corresponds to the best fit to the height distribution retrieved with 13 fringe patterns. By inspection of Fig. 5(d) we observe that the maximum errors occur in the regions near the edges of the sphere, where the fringe patterns have lower contrast (see Fig. 4) and therefore a lower signal-to-noise ratio. The phase errors are visually less evident in Fig. 5(a). In order to quantify the height distribution, Fig. 7 depicts a horizontal cut of the height error, calculated with respect to the best fit (black line in Fig. 6). We observe in Fig. 7 that the minimum error corresponds to the case of N = 13 frames. In static 3D reconstruction by fringe-pattern projection, phase retrieval can be a very simple and accurate procedure using the phase-shift algorithms described by Eq. (2) with a large number of frames; even the effects of harmonics on the retrieved phase can be reduced as the number of phase shifts increases, see Ref. [12].
In the case of dynamic 3D reconstruction of a moving object, under the assumption that the camera and projector introduce no trapezoidal distortion, the PS algorithms present higher accuracy when the object moves only a short distance; more specifically, when the N arbitrary frames of deformed patterns are acquired by the CCD while the measured object moves within one period of the sinusoidal pattern, around the center of the field of view. Similarly, using GPSAs with a large number of frames it is also possible to reduce the effect of harmonics and noise on the retrieved phase. If the acquired fringe patterns present trapezoidal distortion, image-correction processing must be applied, for example the approach proposed by Yuan et al. [20].
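For reference, a common first-order phase-to-height conversion in fringe projection is h ≈ Δφ·T/(2π tan θ). This is the generic textbook triangulation approximation, not the calibration-based method of Ref. [19] that is actually used above; it is only valid for small heights and a distant projector/camera.

```python
import math

def height_from_phase(dphi, pitch, theta_deg):
    """Generic first-order phase-to-height conversion,
    h = dphi * T / (2*pi*tan(theta)).
    NOTE: a textbook approximation, NOT the calibration-based
    conversion of Ref. [19] used in the paper."""
    return dphi * pitch / (2.0 * math.pi * math.tan(math.radians(theta_deg)))
```

For example, a full 2π of phase change with a 1 cm pitch and a 45° triangulation angle maps to 1 cm of height under this approximation.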

Fig. 4. Five patterns extracted from the video.
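The phase steps above were estimated with the difference-based method of [14]. A much-simplified, variance-based estimate in the same spirit can be sketched as follows; the closed form is ours, valid only under the stated assumptions, and is not the actual algorithm of [13] or [14].

```python
import numpy as np

def estimate_step(i1, i2):
    """Crude phase-step estimate between two fringe patterns.
    Assumes uniform a and b and many fringe periods, so that
    var(I1 - I2) = 2 b^2 sin^2(step/2) and var(I1) = b^2 / 2.
    Only steps in [0, pi] can be recovered this way."""
    r = np.var(i1 - i2) / (4.0 * np.var(i1))
    return 2.0 * np.arcsin(np.sqrt(np.clip(r, 0.0, 1.0)))
```

Real estimators such as [13,14] are built to cope with nonuniform background and modulation, which this sketch deliberately ignores.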

are not evenly spaced in the interval [0, 2π); thus these phase shifts do not correspond to any value 2π/N. Using Eqs. (A.1)–(A.3) we implemented the N-sample GPSA [11] as a wrapping-free algorithm, according to [18], for N = 5, 6 and 13; the first two with an average phase shift of 1.0607 rad and the third with an average phase shift of 0.46 rad. One advantage of the wrapping-free implementation is that it does not require a reference plane. Applying this algorithm to 5, 6 and 13 consecutive fringe patterns, one obtains the three-dimensional shape profiles shown in Fig. 5(a)–(c), respectively.
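A generic least-squares GPSA for arbitrary known steps can be sketched as below. It solves the same per-pixel model as (A.1)–(A.3), but via a plain least-squares fit; it does not use the noise-optimized coefficients λ_1–λ_6 of Ref. [11], nor the wrapping-free formulation of [18].

```python
import numpy as np

def gpsa_lstsq(frames, deltas):
    """Least-squares generalized phase-shifting for arbitrary known steps.
    Per pixel, fits I_k = a + p*cos(d_k) - q*sin(d_k), which follows from
    I_k = a + b*cos(phi + d_k) with p = b*cos(phi), q = b*sin(phi);
    the phase is then atan2(q, p)."""
    frames = np.asarray(frames, dtype=float)   # shape (N, H, W)
    d = np.asarray(deltas, dtype=float)        # shape (N,)
    A = np.stack([np.ones_like(d), np.cos(d), -np.sin(d)], axis=1)  # (N, 3)
    n, h, w = frames.shape
    sol, *_ = np.linalg.lstsq(A, frames.reshape(n, -1), rcond=None)
    _, p, q = sol                              # rows: a, p, q per pixel
    return np.arctan2(q, p).reshape(h, w)
```

Three or more frames with distinct steps make the 3-parameter fit well posed; noise-optimized coefficient choices such as those of [11] improve on this baseline when the steps are miscalibrated.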


Acknowledgments

The authors thank the financial support PF.068.17 from PEDECIBA (Uruguay) and the financial support 122 from Comisión Sectorial de Investigación Científica (CSIC, UdelaR, Uruguay).

Appendix A

In general, the phase-recovery algorithms published in the literature can be written as a quotient of linear combinations of the I_m, i.e.,

tan(φ) = Σ_{m=1}^{M} b_m I_m / Σ_{m=1}^{M} a_m I_m,  (A.1)

Fig. 6. Horizontal cut of the height profile: red, green, blue and cyan lines correspond to the 3D reconstructions shown in Fig. 5(a)–(d), respectively, while the black line corresponds to the ideal sphere. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

with given (real) coefficients a_m and b_m. Expression (A.1) presupposes that the phase steps are known. Under some considerations, which are described in [11], the coefficients a_m and b_m appearing in (A.1) can be determined from the estimated phase steps as

a_m = λ_1 + λ_3 cos δ_m − λ_4 sin δ_m + λ_5 cos δ_m + λ_6 sin δ_m  (A.2)

and

b_m = λ_2 + λ_3 sin δ_m − λ_4 cos δ_m − λ_5 sin δ_m + λ_6 cos δ_m,  (A.3)

where λ_1, λ_2, …, λ_6 can be calculated following the procedure described in [11], which can be summarized as follows: for a given set of δ_m, one calculates the coefficients K_1–K_4 from Eqs. (41)–(44) in Ref. [11], and then, using these coefficients in Eqs. (35)–(39), one obtains λ_1–λ_6. Thus, we can calculate the optimal coefficients a_m and b_m from (A.2) and (A.3), which are used to retrieve the phase according to Eq. (A.1).

Appendix B. Supplementary data

Supplementary material related to this article can be found online at https://doi.org/10.1016/j.optcom.2017.12.087.

References

[1] X. Xu, Y. Cao, C. Chen, Y. Wan, On-line phase measuring profilometry based on phase matching, Opt. Quantum Electron. 48 (8) (2016) 411. [2] K. Harding, 3D profilometry: next requests from the industrial view point, in: Proc. SPIE, vol. 7855, 2010, p. 785513. [3] S.S. Gorthi, P. Rastogi, Fringe projection techniques: whither we are?, Opt. Laser. Eng. 48 (2010) 133–140. [4] K. Zhong, Z. Li, X. Zhou, Y. Li, Y. Shi, C. Wang, Enhanced phase measurement profilometry for industrial 3D inspection automation, Int. J. Adv. Manufactur. Technol. 76 (9–12) (2015) 1563–1574. [5] K. Peng, Y. Cao, Y. Wu, M. Lu, A new method using orthogonal two-frequency grating in online 3D measurement, Opt. Laser Technol. 83 (2016) 81–88. [6] K. Peng, Y. Cao, Y. Wu, C. Chen, Y. Wan, A dual-frequency online PMP method with phase-shifting parallel to moving direction of measured object, Opt. Commun. 383 (2017) 491–499. [7] C. Chen, Y.P. Cao, L.J. Zhong, K. Peng, An on-line phase measuring profilometry for objects moving with straight-line motion, Opt. Commun. 336 (2015) 301–305. [8] K. Peng, Y. Cao, Y. Wu, Y. Xiao, A new pixel matching method using the modulation of shadow areas in online 3D measurement, Opt. Lasers Eng. 51 (9) (2013) 1078– 1084. [9] Y. Surrel, Design of algorithms for phase measurements by the use of phase stepping, Appl. Opt. 35 (1) (1996) 51–60. [10] G.A. Ayubi, C.D. Perciante, J.L. Flores, J.M. Di Martino, J.A. Ferrari, Generation of phase-shifting algorithms with n arbitrarily spaced phase-steps, Appl. Opt. 53 (30) (2014) 7168–7176. [11] G.A. Ayubi, C.D. Perciante, J.M. Di Martino, J.L. Flores, J.A. Ferrari, Generalized phase-shifting algorithms: error analysis and minimization of noise propagation, Appl. Opt. 55 (6) (2016) 1461–1469 (see Erratum: 55(28) 7763 (2016)). [12] T. Hoang, B. Pan, D. Nguyen, Z. Wang, Generic gamma correction for accuracy enhancement in fringe-projection profilometry, Opt. Lett. 35 (12) (2010) 1992– 1994. [13] T. Li, D. 
Hong, The vector projection of normalized interferogram differences and its application in phase-shift estimation, J. Modern Opt. 63 (8) (2016) 804–808. [14] H. Guo, Z. Zhang, Phase shift estimation from variances of fringe pattern differences, Appl. Opt. 52 (2013) 6572–6578.

4. Conclusion

We proposed a novel technique to retrieve the 3D shape of dynamic objects by the simultaneous projection of a fringe pattern and a homogeneous light pattern, both coded in two of the color channels of an RGB image. This approach allows an automated profile reconstruction without adding extra marks on the conveyor system or on the sample. We have shown that the use of the N-sample GPSA in applications with objects in motion overcomes the need for precise control of the velocity of the linear travel stage, and makes unnecessary a calibration between the acquisition time and the velocity of the travel platform to obtain a uniform phase shift. In other words, we showed that the phase can be retrieved without any previous calibration between a given commercial projector/camera system and the travel platform; we only require a sequence of fringe patterns extracted from a video. Considering the simplicity of the proposed method, it is potentially useful for dynamic measurements and real-time applications. We can even implement the N-sample GPSA as a wrapping-free algorithm, as described in [18], which would significantly reduce the computational cost (in comparison with the cost of using an unwrapping algorithm).

[18] C.D. Perciante, M. Strojnik, G. Paez, J.M. Di Martino, G.A. Ayubi, J.L. Flores, J.A. Ferrari, Wrapping-free phase retrieval with applications to interferometry, 3D-shape profiling, and deflectometry, Appl. Opt. 54 (10) (2015) 3018–3023.
[19] H. Yuan, Y.P. Cao, C. Chen, Y.P. Wang, Online phase measuring profilometry for rectilinear moving object by image correction, Opt. Eng. 54 (11) (2015) 113104.
[20] P. Jia, J. Kofman, C. English, Comparison of linear and nonlinear calibration methods for phase-measuring profilometry, Opt. Eng. 46 (4) (2017) 043601.

[15] G. Stoilov, T. Dragostinov, Phase-stepping interferometry: five-frame algorithm with an arbitrary step, Opt. Lasers Eng. 28 (1) (1997) 61–69.
[16] Y. Li, Y.P. Cao, Z.F. Huang, D.L. Chen, S.P. Shi, A three dimensional on-line measurement method based on five unequal steps phase shifting, Opt. Commun. 285 (21) (2012) 4285–4289.
[17] R.E. Guerrero-Moreno, J. Álvarez Borrego, Nonlinear composite filter performance, Opt. Eng. 48 (6) (2009) 067201.
