Optics and Laser Technology 125 (2020) 106063


High-accuracy high-speed unconstrained fringe projection profilometry of 3D measurement

Shichao Yang, Gaoxu Wu, Yanxue Wu, Jin Yan, Huifang Luo, Yinnan Zhang, Fei Liu




State Key Laboratory of Mechanical Transmission, Chongqing University, Chongqing, China
College of Mechanical Engineering, Chongqing University, Chongqing, China

Highlights

• We achieve high speed, high accuracy and unconstrained measurement simultaneously in 3D measurement.
• In the proposed method, high-frequency fringe patterns (70 periods) and fewer patterns (only four) are used.
• It works well in measuring complex and isolated objects without any pre-work or constraints.

Keywords: 3D measurement; Fringe projection profilometry; One unwrapping pattern

Abstract

Fringe projection profilometry (FPP) has been extensively used to measure three-dimensional (3D) shape in recent years. Generally, high-frequency fringe patterns ensure high accuracy, while fewer patterns mean higher measuring speed in FPP. However, simultaneously achieving high frequency and few patterns without any constraints is still a big challenge, because several additional unwrapping patterns are needed to unwrap the high-frequency patterns. We propose a novel high-accuracy high-speed unconstrained method that obtains the 3D shape by projecting only four fringe patterns. The fringe patterns in the proposed method contain 70 periods, and this high frequency ensures high measuring accuracy. Besides, only one additional pattern is used to unwrap the high-frequency patterns, which ensures high measuring speed. The method calculates the absolute phase pixel by pixel, independently and without any pre-work or constraints, so it is suitable for complex and isolated objects. The experimental results verify the high-speed and high-accuracy performance of our method.

1. Introduction

3D shape measurement has been extensively applied in the fields of medical engineering, computer vision, biotechnology and industrial production [1–5]. Fringe projection profilometry (FPP) is one of the most widely used and publicly recognized 3D shape measurement techniques [6–12]. An FPP system mainly consists of a digital projector, a camera and a computer. The projector projects several periodic fringe patterns onto the object, and the camera captures the deformed patterns. With the geometric relations between the projector and the camera, the 3D coordinates of the object can be obtained [13,14]. Generally, high-frequency fringe patterns ensure high accuracy, while a small number of patterns means high measuring speed in FPP. However, there is a contradiction when one wants to achieve high-speed (few patterns) and high-accuracy (high fringe frequency) measurement simultaneously. If high-frequency patterns are selected, several additional unwrapping patterns are indispensable to unwrap the high-frequency patterns, which increases the number of patterns.

If one chooses fewer patterns, only low-frequency patterns can be applied successfully, which cannot ensure high accuracy. Therefore, simultaneously achieving high-accuracy, high-speed and unconstrained measurement still faces great challenges in 3D shape measurement. In general, FPP takes at least three high-frequency patterns to acquire a high-accuracy wrapped phase [15–21]. To unwrap the high-frequency fringe patterns, multiple additional fringe patterns are needed in the temporal phase unwrapping approach, which decreases the measuring speed [22–25]. There are usually two ways to improve the measuring speed. In the first, a multi-camera system is used to obtain precise fringe orders with a geometry constraint [26]. However, it is hard to set a proper measurement volume and an accurate depth constraint that keep the choice of the correct correspondence robust. The second is to decrease the number of additional unwrapping patterns in a single-camera system [22,27].

Corresponding author at: Room 7520, Teaching Building 7, Chongqing University, Chongqing 400044, China. E-mail address: [email protected] (F. Liu).

https://doi.org/10.1016/j.optlastec.2020.106063 Received 27 August 2019; Received in revised form 26 November 2019; Accepted 8 January 2020 0030-3992/ © 2020 Elsevier Ltd. All rights reserved.


Zheng et al. presented a Gray-code-based phase unwrapping method by projecting at least five additional patterns [28]. In the traditional Gray-code method, ⌈log₂ f⌉ additional unwrapping patterns are needed to differentiate f periods, where ⌈·⌉ is the ceiling function [29]. Their method used ⌈log₃ f⌉ patterns with three gray levels (white, gray, black), which reduced the number of patterns. Zhang et al. proposed a method with three additional fringe patterns that unwraps the phase by directly projecting the fringe-order patterns onto the object [30]. However, it was hard to reconstruct a high-accuracy 3D shape because the fringe patterns contained only 10 periods, a relatively low frequency. Ma et al. proposed a morphology-based phase unwrapping method with one additional fringe pattern [31]. Although this method decreased the number of patterns, its 36 fringe periods were not enough to ensure high accuracy. An et al. developed a pixel-wise absolute phase unwrapping method using geometric constraints, which avoids projecting additional patterns [32]. However, strict geometric constraints often fail for objects with a large depth range; only 19% of the camera's sensing range could be used, which is not applicable in most situations. To simultaneously achieve high frequency and few patterns in FPP, we propose a novel high-accuracy high-speed unconstrained method to acquire the 3D shape. Our method projects only four patterns in total: three general high-frequency patterns and one additional unwrapping pattern. The wrapped phase and the fringe order are acquired pixel by pixel without any constraints. Because only one additional unwrapping pattern is projected to unwrap the phase, the proposed method greatly improves the measuring speed.

2. Principle of the method

We designed four patterns to realize the high-accuracy, high-speed, unconstrained method. The first three patterns are conventional high-frequency patterns that ensure high accuracy, while the fourth pattern unwraps the high-frequency patterns and increases the measuring speed. The four patterns are shown in Fig. 1 and are expressed as follows:

I1(x, y) = A + B sin[ϕ(x, y) − 2π/3],   (1)

I2(x, y) = A + B sin[ϕ(x, y)],   (2)

I3(x, y) = A + B sin[ϕ(x, y) + 2π/3],   (3)

I4(x, y) = A + B sin[ϕ(x, y) + t(k)],   (4)

where In(x, y) is the intensity at point (x, y) in the nth pattern (n = 1, 2, 3, 4). I1(x, y), I2(x, y) and I3(x, y) are the conventional three-step phase-shifting high-frequency patterns, and I4(x, y) is used to unwrap the phase. A and B are the average intensity and the modulation intensity, respectively, and ϕ(x, y) is the absolute phase. In the proposed method, t(k) is coded along with the order of the fringe periods and is written as

t(k) = 2π k(x, y) / (N + 1),   (5)

where N is the number of fringe periods and k(x, y) is the fringe order ranging from 1 to N, given by k(x, y) = ceil(x / h), with h the number of pixels per fringe period and ceil(·) the round-up function. Because t(k) in Eq. (4) is different for each fringe order k(x, y), it can be used to unwrap the wrapped phase. The principle of the proposed method is shown in Fig. 2: I1, I2 and I3 are the conventional three-step phase-shifting high-frequency patterns used to acquire the wrapped phase; with the fourth pattern I4, the fringe order is obtained; finally, the absolute phase is retrieved from the wrapped phase and the fringe order.

Fig. 1. Four projected patterns. (a) Three conventional high-frequency patterns. (b) One unwrapping pattern.
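As an illustration of the coding scheme in Eqs. (1)–(5), the sketch below generates the four patterns for a 70-period layout. It is a minimal sketch rather than the authors' code; the pattern size and the intensity values A = B = 127.5 are illustrative assumptions.

```python
import numpy as np

def generate_patterns(width=1024, height=768, n_periods=70, A=127.5, B=127.5):
    """Minimal sketch of Eqs. (1)-(5): three phase-shifted patterns plus one coding pattern."""
    h = width / n_periods                               # pixels per fringe period
    x = np.arange(width)
    k = np.clip(np.ceil((x + 1) / h), 1, n_periods)     # fringe order k(x, y) in 1..N (0-based x)
    phi = 2 * np.pi * x / h                             # absolute phase along one row
    t_k = 2 * np.pi * k / (n_periods + 1)               # Eq. (5)

    I1 = A + B * np.sin(phi - 2 * np.pi / 3)            # Eq. (1)
    I2 = A + B * np.sin(phi)                            # Eq. (2)
    I3 = A + B * np.sin(phi + 2 * np.pi / 3)            # Eq. (3)
    I4 = A + B * np.sin(phi + t_k)                      # Eq. (4)

    # Repeat the 1-D profiles over all rows to form the full patterns.
    return [np.tile(row, (height, 1)) for row in (I1, I2, I3, I4)]
```

Note that I4 differs from I2 only by the per-period offset t(k), which is what later allows the fringe order to be identified pixel by pixel.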

2.1. Three-step phase-shifting algorithm for acquiring the wrapped phase

The wrapped phase φ(x, y) can be calculated from I1(x, y), I2(x, y) and I3(x, y) as follows:

φ(x, y) = tan⁻¹{ √3 [I3(x, y) − I1(x, y)] / [2 I2(x, y) − I1(x, y) − I3(x, y)] },   (6)

where the wrapped phase φ(x, y) ranges from 0 to 2π within one period, as shown in Fig. 2(c). The absolute phase shown in Fig. 2(e) is calculated from the wrapped phase and the fringe order as follows:

ϕ(x, y) = φ(x, y) + 2π × k(x, y),   (7)

where ϕ(x, y) is the absolute phase and k(x, y) is the fringe order, ranging from 1 to N.
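A compact sketch of Eqs. (6) and (7) is given below. Eq. (6) is written with tan⁻¹; this sketch uses the two-argument arctangent and wraps the result into [0, 2π), which is an implementation choice of ours rather than a detail prescribed by the paper.

```python
import numpy as np

def wrapped_phase(I1, I2, I3):
    """Three-step phase shifting, Eq. (6), with the result mapped to [0, 2π)."""
    return np.mod(np.arctan2(np.sqrt(3.0) * (I3 - I1), 2.0 * I2 - I1 - I3), 2 * np.pi)

def absolute_phase(phi_wrapped, k):
    """Eq. (7): absolute phase from the wrapped phase and the fringe order k(x, y)."""
    return phi_wrapped + 2 * np.pi * k
```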


Fig. 2. Principle of the method. (a and b) One cross section of the four patterns. (c) One cross section of the wrapped phase, retrieved from the three-step phase-shifting patterns (I1, I2, I3). (d) One cross section of the fringe order, retrieved from the four patterns. (e) One cross section of the absolute phase, retrieved from the wrapped phase and the fringe order.

2.2. One fringe pattern for decoding the fringe orders

The proposed method uses only one additional pattern, I4(x, y), which has a special relation with the other three patterns, to obtain the fringe orders. The relation among the four patterns can be described as

K1^k I1(x, y) + K2^k I2(x, y) + K3^k I3(x, y) + K4^k I4(x, y) = 0,   (8)

where Kn^k (n = 1, 2, 3, 4) denotes the coefficients associated with the kth fringe order. Expanding the trigonometric functions of Eqs. (1)–(4) and substituting the results into Eq. (8) gives

{−K1^k/2 + K2^k − K3^k/2 + cos[t(k)] K4^k} B sin(ϕ) + {−(√3/2) K1^k + (√3/2) K3^k + sin[t(k)] K4^k} B cos(ϕ) + (K1^k + K2^k + K3^k + K4^k) A = 0.   (9)

Since Eq. (9) must hold for every absolute phase ϕ, and A and B are positive constants, the coefficients of sin(ϕ), cos(ϕ) and A must all be zero. Therefore,

−K1^k/2 + K2^k − K3^k/2 + cos[t(k)] K4^k = 0,
−(√3/2) K1^k + (√3/2) K3^k + sin[t(k)] K4^k = 0,   (10)
K1^k + K2^k + K3^k + K4^k = 0.

Rewriting Eq. (10) in terms of K4^k gives

K1^k = {(cos[t(k)] − 1)/3 + sin[t(k)]/√3} × K4^k,
K2^k = {−1/3 − 2 cos[t(k)]/3} × K4^k,
K3^k = {(cos[t(k)] − 1)/3 − sin[t(k)]/√3} × K4^k,   (11)
K4^k = K4^k,

where t(k) is determined by Eq. (5). Substituting the Kn^k (n = 1, 2, 3, 4) of Eq. (11) into Eq. (8) turns it into an equation in t(k) and K4^k:

[{(cos[t(k)] − 1)/3 + sin[t(k)]/√3} × I1(x, y) + {−1/3 − 2 cos[t(k)]/3} × I2(x, y) + {(cos[t(k)] − 1)/3 − sin[t(k)]/√3} × I3(x, y) + I4(x, y)] × K4^k = 0.   (12)

Because Eq. (12) must hold regardless of the value of K4^k, it reduces to

{(cos[t(k)] − 1)/3 + sin[t(k)]/√3} × I1(x, y) + {−1/3 − 2 cos[t(k)]/3} × I2(x, y) + {(cos[t(k)] − 1)/3 − sin[t(k)]/√3} × I3(x, y) + I4(x, y) = 0.   (13)

Therefore, for any point (x, y), once the intensities I1(x, y), I2(x, y), I3(x, y) and I4(x, y) are known, the corresponding t(k) can be solved from Eq. (13). Because Eq. (13) contains sin(·) and cos(·) terms, it is hard to solve for t(k) directly. However, since k(x, y) is an integer ranging from 1 to 70, we substitute k(x, y) = 1, 2, …, 70 into Eq. (5) and obtain 70 candidate values of t(k). Substituting these candidates into the left-hand side of Eq. (13) one by one yields 70 values Tk (k = 1, 2, …, 70). The Tk closest to 0 (the right-hand side of Eq. (13)) identifies the fringe order: for a given point (x, y), if T6 is closest to 0, the fringe order k(x, y) is 6. After that, the absolute phase is obtained with Eq. (7).
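The candidate search described above is straightforward to implement. The sketch below evaluates the left-hand side of Eq. (13) for all 70 candidate fringe orders at every pixel and keeps the candidate closest to zero. The vectorized form and the assumption that the inputs are floating-point arrays of identical size are ours; a per-candidate loop would give the same result with less memory.

```python
import numpy as np

def fringe_order(I1, I2, I3, I4, n_periods=70):
    """Pixel-wise fringe-order search based on Eq. (13)."""
    k = np.arange(1, n_periods + 1)
    t_k = 2 * np.pi * k / (n_periods + 1)                     # Eq. (5)
    c1 = (np.cos(t_k) - 1) / 3 + np.sin(t_k) / np.sqrt(3)     # coefficient of I1 in Eq. (13)
    c2 = -1 / 3 - 2 * np.cos(t_k) / 3                         # coefficient of I2
    c3 = (np.cos(t_k) - 1) / 3 - np.sin(t_k) / np.sqrt(3)     # coefficient of I3

    # T_k for every candidate k at every pixel, shape (n_periods, H, W).
    T = (c1[:, None, None] * I1 + c2[:, None, None] * I2 +
         c3[:, None, None] * I3 + I4)
    return np.argmin(np.abs(T), axis=0) + 1                   # index of the T_k closest to 0

# With the helpers sketched earlier, one possible pipeline for the four captured images is:
# phi_abs = absolute_phase(wrapped_phase(I1, I2, I3), fringe_order(I1, I2, I3, I4))
```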

3. Experiments

To demonstrate the performance of the proposed method, we constructed an FPP system and conducted several experiments. The system contains a camera (AVT Manta G-505B, maximum frame rate 15 fps, 2456 × 2056 resolution), a projector (Acer H7850, 1024 × 768 resolution) and a computer. The distance between the projector and the camera was about 350 mm, and both were placed about 800 mm in front of the objects. Our FPP system has no constraints and can employ almost the entire sensing range of the camera. In these experiments, we generated four high-frequency patterns with 70 fringe periods, projected them onto the object and captured the deformed patterns with the camera.

Firstly, we compared the accuracy of the proposed method with the widely used conventional 3-frequency 4-step algorithm, which needs 12 fringe patterns [33]. In this experiment we made two comparisons. In the first comparative trial, we took the absolute phase map of the 3-frequency 48-step algorithm as the standard reference for comparing the accuracy of the conventional method and the proposed method [34]. We projected the patterns of the three methods onto a flat surface and extracted the same area of the absolute phase maps of the conventional method and the proposed method. Comparing these two phase maps with the standard reference obtained from the 3-frequency 48-step algorithm yielded the phase errors of the two methods. The process is shown in Fig. 3: Fig. 3(a) shows one fringe pattern captured by the camera, Fig. 3(b) shows one cross section of the wrapped phase and the fringe order, and Fig. 3(c) shows the 3D reconstruction result of the proposed method in Geomagic Studio 2012. We extracted a 500 × 500 pixel area to analyze the errors of the two methods. The absolute phase maps of the two methods over the same area are shown in Fig. 4, and the absolute phase error maps obtained by subtracting the standard reference are shown in Fig. 5. The root mean square error (RMSE) of the conventional method is 0.0259 rad (Fig. 5(a)), and the RMSE of the proposed method is 0.0258 rad (Fig. 5(b)). Fig. 6 shows the phase errors along one cross section for the two methods; the errors of the two methods are very close. This experiment proves that the proposed method maintains the accuracy while projecting only one additional fringe unwrapping pattern.
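For reference, the RMSE values quoted above can be reproduced with a one-line comparison of each absolute phase map against the 48-step reference map; the function below is a sketch with hypothetical variable names, not code from the paper.

```python
import numpy as np

def phase_rmse(phi_test, phi_reference):
    """Root-mean-square error (rad) between a phase map under test and the reference map."""
    return float(np.sqrt(np.mean((phi_test - phi_reference) ** 2)))
```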


Fig. 3. Measuring process of a flat surface. (a) One fringe pattern of the proposed method. (b) One cross section of the wrapped phase and the fringe order of the selected area. (c) The result of 3D reconstruction of the proposed method.

Fig. 4. Absolute phase map of two methods. (a) Absolute phase map of the conventional method. (b) Absolute phase map of the proposed method.

Fig. 5. Absolute phase errors of two methods. (a) Absolute phase errors of the conventional method compared with the standard reference. (b) Absolute phase errors of the proposed method compared with the standard reference.

Fig. 6. Errors of one cross section of two methods.



Fig. 7. The 3D reconstruction results of two methods and their error distributions. (a) The conventional method, and (b) the proposed method.

In the second comparative trial, we took the same 3D point-cloud area from the two methods and tested the flatness of their 3D reconstruction results in Geomagic Studio 2012. As shown in Fig. 7, the standard deviation of the conventional method is 0.0255 mm and that of the proposed method is 0.0257 mm, which shows that the 3D reconstruction errors of the two methods are almost the same. We also compared the two methods in terms of the number of patterns, the accuracy and the total reconstruction time (with our own MATLAB programs and a camera acquisition speed of 15 fps). As seen from Table 1, the accuracies of the two methods are close, and because the number of patterns used in our method is only a third of that of the conventional method, the reconstruction time of the proposed method is much lower.

Table 1
Comparison of the two methods.

Method                        Number of patterns    Standard deviation (mm)    Total time (s)
The proposed method           4                     0.0257                     0.4563
The conventional algorithm    12                    0.0255                     0.9620

Secondly, to demonstrate the stability and robustness of the proposed method, we measured a 'monkey mask' sculpture whose surface has large curvatures. Fig. 8(a) shows one pattern captured by the camera, Fig. 8(b) shows one cross section of the wrapped phase and the fringe order, and Fig. 8(c) shows the 3D reconstruction result of the measured object. As shown in Fig. 8(c), the proposed method obtains a good 3D reconstruction over the camera's viewable range, proving its stability and robustness in measuring complex objects.

Finally, to demonstrate the ability of the proposed method to reconstruct isolated objects [27], which requires calculating the absolute phase pixel by pixel, we measured the two isolated objects shown in Fig. 9(a). The absolute phase of every pixel is acquired without referring to the values of the surrounding pixels. Fig. 9(a) shows one pattern of the two isolated objects captured by the camera, Fig. 9(b) shows one cross section of the wrapped phase and the fringe order, and Fig. 9(c) shows the 3D reconstruction result. Although the wrapped phases of the two isolated objects are discontinuous, their correct fringe orders and absolute phases are still obtained. Therefore, the proposed method works well in reconstructing isolated objects.

4. Conclusions

We propose a novel method with one unwrapping pattern to reconstruct the 3D shape of an unknown object. Compared with previous methods, the 3D information is acquired accurately by projecting only four patterns. Moreover, the proposed method calculates the absolute phase pixel by pixel and needs no information from surrounding pixels or other constraints, so it can measure discontinuous and isolated objects. The experimental results verify the performance of the proposed method in high-accuracy and high-speed measurement.

Fig. 8. Measuring process of a 'monkey mask' sculpture. (a) The measured 'monkey mask' sculpture. (b) One cross section of the wrapped phase and fringe order of the 'monkey mask' sculpture. (c) The 3D reconstruction result of the 'monkey mask' sculpture.


Fig. 9. Measuring process of isolated objects. (a) The measured isolated objects. (b) One cross section of wrapped phase and fringe order of the isolated objects. (c) The 3D reconstruction result of isolated objects.

CRediT authorship contribution statement

Shichao Yang: Methodology, Conceptualization, Writing - original draft, Software. Gaoxu Wu: Software, Formal analysis. Yanxue Wu: Data curation. Jin Yan: Data curation. Huifang Luo: Visualization. Yinnan Zhang: Formal analysis. Fei Liu: Supervision, Methodology, Writing - review & editing, Funding acquisition.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

The authors would like to acknowledge the financial support for portions of this research from the National Natural Science Foundation of China (No. 51605059), the National Key Research and Development Program (No. 2016YFE0113600 & No. 2018YFB2001400) and the Fundamental Research Funds for the Central Universities (No. 2019CDJGFJX003).

References

[1] C. Zuo, L. Huang, M. Zhang, Q. Chen, A. Asundi, Temporal phase unwrapping algorithms for fringe projection profilometry: a comparative review, Opt. Lasers Eng. 85 (2016) 84–103.
[2] S.S. Gorthi, P. Rastogi, Fringe projection techniques: whither we are? Opt. Lasers Eng. 48 (2010) 133–140.
[3] S. Zhang, Recent progresses on real-time 3D shape measurement using digital fringe projection techniques, Opt. Lasers Eng. 48 (2010) 149–158.
[4] G. Wu, Y. Wu, L. Li, F. Liu, High-resolution few-pattern method for 3D optical measurement, Opt. Lett. 44 (2019) 3602.
[5] K.-C. Chang Chien, H.-Y. Tu, C.-H. Hsieh, C.-J. Cheng, C.-Y. Chang, Regional fringe analysis for improving depth measurement in phase-shifting fringe projection profilometry, Meas. Sci. Technol. 29 (2018) 015007.
[6] Z. Huang, Y. Cao, M. Lu, A method for enlarging absolute phase unwrapping range by active phase setting, Optik 124 (2013) 5064–5068.
[7] S.G. Liu, L.S. Ji, S.J. Han, X.J. Zhang, H.W. Zhang, An improved composite grating for phase unwrapping in 3-D measurement of specular objects, Adv. Mater. Res. 468–471 (2012) 753–757.
[8] F. Zhou, G. Zhang, Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations, Image Vis. Comput. 23 (2005) 59–67.
[9] F. Yang, M. Dai, X. He, X. Du, Single fringe projection profilometry based on sinusoidal intensity normalization and subpixel fitting, Opt. Lasers Eng. 49 (2011) 465–472.
[10] Z.H. Zhang, Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques, Opt. Lasers Eng. 50 (2012) 1097–1106.
[11] S. Xing, H. Guo, Correction of projector nonlinearity in multi-frequency phase-shifting fringe projection profilometry, Opt. Express 26 (2018) 16277–16291.
[12] S. Feng, Q. Chen, G. Gu, T. Tao, L. Zhang, Y. Hu, W. Yin, C. Zuo, Fringe pattern analysis using deep learning, Adv. Photonics 1 (2019) 1.
[13] Z. Huang, J. Xi, Y. Yu, Q. Guo, Accurate projector calibration based on a new point-to-point mapping relationship between the camera and projector images, Appl. Opt. 54 (2015) 347.
[14] H. Liu, H. Lin, L. Yao, Calibration method for projector-camera-based telecentric fringe projection profilometry system, Opt. Express 25 (2017) 31492–31508.
[15] M. Jenkinson, Fast, automated N-dimensional phase-unwrapping algorithm, Magn. Reson. Med. 49 (2003) 193–197.
[16] F. Chen, X. Su, Phase-unwrapping algorithm for the measurement of 3D object, Optik 123 (2012) 2272–2275.
[17] B. Bodermann, C. Bräuer-Burchardt, P. Kühmstedt, G. Notni, K. Frenner, R.M. Silver, Phase unwrapping using geometric constraints for high-speed fringe projection based 3D measurements, vol. 8789, 2013, pp. 878906.
[18] C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, Q. Chen, Phase shifting algorithms for fringe projection profilometry: a review, Opt. Lasers Eng. 109 (2018) 23–59.
[19] H. Deng, J. Deng, M. Ma, J. Zhang, P. Yao, L. Yu, X. Chen, Y. Wang, Direction-determined phase unwrapping using geometric constraint of the structured light system: the establishment of minimum phase map, Opt. Commun. 402 (2017) 14–19.
[20] B. Li, S. Ma, Y. Zhai, Fast temporal phase unwrapping method for the fringe reflection technique based on the orthogonal grid fringes, Appl. Opt. 54 (2015) 6282–6290.
[21] L. Huang, A.K. Asundi, Phase invalidity identification framework with the temporal phase unwrapping method, Meas. Sci. Technol. 22 (2011) 035304.
[22] M. Servin, M. Padilla, G. Garnica, Super-sensitive two-wavelength fringe projection profilometry with 2-sensitivities temporal unwrapping, Opt. Lasers Eng. 106 (2018) 68–74.
[23] M. Servin, J.M. Padilla, A. Gonzalez, G. Garnica, Temporal phase-unwrapping of static surfaces with 2-sensitivity fringe-patterns, Opt. Express 23 (2015) 15806–15815.
[24] Y. Xu, S. Jia, X. Luo, J. Yang, Y. Zhang, Multi-frequency projected fringe profilometry for measuring objects with large depth discontinuities, Opt. Commun. 288 (2013) 27–30.
[25] M. Zhang, Q. Chen, T. Tao, S. Feng, Y. Hu, H. Li, C. Zuo, Robust and efficient multi-frequency temporal phase unwrapping: optimal fringe frequency and pattern sequence selection, Opt. Express 25 (2017) 20381–20400.
[26] T. Tao, Q. Chen, J. Da, S. Feng, Y. Hu, C. Zuo, Real-time 3-D shape measurement with composite phase-shifting fringes and multi-view system, Opt. Express 24 (2016) 20253–20269.
[27] M. Servin, M. Padilla, G. Garnica, A. Gonzalez, Profilometry of three-dimensional discontinuous solids by combining two-steps temporal phase unwrapping, co-phased profilometry and phase-shifting interferometry, Opt. Lasers Eng. 87 (2016) 75–82.
[28] D. Zheng, Q. Kemao, F. Da, H.S. Seah, Ternary Gray code-based phase unwrapping for 3D measurement using binary patterns with projector defocusing, Appl. Opt. 56 (2017) 3660–3665.
[29] D. Zheng, F. Da, Self-correction phase unwrapping method based on Gray-code light, Opt. Lasers Eng. 50 (2012) 1130–1139.
[30] Y. Wang, S. Zhang, Novel phase-coding method for absolute phase retrieval, Opt. Lett. 37 (2012) 2067–2069.
[31] M. Ma, P. Yao, J. Deng, H. Deng, G. Zhang, X. Zhong, A morphology phase unwrapping method with one code grating, Rev. Sci. Instrum. 89 (2018) 10.
[32] Y. An, J.S. Hyun, S. Zhang, Pixel-wise absolute phase unwrapping using geometric constraints of structured light system, Opt. Express 24 (2016) 18445–18459.
[33] J. Lai, J. Li, C. He, F. Liu, A robust and effective phase-shift fringe projection profilometry method for the extreme intensity, Optik 179 (2019) 810–818.
[34] K. Liu, Y. Wang, D.L. Lau, L.G. Hassebrook, Gamma model and its analysis for phase measuring profilometry, J. Opt. Soc. Am. A 27 (2010) 553–562.
