Optik - International Journal for Light and Electron Optics xxx (xxxx) xxxx
Original research article
Light plane calibration and accuracy analysis for multi-line structured light vision measurement system

Wenguo Li a,*, Hao Li a, Hao Zhang b
a Faculty of Mechanical & Electrical Engineering, Kunming University of Science & Technology, Kunming 650500, PR China
b School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou 350108, PR China
ARTICLE INFO

Keywords: Multi-line structured light; Calibration light plane; Accuracy analysis

ABSTRACT
This paper investigates how the overall fitting accuracy of the light planes in a multi-line structured light measurement system can be improved by changing the relative distance between the camera and the laser, altering the projection direction of the light stripes, and increasing the number of extracted light stripes. Firstly, a multi-line structured light three-dimensional (3D) measurement system platform is built, and the existing calibration plate is modified according to the requirements of the experimental design. Then, laser stripes are projected onto a blank area of the checkerboard to facilitate the extraction of the light stripe centerlines. Moreover, the 3D point cloud transformed from the centerline of each stripe is fitted to a plane by the random sample consensus (RANSAC) algorithm. Finally, the influence of the different experimental methods on the overall fitting accuracy of the light planes is observed. Experiments illustrate that the proposed methods can reduce the relative error of the overall fitting result of the multi-line structured light planes by a factor of more than two, and the overall calibration accuracy of the light planes, measured by the error of the angle between the calibrated light planes, is improved.
1. Introduction

Laser 3D measurement technology is widely used in many industrial fields, such as vehicle assembly [1], 3D reconstruction [2,3] and steel rail quality inspection [4,5], because of its non-contact operation, high precision and high speed. At present, single-line lasers are extensively utilized to scan and reconstruct the measured object. However, the reconstruction data is sparse, and the whole scanning process relies on a translation device. This increases the cumulative error of the system, which greatly reduces the measurement accuracy. In addition, this method has a long measurement time and low efficiency. Therefore, many researchers have begun to study multi-line laser scanning technology [6].
The calibration of a multi-line structured light vision sensor is mainly divided into three steps: calibration of the camera internal parameters, extraction of the light stripe centerlines and calibration of the laser parameters. Camera calibration methods mainly include Tsai's method [7,8] based on the radial alignment constraint and Zhang's method [9] based on a planar target. Common stripe centerline extraction methods include the gray centroid method, the extreme value method and the Gaussian fitting method. These methods are simple to implement but have low precision. Steger [10] used the Hessian matrix to achieve sub-pixel precision of the light stripe centerline, but at a lower processing speed. In order to further improve the detection speed and accuracy, scholars have proposed some improved methods [11,12]. However, these improved methods are ill-suited to outputting the pixel coordinates of the multi-line
* Corresponding author. E-mail address: [email protected] (W. Li).
https://doi.org/10.1016/j.ijleo.2019.163882 Received 14 September 2019; Accepted 23 November 2019 0030-4026/ © 2019 Elsevier GmbH. All rights reserved.
stripe centerlines. Early line laser parameter calibration methods mainly include the wire drawing method [13], the toothed target method [14] and methods based on the principle of cross-ratio invariance [15]. However, these methods all have drawbacks, such as insufficient accuracy, a small number of calibration feature points and the difficulty of machining a stereo target. In recent years, many researchers have begun to use planar targets to obtain multiple sets of light plane calibration points [16–19]. All these studies, however, were aimed at the calibration of a single-line structured light plane. Some scholars therefore proposed a multi-line structured light measurement system using two cameras, but such a system is more complicated and requires matching the images of the two cameras [20,21]. In addition, some scholars have proposed a method to deduce the functions of the multiple light planes [22], which saves calibration time but relies too heavily on the two light planes that are initially calibrated. Other researchers employed a target to simultaneously calibrate the camera model and the laser plane [23], but in this case the calibration model becomes more complex.
In this paper, the existing calibration plate is first modified: part of its area is covered with a blank (white) region that still meets the requirements of the OpenCV-based implementation of Zhang's method. Thus, the light stripe image becomes more continuous and the light stripe centerlines are easier to extract. Secondly, the multi-line structured light three-dimensional measurement system is constructed. Zhang's calibration method is adopted to obtain the internal and external parameters of the camera. The laser centerlines are extracted from the region of interest (ROI) using the Hessian matrix method. The point cloud of each laser centerline is directly fitted to a plane by the RANSAC algorithm, without intermediate straight-line fitting.
Finally, the relationships between the overall accuracy of the measurement system and, respectively, the relative distance between the camera and the laser, the direction in which the laser stripes are projected onto the special flat checkerboard, and the number of extracted light stripes are investigated.
The paper is organized as follows. Section 2 describes the principle of the multi-line structured light measurement system. Section 3 introduces the calibration of the measurement system, including camera calibration and calibration of the multi-line structured light planes. Finally, the experimental results and the conclusions are presented in Sections 4 and 5, respectively.
2. The principle of multi-line structured light measurement system

The measurement system designed in this paper is mainly composed of three parts: a multi-line laser emission module, an image acquisition module and an image processing module, as shown in Fig. 1. The multi-line laser emission module is a laser that emits a multi-line pattern and projects it onto the surface of the object to be measured. The image acquisition module uses a Charge Coupled Device (CCD) image sensor to collect the line laser stripe image, which is deformed by the height fluctuation of the object surface. The image processing module performs subsequent processing on the image collected by the CCD camera to obtain the pixel coordinates of each light stripe in the image. Based on the spatial relationships among the world coordinate system (WCS), the camera coordinate system (CCS), the line laser coordinate system and the image coordinate system (ICS), combined with the pixel coordinate system (PCS), the coordinates of the object in the camera coordinate system can be acquired by coordinate transformation, realizing non-contact measurement. The coordinate frame conversion process is shown in Fig. 2.
In the measurement system, a series of pixel coordinate values (ui, vi) in the PCS of the multi-line structured light stripe centerline
Fig. 1. The diagram of the measurement system. (1) Camera, (2) Image plane, (3) Laser projector, (4) 25 lines structured light, (5) Measured object and (6) Experiment platform.
Fig. 2. The coordinate frame conversion process.
are extracted. As shown in Fig. 3, the point PC (XC, YC, ZC) in the CCS can be obtained from the point PI (XI, YI) in the PCS through the coordinate conversion in Fig. 2. After the abnormal data are removed by the RANSAC algorithm, plane fitting is performed on the remaining valid data using the distance-weighted least squares method. Finally, the accuracy of the system is assessed by comparing the angles between the fitted planes with the factory data. With this, the calibration of the multi-line laser measurement system is accomplished.
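As an illustration of this pixel-to-camera conversion, the sketch below back-projects a pixel through the intrinsic matrix and intersects the resulting ray with a calibrated light plane to recover (XC, YC, ZC). The intrinsic values mirror the calibration result reported in Section 4; the function name and structure are our own illustration, not the authors' code.

```python
import numpy as np

# Intrinsic matrix as reported in Section 4 (used here purely for illustration).
A = np.array([[2158.805051, 0.0, 829.434241],
              [0.0, 2165.137268, 613.378645],
              [0.0, 0.0, 1.0]])

def pixel_to_camera_point(u, v, plane):
    """Back-project pixel (u, v) onto a calibrated light plane.

    plane = (a, b, c, d) with a*Xc + b*Yc + c*Zc + d = 0 in the CCS.
    The pixel defines a ray Pc = Zc * A^-1 [u, v, 1]^T; substituting the
    ray into the plane equation yields the depth Zc (the scale s of Eq. (6)).
    """
    ray = np.linalg.inv(A) @ np.array([u, v, 1.0])  # ray direction for Zc = 1
    a, b, c, d = plane
    z = -d / (a * ray[0] + b * ray[1] + c * ray[2])  # solve plane equation for Zc
    return z * ray                                   # (Xc, Yc, Zc)
```

For example, the principal point projected onto the (hypothetical) plane Zc = 500 mm maps to a point on the optical axis at that depth.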
3. Calibration of multi-line structured light measurement system

In this section, camera calibration and calibration of the multi-line laser light planes are presented. Firstly, Zhang's method [9] is used to realize the camera calibration. Secondly, the sub-pixel coordinates of the light stripe centers are extracted from the processed ROI by the Hessian algorithm. Finally, the centerline coordinates acquired in the PCS are converted to the CCS. Besides, the RANSAC algorithm is adopted to eliminate outliers, while the remaining valid data are used for light plane fitting.
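The outlier rejection and plane fitting described here (and detailed in Section 3.2.3) can be sketched as follows. The function name, threshold and iteration counts are illustrative choices, not values from the paper:

```python
import numpy as np

def ransac_plane(points, n_iter=200, threshold=0.5, rng=None):
    """Fit a plane a*x + b*y + c*z + d = 0 to 3D points with RANSAC.

    A minimal sketch of the procedure in Section 3.2.3: sample three
    non-collinear points, count inliers within `threshold` (Eq. (8)),
    keep the best model, then refine it on the inliers by least squares.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:            # the three points are (nearly) collinear
            continue
        normal /= norm
        d = -normal @ p0
        dist = np.abs(points @ normal + d)   # Eq. (8) with a unit normal
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine: least-squares plane through the inliers via SVD.
    pts = points[best_inliers]
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return np.append(normal, -normal @ centroid)  # (a, b, c, d)
```

In practice the inlier threshold would be chosen from the expected measurement noise; the paper's procedure additionally repeats the whole search 30 times and keeps the plane with the most inliers.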
3.1. Camera calibration

3.1.1. Camera model
A typical perspective projection model of the camera in the multi-line structured light measurement system is shown in Fig. 3. PW is an arbitrary point in 3D space, whose homogeneous coordinate in the WCS is P̃W = [XW, YW, ZW, 1]^T. The coordinate of its projection in the ICS is (u, v), whose homogeneous coordinate is P̃I = [XI, YI, 1]^T. Under the perspective projection model, the relationship between an arbitrary point in 3D space and its projection on the image plane can be denoted by
s·P̃I = A·[R, T]·P̃W    (1)
where s is a scale coefficient and A is the camera intrinsic parameter matrix, given by
Fig. 3. The geometric structure of the line structured vision sensor. (1) Camera, (2) Image plane, (3) Laser projector, (4) Light plane, (5) Light stripe and (6) Flat checkerboard. 3
Fig. 4. Single light plane calibration model. (1) Camera, (2) Laser projector, (3) Light plane and Πi denotes the position of the checkerboard.
A = \begin{bmatrix} 1/k & 0 & u_0 \\ 0 & 1/l & v_0 \\ 0 & 0 & 1 \end{bmatrix}

where (u0, v0) is the principal point on the camera sensor, and k and l are the pixel sizes along the x and y axes of the image sensor, respectively. In other words, the size of each pixel is k × l, in millimeters. [R, T] is the camera extrinsic parameter matrix, which converts coordinates from the WCS to the CCS.
The pinhole camera model describes a simple imaging geometry in which the effects of lens distortion are not taken into account. In order to acquire more accurate camera parameters, the nonlinear imaging model of the camera should be used. Considering the influence of radial and tangential distortion, the real projection coordinates are given by
\begin{bmatrix} x_d \\ y_d \end{bmatrix} = \begin{bmatrix} x + x(k_1 r^2 + k_2 r^4 + k_3 r^6) + 2p_1 xy + p_2 (r^2 + 2x^2) \\ y + y(k_1 r^2 + k_2 r^4 + k_3 r^6) + 2p_2 xy + p_1 (r^2 + 2y^2) \end{bmatrix}    (2)

where (xd, yd) are the real projection coordinates affected by lens distortion, (x, y) is the ideal projection point, r^2 = x^2 + y^2, k1, k2 and k3 are the radial distortion coefficients, and p1 and p2 are the tangential distortion coefficients.

3.1.2. The detailed approach of camera calibration
During the calibration process, a flat checkerboard is employed to obtain the internal parameters of the camera and the external parameters of the calibration plate, which is placed at N (N ≥ 12 in this paper) different poses [24] (Π1, Π2, …, ΠN, shown in Fig. 4) relative to the camera and the laser. In this paper, Zhang's method is used to realize the camera calibration. The flat checkerboard is shown in Fig. 5. Its pattern array is 12 × 9 and the size of each checker is 10 × 10 mm. The purpose of this layout is explained in detail in Section 3.2.3. The concrete processing steps for camera calibration are as follows:
Step 1: Place the flat checkerboard at N different positions, arranged so that they fully cover the camera's field of view.
Step 2: Three pictures are taken at each position of the calibration plate: the first is the checkerboard image with illumination only, the second is the background image without illumination or laser, and the third is the checkerboard image with the laser stripes only. One set of pictures taken at a certain position is shown in Fig. 6.
Step 3: The checkerboard image with illumination only at each position is used for camera calibration, so the total number of pictures for camera calibration is N.
Step 4: The remaining two pictures at each position are used for the calibration of the light planes of the multi-line laser, as described in detail in Section 3.2.1.
Step 5: Zhang's planar calibration method is adopted to calculate the internal parameters of the camera and the external parameters of the checkerboard.

3.2.
Calibration of multi-line structured light plane

Calibration of the light planes plays a significant role in the multi-line structured light 3D measurement system. In order to calibrate the light planes of the multi-line laser, it is necessary to obtain the pixel coordinates of the structured light centerlines, and then
Fig. 5. The flat checkerboard used in the paper.
Fig. 6. One set of images at one position.
the 3D coordinates of the points on the object surface corresponding to the extracted centerline image are calculated. These 3D centerline coordinates are then used to calibrate the light planes. This part introduces the extraction of the light stripe centerlines, the general process framework of the light plane calibration and the detailed calibration process.

3.2.1. Extraction of centerline for structured light stripe
The extraction of the structured light stripes is an essential part of structured light vision measurement. The following steps are applied to the pictures taken by the industrial camera in order to increase the precision and speed of extracting the stripe centerlines:
Step 1: N pictures with light stripes are obtained by the camera, together with the background images taken before. Linear blending of the two pictures is employed to remove interference from other factors in the environment.
Step 2: Because multi-line structured light is applied, an ROI needs to be set in order to isolate the desired light stripes. Setting the ROI also reduces the image processing time and improves the detection speed of the light stripes.
Step 3: The image is filtered to highlight the effective information, which benefits the extraction of the laser stripe centers. Since the image noise is mainly Gaussian, Gaussian filtering is used to remove it.
Step 4: The sub-pixel coordinates of the light stripe centers are extracted from the processed ROI by the Hessian algorithm [10]. The detection of the structured light stripe centers is shown in Fig. 7, where the red highlight is the visualization of the extracted centerlines.

3.2.2.
The whole process framework of the light plane calibration
The following steps are used to calibrate the light planes of the multi-line laser:
Step 1: The internal parameters of the camera and the external parameters of the checkerboard are calculated through the camera calibration described in Section 3.1.2.
Step 2: These parameters are used to calculate the coordinates of the center points of the light stripes in the CCS.
Step 3: The RANSAC algorithm is adopted to eliminate outliers, while the remaining valid data are used for the light plane fitting. The effect of this processing is shown in Fig. 11.
Fig. 7. The detection of structured light stripe center.
Step 4: Minimize the residuals of the distances from all 3D measurement points to the fitted plane equation.
Step 5: The other 24 light planes are processed by the same method as Steps 1–4.
Step 6: The calibration results are evaluated, and a series of experiments is set up to observe the calibration results while changing the relative distance between the laser and the camera, increasing the number of extracted light stripes and altering the projection direction of the light stripes.

3.2.3. The detailed light plane calibration approach
Fig. 4 shows a light plane calibration model which associates the line laser coordinate system with the CCS. The multi-line structured light is projected onto a specific flat checkerboard, as shown in Fig. 8, forming 25 light stripes (l1, l2, …, l25); PW is a center point on l1. If an ordinary black-and-white checkerboard with no blank areas were employed, the absorption of light by the black squares would impede the extraction of the structured light centerlines. A large number of target images in different poses are taken to obtain the internal parameters of the camera and the external parameters of the target at each position. The process of transforming the pixel coordinates of the center points of the light stripes and of fitting the light plane equations is as follows:
(1) 3D coordinate calculation for the centerlines in the CCS.
The coordinates (XC, YC, ZC) of a point of the light stripe on the measured object in the CCS can be calculated by combining the image coordinates of the stripe centerline, the internal parameters of the camera and the external parameters of the target. For this purpose, Eq. (1) can be transformed into
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A [R, T] \begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix}    (3)

s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A \left( R \begin{bmatrix} X_W \\ Y_W \\ Z_W \end{bmatrix} + T \right)    (4)
The formula for transforming the point PW of the WCS into the point PC of the CCS is:
Fig. 8. The light plane calibration model. (1) Specific flat checkerboard, (2) Camera, (3) 25 lines structured light and (4) Laser.
PC = R·PW + T    (5)

where PW = [XW, YW, ZW]^T and PC = [XC, YC, ZC]^T are the coordinates of the same arbitrary point in the WCS and the CCS, respectively, and R is the orthogonal rotation matrix and T the translation vector defined in Section 3.1.1. Therefore, the camera coordinates of the center points of the light stripes can be described as follows:
PC = s·A⁻¹·P̃I    (6)
Thus, the camera coordinates of many points on these stripes are obtained.
(2) Fitting the light plane equations using the 3D coordinates of the centerlines in the CCS.
Since the laser position is fixed, the obtained center points of each light stripe should lie on the same plane, as shown in Fig. 4, and these points are used to calibrate the light plane parameters by fitting. The equation of the i-th laser light plane in the CCS can be given by
Ai XC + Bi YC + Ci ZC + Di = 0, (Ci ≠ 0, i = 1, 2, ⋯, 25)    (7)
where Ai, Bi, Ci and Di are the coefficients of the plane equation of light plane πi (i = 1, 2, ⋯, 25). In the process of light plane fitting, the 3D coordinates of the points obtained by visual measurement contain errors due to system error and the measurement environment, and these errors vary with the locations of the points. Therefore, the RANSAC algorithm is adopted to remove abnormal data, while the remaining data are used for plane fitting. The objective of plane fitting is to minimize the residuals of the distances from all measurement points to the fitted plane, considering that errors exist in the x, y and z directions of the data. The calculation process for the optimal light plane is as follows:
Step 1: Randomly select three feature points from the set of light plane feature points and check whether they are collinear. If not, the coefficients A, B, C and D calculated from the plane equation through the three points are set as the initial values for optimization.
Step 2: Calculate the distance di from each point in the dataset to the model plane, and count the number of points whose distance di is less than a certain threshold; these points are the inliers.
d_i = \frac{|A X_{ci} + B Y_{ci} + C Z_{ci} + D|}{\sqrt{A^2 + B^2 + C^2}}    (8)
Step 3: Steps 1 and 2 are repeated, and the plane containing the most inliers after k iterations is selected.
Step 4: The above process is repeated 30 times, and the plane with the most inliers is chosen as the best-fit plane.
Step 5: The plane containing the most inliers is re-estimated from its inliers by least squares.
The overall light plane calibration accuracy of the measurement system is then analyzed. The plane equations of the 25 light planes are fitted respectively. The angles between two arbitrary planes and the angle between the first and last planes are calculated and compared with the factory parameters of the laser, such as the angles β and γ introduced in Fig. 9, to assess the calibration results. The angle between two light planes is calculated by the following formula:
\cos \theta = \frac{|A_i A_j + B_i B_j + C_i C_j|}{\sqrt{A_i^2 + B_i^2 + C_i^2} \cdot \sqrt{A_j^2 + B_j^2 + C_j^2}}, \quad (i, j = 1, 2, \cdots, 25,\ i \neq j)    (9)
where Ai, Bi and Ci are the plane equation coefficients of light plane πi, as defined in Eq. (7). After that, the effects on the overall light plane calibration accuracy of the measurement system of separately changing the relative distance between the laser and the camera, increasing the number of extracted light stripes and altering the projection direction of the light stripes are investigated by calculating and comparing the light plane angles
Fig. 9. Schematic pattern of the multi-line structure light.
Table 1
The main parameters of the camera.

Model Number | Pixels (H × V) | Frame Rate (fps) | Pixel Size, H × V (μm) | Type
acA1600-20gm | 1,628 × 1,236 | 20 | 4.4 × 4.4 | Monochrome Camera
Table 2
The main parameters of the lens.

Model Number | Focal length (mm) | Iris range | Working distance (mm) | Manufacturer
HF9HA-1B | 9 | F1.4–F16 | ∞–100 | Fujifilm
Table 3
The main technical parameters of the laser.

Element Number | Description | Size (∅ × Thickness) | Design wavelengths
DE-R 254 | 25 Lines (Square) | 20 × 1.2 mm | 660 nm
Table 4
The detailed parameters of the laser line pattern (design wavelength 660 nm).

Pattern Size @ 100 mm Distance (mm): a = 67, b = 48, c = 1.98, d = 48
Pattern Angles (°): α = 37, β = 27, γ = 1.11, δ = 27
Fig. 10. The multi-line structured light measurement platform.
Fig. 11. Light plane fitting image.

Table 5
The overall calibration mean errors.

Group | Number of pictures | Overall mean error (pixel)
1 | 14 | 0.0645956
2 | 12 | 0.0748388
3 | 12 | 0.0676694
4 | 19 | 0.0685085
5 | 15 | 0.0705704
6 | 15 | 0.0721234
7 | 12 | 0.0634215
Table 6
Results with real data of the light plane functions (coefficients of the light plane equations).

Number of the light plane | A | B | C | D
1 | 0.031548 | −0.96707 | −0.25253 | 4.3265
2 | −0.029431 | 0.97181 | 0.23392 | −3.6292
3 | 0.029029 | −0.97623 | −0.2148 | 3.2194
4 | 0.030298 | −0.98041 | −0.1946 | 2.6627
13 | 0.0067318 | −0.99982 | −0.017803 | −0.4732
22 | −0.019806 | −0.98704 | 0.15927 | −1.3417
23 | −0.021954 | −0.98355 | 0.17928 | −2.1732
24 | −0.023346 | −0.98006 | 0.19731 | −2.5369
25 | −0.023757 | −0.97636 | 0.21485 | −2.2822
Table 7
Comparison of β for the laser at different positions.

Position | Group | Factory default (°) | Fitted value (°) | Relative error (%)
1 | 1 | 27 | 27.2176 | 0.8059
2 | 1 | 27 | 26.3951 | 2.2404
3 | 1 | 27 | 26.5404 | 1.7022
3 | 2 | 27 | 27.2265 | 0.8389
4 | 1 | 27 | 25.2667 | 6.4196
5 | 1 | 27 | 24.9969 | 7.4189
mentioned above.
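The angle check of Eq. (9) can be reproduced directly from the fitted coefficients. The sketch below (our own illustration, not the authors' code) computes the angle between planes 1 and 25 of Table 6, which should be close to the fitted β = 27.2265° reported for position 3, group 2 in Table 7:

```python
import math

def plane_angle_deg(p1, p2):
    """Angle in degrees between two planes given as (A, B, C, D), following Eq. (9)."""
    a1, b1, c1, _ = p1
    a2, b2, c2, _ = p2
    num = abs(a1 * a2 + b1 * b2 + c1 * c2)
    den = math.sqrt(a1**2 + b1**2 + c1**2) * math.sqrt(a2**2 + b2**2 + c2**2)
    return math.degrees(math.acos(num / den))

# First and last light planes from Table 6 (laser at position 3).
plane_1 = (0.031548, -0.96707, -0.25253, 4.3265)
plane_25 = (-0.023757, -0.97636, 0.21485, -2.2822)
beta = plane_angle_deg(plane_1, plane_25)  # ≈ 27.23°, vs. the factory value β = 27°
```

Note that the D coefficients do not enter the angle; only the normal vectors (A, B, C) matter.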
4. Experiment

To validate the proposed experimental approaches, a series of experiments is performed. The essential parameters of the camera,
Table 8
Comparison of β values for laser stripes in different projection directions.

Position | Group | Projection direction | Factory default (°) | Fitted value (°) | Relative error (%)
3 | 2 | Horizontal | 27 | 27.2265 | 0.8389
3 | 3 | Vertical | 27 | 26.5760 | 1.5704
lens and laser are shown in Tables 1, 2 and 3, respectively; the camera is manufactured by Basler. The detailed parameters of the laser pattern are shown in Fig. 9 and Table 4. The multi-line structured light measurement platform is shown in Fig. 10.
During the experiment, seven repeated camera calibration tests are performed. The overall calibration mean error is calculated by the process in Section 3.1.2, and the test results are shown in Table 5. The overall average calibration error is less than 1 pixel, which demonstrates that Zhang's method is flexible and has good calibration accuracy. One set of the calibration results is as follows:
A = \begin{bmatrix} 2158.805051 & 0 & 829.434241 \\ 0 & 2165.137268 & 613.378645 \\ 0 & 0 & 1 \end{bmatrix}
K = [k1 k2 p1 p2 k3] = [−0.299099 0.457031 −0.0006 −0.004183 −0.813234]

In the results above, A is the internal parameter matrix of the camera and K contains the distortion coefficients. The plane equations of the 25 light planes are fitted by the process in Section 3.2.3, with approximately 2500 points per plane. The fitting result of one plane is shown in Fig. 11, where the red highlighted points are the data used for plane fitting after the abnormal data have been removed. The coefficients of some of the light plane equations when the laser is at position 3 are listed in Table 6.
A series of experiments is performed to investigate the influence of the distance between the camera and the laser, the projection direction of the light stripes and the number of extracted light stripes on the overall calibration accuracy of the light planes. Note that in these experiments only the position of the laser is changed, and the angle β between the first and last planes is selected as the comparison parameter. In the first group, the laser is fixed at five different positions away from the camera, as shown in Fig. 10. The angle β obtained from the calibration is compared with the factory value of the laser, as shown in Table 7. In the second group, the number of extracted stripes is increased at position 3 of the first group; the experimental comparison is also shown in Table 7.
From Table 7, it can be seen that the overall fitting accuracy of the light planes is high at positions 1, 2 and 3, although the relative errors at positions 2 and 3 are more than twice that at position 1. At positions 4 and 5, the relative fitting errors are much higher than at position 1. At position 3, the number of extracted light stripes is increased without changing the projection direction of the light stripes in the second group.
Then, the relative error of the overall fitting result of the multi-line laser light planes is at least halved. In the third group, the projection direction of the light stripes is changed while the position of the laser stays the same; the experimental comparison data are shown in Table 8. Since the ROI is a rectangle, both horizontal and vertical projections are conducted for comparison. According to Table 8, when the position and the number of extracted light stripes are kept unchanged, the relative error of the horizontal projection mode is much lower than that of the vertical one.

5. Conclusion

This paper proposes a multi-line structured light three-dimensional measurement system. The measurement system experimental platform is built, and the overall fitting accuracy of the light planes is evaluated through a series of experiments. Firstly, the existing calibration plate is modified according to the requirements of the experimental design, and the laser stripes are projected onto a blank area of the checkerboard to facilitate the extraction of the light stripe centerlines. Moreover, the point cloud extracted from each stripe is fitted to a plane by the RANSAC algorithm. Finally, the influence of the different experimental methods on the overall fitting accuracy of the light planes is observed. The experimental results indicate that the relative distance between the camera and the laser, the projection direction of the light stripes and the number of extracted light stripes all affect the overall fitting accuracy of the light planes, and consequently the accuracy of the three-dimensional measurement system. The overall accuracy of the light planes at position 1 is relatively high.
In short, a larger number of extracted light stripes, an appropriate projection direction of the light stripes and a proper relative distance between the laser and the camera can improve the accuracy of the measurement system, which may provide helpful advice for constructing multi-line structured light three-dimensional measurement systems.

References
[1] T.T. Tran, C. Ha, Non-contact gap and flush measurement using monocular structured multi-line light vision for vehicle assembly, Int. J. Control Autom. Syst. 16 (2018) 2432–2445.
[2] Z.Z. Wang, Y.M. Yang, Single-shot three-dimensional reconstruction based on structured light line pattern, Opt. Laser Eng. 106 (2018) 10–16.
[3] J.H. Deng, B.Q. Chen, X.C. Cao, B. Yao, Z.Q. Zhao, J.X. Yu, 3D reconstruction of rotating objects based on line structured-light scanning, 2018 International Conference on Sensing, Diagnostics, Prognostics, and Control (SDPC) (2018) 244–247.
[4] P. Zhou, K. Xu, D.D. Wang, Rail profile measurement based on line-structured light vision, IEEE Access 6 (2018) 16423-+.
[5] C. Wang, Y.F. Li, Z.J. Ma, J.Z. Zeng, T. Jin, H.L. Liu, Distortion rectifying for dynamically measuring rail profile based on self-calibration of multiline structured light, IEEE Trans. Instrum. Meas. 67 (2018) 678–689.
[6] B. Wu, T. Xue, T. Zhang, S.H. Ye, A novel method for round steel measurement with a multi-line structured light vision sensor, Meas. Sci. Technol. 21 (2010).
[7] R.Y. Tsai, An efficient and accurate camera calibration technique for 3D machine vision, IEEE Trans. Pattern Anal. Mach. Intell. (1986) 364–374.
[8] R.Y. Tsai, A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses, IEEE J. Robot. Autom. 3 (1987) 323–344.
[9] Z.Y. Zhang, A flexible new technique for camera calibration, IEEE Trans. Pattern Anal. Mach. Intell. 22 (2000) 1330–1334.
[10] C. Steger, An unbiased detector of curvilinear structures, IEEE Trans. Pattern Anal. Mach. Intell. 20 (1998) 113–125.
[11] F.Q. Zhou, Q. Chen, G.J. Zhang, Composite image processing for center extraction of structured light stripe, J. Optoelectron. Laser 11 (2008) 1534–1537.
[12] Z.Y. Xu, H.Y. Zheng, H.Y. Yang, An improved Gaussian fitting method used in light-trap center acquiring, Applied Mechanics and Materials, Trans. Tech. Publ. (2014) 449–452.
[13] R. Dewar, Self-generated targets for spatial calibration of structured-light optical sectioning sensors with respect to an external coordinate system, Soc. Manuf. Eng. (1988).
[14] D. Fajie, A new accurate method for the calibration of line structured light sensor, Chin. J. Sci. Instrum. 21 (2000) 108–110.
[15] D.Q. Huynh, R.A. Owens, P.E. Hartmann, Calibrating a structured light stripe system: a novel approach, Int. J. Comput. Vision 33 (1999) 73–86.
[16] J. Zhu, Y. Li, S. Ye, A speedy method for the calibration of line structured light sensor based on coplanar reference target, Chin. Mech. Eng. 17 (2006) 183–186.
[17] Z. Liu, G. Zhang, Z. Wei, J. Jiang, An accurate calibration method for line structured light vision sensor, Acta Opt. Sin. 11 (2009) 3124–3128.
[18] F.Q. Zhou, G.J. Zhang, Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations, Image Vision Comput. 23 (2005) 59–67.
[19] Z. Wei, M. Xie, G. Zhang, Calibration method for line structured light vision sensor based on vanish points and lines, 2010 20th International Conference on Pattern Recognition, IEEE (2010) 794–797.
[20] C. Sinlapeecheewa, K. Takamasu, 3D profile measurement using color multi-line stripe pattern with one shot scanning, Integr. Comput. Eng. 12 (2005) 333–341.
[21] M. Pashaei, S.M. Mousavi, Implementation of a low cost structured light scanner, ISPRS - Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. (2013) 477–482 405-W2.
[22] Y. Zou, L. Niu, B. Su, Y. Sun, J. Liu, Calibration of 3D imaging system based on multi-line structured-light, Three-Dimensional Image Acquisition and Display Technology and Applications, Int. Soc. Optics Photonics (2018) 108450C.
[23] T.T. Tran, C. Ha, Extrinsic calibration of a camera and structured multi-line light using a rectangle, 19 (2018) 195–202.
[24] W. Li, S. Fang, S. Duan, 3D shape measurement based on structured light projection applying polynomial interpolation technique, Opt. - Int. J. Light Electron. Opt. 124 (2013) 20–27.