Image analysis for 3D micro-features: A new hybrid measurement method



Precision Engineering xxx (2016) xxx–xxx


Precision Engineering journal homepage: www.elsevier.com/locate/precision

Percoco Gianluca a,∗, Modica Francesco b, Fanelli Stefano c

a Politecnico di Bari, Dipartimento di Meccanica, Matematica e Management, Viale Japigia 182, 70126 Bari, Italy
b Consiglio Nazionale delle Ricerche, Istituto per le Tecnologie Industriali e l'Automazione, Via Paolo Lembo 38/F, 70124 Bari, Italy
c Masmec spa, Via dei Gigli 21, 70026 Modugno (BA), Italy

Article info

Article history: Received 1 August 2016; Received in revised form 18 November 2016; Accepted 19 November 2016; Available online xxx.

Keywords: Measurement; Micro-features; Photogrammetry; Depth from focus; WireEDM; Scale; International standards

Abstract

Measuring 3D micro-features is a challenging task that is usually performed using high-cost systems, generally based on technologies with narrow working ranges and very accurate control of the sensor positions. Well-known image analysis methods, such as Photogrammetry, would likely lower the costs of 3D inspection of micro-features and add texture information to the 3D models; however, the behaviour of Photogrammetry is strongly affected by the scaling method, because it retrieves a model that must be scaled after its computation. In this paper, an experimental study of the validity of a hybrid 3D image processing method for measuring micro-features is presented. This method exploits the Depth-from-Focus method to retrieve the correct scale for a photogrammetric model. The measurement of properly-designed and manufactured specimens was performed by following and adapting the German guideline VDI/VDE 2634, Part 3, to validate the method using calibrated specimens. The proposed system has been demonstrated to be very promising and can achieve an error of less than 10 µm.

© 2016 Elsevier Inc. All rights reserved.

1. Introduction

Photogrammetry is a non-contact optical technique that can be used to obtain geometric information about physical objects through images captured from several points of view, using stereoscopic principles. Photogrammetry is used in several fields of application, which depend on the object-camera distance involved. The focus of the present work is the measurement of micro-features of objects that have a maximum size smaller than 10 millimetres. This field of application of Photogrammetry is still poorly explored: it has been used primarily in studies concerning the natural sciences, in particular the study of fossils or their fragments [1], in medical applications such as surgery [2] and the odontoiatric field [3], and in very few industrial research applications for the inspection of very small components [4–7]. The domain of Photogrammetry applied to micrometric metrology is promising as a result of the latest technological

advancements in hardware and software. It can reduce time and costs while maintaining the accuracy and precision of the results, and could compete with other, more expensive techniques such as interferometry, confocal microscopy, etc. As a measurement tool, Photogrammetry should be validated through a standard reference procedure for detecting and measuring the key parameters that characterize its performance. Despite the recent profusion of optical 3D digitizing systems, international standards for their validation are not available. The main guideline is the German VDI/VDE 2634 [8], widely used within the scientific community [3,9–24] in several applications but, to the best of the authors' knowledge, never for photogrammetry of micro-features. In the literature [1,4–6,11,18,25–33], it is stated that one of the most important problems of the photogrammetric technique is the attribution of the scale to reconstructed point clouds, due to an inherent limitation of the technique. The possible scaling methods are:

∗ Corresponding author. E-mail addresses: [email protected] (P. Gianluca), [email protected] (M. Francesco), [email protected] (F. Stefano).

1 Using a known distance between two markers within the images [1,4,11,18,25–34]: an algorithm recognizes the markers in all the photos and scales the entire object based on the known distance. This method is very functional for meso- to large-sized objects, but it has disadvantages for small measurement volumes: the higher the magnification, the lower the field of view, leading to two main issues: (i) the markers must be smaller, with increasing costs and technical problems in fabricating them; (ii) blurring involves larger image areas. In these conditions, marker detection becomes very difficult.

2 Placing the camera/s in known positions [18,35], or at a known distance from each other: this technique reproduces aerial Photogrammetry, where each photo is geo-mapped through GPS [5]. This condition can be reproduced in micro-measurements only through the use of expensive lab facilities and instruments that would increase the cost of the photogrammetric method.

These limitations have led the authors of the present paper to design a new scaling method with the advantage of measuring and computing the scale using the same photogrammetric equipment. This method hybridizes the Depth-from-Focus (DFF) method with photogrammetry, based on the identification of the focus plane, defined as the plane parallel to the sensor positioned at a distance equal to the focus distance of the camera. This paper describes the new scaling method and demonstrates its effectiveness through the application of the guideline [8], which regulates the acceptance tests for optical scanning systems. The guideline is applied to a low-cost photogrammetric system consisting of one SLR camera, one macro lens with optical extension tubes, and one motorized rotary table. The guideline [8] requires accurately manufactured workpieces with sizes comparable to the working range. Consequently, specifically-designed workpieces were manufactured using micro electro discharge machining (µ-EDM). This process was chosen for two reasons, namely: (i) high accuracy at the micro scale and (ii) non-reflecting surfaces caused by the presence of micro-craters induced by the process.

http://dx.doi.org/10.1016/j.precisioneng.2016.11.012 0141-6359/© 2016 Elsevier Inc. All rights reserved.
In Section 2, the experimental setup is described (Section 2.1), after which the hybrid scaling methodology is proposed (Section 2.2), followed by the adaptation of the guideline [8] to micro-Photogrammetry (Section 2.3). In Section 3, the results of the application of the method to selected workpieces are reported, and the discussion of the results follows in Section 4.

2. Materials and methods

2.1. Experimental setup

The photogrammetric system employs a Canon EOS 400D SLR camera with a Canon EF-S 60 mm f/2.8 macro lens mounted on a tripod stand. The camera has a resolution of 10.1 megapixels (3888 × 2592), with an aspect ratio of 3:2, while the sensor size is 22.2 mm × 14.8 mm. The camera is also equipped with a 36 mm extension tube to increase the magnification ratio. Unlike a lens, an extension tube does not change the lens optical system and allows low working distances, comparable to those of zoom lenses. On the other hand, it reduces the depth of focus and therefore requires more light intensity. This and other extension tubes on the same camera have already been studied in [4]. The images are captured remotely after adjusting the f-number aperture, defined as the ratio between the lens focal length and the diameter of the entrance pupil of the diaphragm. The f-number affects the depth of field and is therefore kept constant during the entire calibration and image acquisition process. Illumination is provided by a cold-light LED strip that provides a uniform distribution of the light. Finally, a stepper-motor rotary table, ISEL RF1 (Fig. 1), with an IT116G controller and a minimum settable step of 1°, is adopted to capture the object from several angles with a constant step. The experimental setup is very simple and low-cost. It will be demonstrated to be very promising when compared to more expensive methodologies, with savings of up to 70% compared to conoscopic holography, which is one of the cheapest technologies.

Fig. 1. Tools used in 3D acquisition of the object.

The photogrammetric technique for micro-features follows the methods reported in [4,6] and is divided into three main phases: (i) camera calibration; (ii) capturing and 3D reconstruction of the workpiece; (iii) scaling. Phase (i) was carried out with the method described in [4], employing the open-source computer vision library OpenCV, version 3.0.0. The 3D reconstruction of phase (ii) was performed using the image processing software Agisoft PhotoScan Pro, version 1.2.3. The 3D reconstruction of the object depends on the combination of the images captured by the camera at different positions: in the present paper, all the images have been captured at the minimum focusing distance, exploiting the rotation of the table and following a ring strategy. The reconstruction process in Agisoft PhotoScan is divided into two main phases: an alignment phase and a dense surface modelling phase. The alignment detects the key points on the images; the software then processes these data to estimate the external calibration parameters, i.e. the camera locations. This task is performed using the Scale Invariant Feature Transform (SIFT) [36], which recognizes features, invariant to image scaling and rotation, suitable for matching different images independently of camera locations. The output of the process is a cloud of key points, representing the local descriptors, that is the basis for the dense surface modelling step based on Structure from Motion (SfM) algorithms. In the alignment phase, the coordinates of a generic 3D keypoint (px, py, pz), together with its correspondences in the image (qx, qy), are used to compute the elements of the projection matrix [37]. Given λ as a scale factor, the generic 3D point will correspond to the i-th 2D point on the image according to:

\[ \lambda\, q = M p, \quad \forall \lambda \in \mathbb{R} \tag{1} \]



\[ \lambda \begin{bmatrix} q_x \\ q_y \\ 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \end{bmatrix} \begin{bmatrix} p_x \\ p_y \\ p_z \\ 1 \end{bmatrix} \tag{2} \]
with M defined as the transformation matrix. Since Eqs. (1) and (2) are verified for every real λ, there will be an infinite number of triples (px, py, pz) for each single keypoint in 3D space satisfying Eqs. (1) and (2). In photogrammetric practice, λ is estimated by the software, resulting in a random real number that must be modified


Fig. 2. Sequence of the proposed scaling method.

by the user a posteriori, scaling the point cloud as explained in Section 1.

2.2. The proposed scaling methodology

The scaling method proposed in this paper is shown in Fig. 2 and finds the factor λ through the following steps, under the assumption that the magnification ratio M (the ratio between the size of an object in the image and its true size) of the camera with the considered extension tube, and the lateral size L of the pixel, are known:

1 Capture of the set of photos of the workpiece, performed both with a large aperture, for example f/2.8 (Fig. 2a), and a smaller aperture (Fig. 2b). The smaller aperture must be chosen as a compromise between a depth of field as high as possible and a diffraction effect as low as possible. With the experimental setup exploited in the present paper, this aperture was estimated to be f/20.
2 Focus area detection (Fig. 2c) and subsequent detection of the focal plane, represented by the middle polyline passing through the focus areas, identified on the larger aperture (f/2.8) images (Fig. 2d and e).
3 Transposition of the polyline onto the same pixels of the smaller aperture (f/20) image (Fig. 2f).

4 Identification of at least two geometric points that lie on the polyline and measurement of the distance between these points in pixels (Fig. 2g).
5 Conversion of the distance in pixels into millimetres through the magnification ratio M and the pixel size L.
6 Provision of the distance in millimetres to the photogrammetric software, which computes the scale factor λ.

The pixel size L is a camera specification given by the camera supplier, while the magnification ratio M can be easily estimated experimentally. Fig. 2(a) and (b) show an example of the same image captured with f/2.8 and f/20, respectively. It is possible to notice the effect of the higher depth of field in image (b). The low depth of field (a) is used in the proposed method to detect the focal plane, i.e. the plane parallel to the sensor positioned at the focus distance. In fact, all the points of the image that are in focus are located inside a prism whose mid-plane coincides with the focal plane and whose thickness equals the depth of field. The focal curve, i.e. the curve resulting from the intersection between the focal plane and the workpiece, is the locus of the points belonging to the workpiece located at the focal distance from the sensor. The detection of the focus areas is possible by means of edge-detection filters, such as the Sobel filter, typically used


Fig. 3. Calibrated specimens used in the VDI/VDE 2634 guideline: a) sphere (http://www.aimess-products.de/produkte/targets/keramikkugel/); b) "ball-bar"; c) plane (http://www.aimess-products.de/produkte/pruefkoerper/ebenheitsnormal/).

in image processing (Fig. 2c shows the result of the Sobel filter applied with the freeware GIMP - GNU Image Manipulation Program). The basic consideration is that, after conversion into greyscale, images containing sharp edges result in a high intensity gradient of the greyscale image; when the image is blurred, the gradient is low [38]. Due to the prismatic geometry of the workpiece shown in Fig. 2, the focal curve degenerates into a polygonal chain, shown in red in Fig. 2d–g. On this curve, the real three-dimensional distance between two points on the workpiece coincides with the two-dimensional Euclidean distance between the corresponding pixels on the sensor, divided by the magnification ratio M. The Euclidean distance in micrometres can be easily computed knowing the two-dimensional coordinates of the points in the image reference system and the pixel size L. The last step for the attribution of the scale of the object is to impose the computed distance, in micrometres, between the same pixels on the smaller aperture (f/20) image (Fig. 2f), as shown in Fig. 2g. Step 6 is performed with Agisoft PhotoScan, using its markers: (i) the entire set of f/20 images is opened with the software; (ii) the image in Fig. 2b is extracted; (iii) two markers are placed in points C and H, respectively; (iv) through the creation of a Scale Bar, from the 3D view context menu, the distance CH in millimetres is imposed. Consequently, the whole 3D model is automatically scaled according to this distance.
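The pixel-to-millimetre conversion of steps 5 and 6, together with a Sobel-based focus check, can be sketched as follows. This is a minimal illustration with hypothetical helper names (the authors used GIMP and Agisoft PhotoScan, not this code); the pixel size assumes the 22.2 mm sensor width spread over 3888 pixels, and the distance is divided by M since M is the image-to-object size ratio:

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def sobel_magnitude(gray):
    """Gradient magnitude of a greyscale image; high values mark in-focus edges."""
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            patch = gray[i:i + h - 2, j:j + w - 2]
            gx[1:-1, 1:-1] += SOBEL_X[i, j] * patch
            gy[1:-1, 1:-1] += SOBEL_Y[i, j] * patch
    return np.hypot(gx, gy)

def pixel_distance_to_mm(p1, p2, magnification=1.8725,
                         pixel_size_mm=22.2 / 3888):
    """Steps 5-6: convert a pixel distance on the focal curve to object mm."""
    d_px = np.hypot(p1[0] - p2[0], p1[1] - p2[1])
    return d_px * pixel_size_mm / magnification
```

A distance of 1000 pixels on the focal curve then corresponds to roughly 3.05 mm on the workpiece with this setup.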

2.3. Validation with VDI/VDE 2634

2.3.1. Guideline for non-tactile measuring systems
The guideline [8] defines the procedure to evaluate optical measuring instruments by measuring the following parameters:
• Probing Error in Size and Sphere Spacing Error, using respectively the diameter of a sphere (Fig. 3a) and the distance between the centres of two spheres (Fig. 3b);
• Probing Error in Form and Flatness Measurement Error, using one sphere (Fig. 3a) and one plane (Fig. 3c), respectively.

According to [8], these tests must be performed on specimens measured with a known uncertainty level. These measurement results are obtained by instruments with uncertainties at least 4 times lower than those expected by the System under Test (SUT). The size of the specimens must have a fixed relationship to the measurement volume of the SUT (discussed in Section 2.3.2). The Probing Error in Size (PES) and the Sphere Spacing Error (SSE) are the most significant tests to validate the scaling method proposed by the authors, since the scaling method directly affects the size of the 3D point cloud. Consequently, only PES and SSE have been considered in this paper.

Fig. 4. Generation of the measurement volume of photogrammetric technique.

2.3.2. Working range computation
The measurement volume of a rotary photogrammetric system is not universally determined; it has a shape and size that depend on the relative position between the rotary table and the camera. Its shape is derived from the rotation of the focus plane, parallel to the camera sensor, around the table rotation axis. More precisely, the measurement volume is derived from the rotation of a prism, because the depth of field adds a virtual thickness to the focus plane (Fig. 4). However, because the depth of field is much lower than the lateral size of the sensor, it has been neglected for the volume computation. It must be noted that this assumption leads to the computation of measurement volumes smaller than those obtained without neglecting the depth of field. The consequence is


Fig. 5. Shape and size variation of the measurement volume of photogrammetric technique.

the need for smaller specimens, leading to stricter requirements for the SUT. The measurement volume may take several shapes and sizes, such as those shown in Fig. 5, depending on the configuration: the extension tube, which defines the frame size, the camera angle and the distance from the rotation axis. In Figs. 4 and 5, W is the x side of the frame, H is the y side of the frame, θ is the camera angle, and the distance of the frame centre from the rotation axis is shown in Fig. 5c and d. This feature makes the system highly flexible and adaptable to each workpiece but, on the other hand, the application of [8] must be carefully adapted. For simplicity, the authors have chosen to consider the measuring volume as a cylinder, following these assumptions: camera angle of π/4; centre of the frame positioned on the rotation axis of the table; base of the frame lying on the horizontal plane; negligible depth of field. As shown in Fig. 4, the reference cylinder will have a height h equal to the vertical projection of the y side of the frame (H), according to Eq. (3), and a diameter D equal to the horizontal projection of the frame diagonal, according to Eq. (4):

\[ h = H \sin(\pi/4) \tag{3} \]

\[ D = \sqrt{W^2 + \left(H \cos(\pi/4)\right)^2} \tag{4} \]

Consequently, the diagonal cross section L0 of the cylinder can be estimated with Eq. (5):

\[ L_0 = \sqrt{h^2 + D^2} \tag{5} \]
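Under the stated assumptions (θ = π/4, magnification ratio 1.8725, 22.2 mm × 14.8 mm sensor), Eqs. (3)-(5) can be evaluated numerically. The sketch below is illustrative: the exact form of Eq. (4) is partly garbled in the source, so the D and L0 computed here may differ from the values reported in the text, while W, H and h match them.

```python
import math

M = 1.8725                       # magnification ratio with the 36 mm extension tube
SENSOR_W, SENSOR_H = 22.2, 14.8  # Canon 400D sensor size, mm
theta = math.pi / 4              # camera angle

W = SENSOR_W / M                 # frame x side, mm (~11.856)
H = SENSOR_H / M                 # frame y side, mm (~7.904)

h = H * math.sin(theta)                          # Eq. (3): cylinder height
D = math.sqrt(W**2 + (H * math.cos(theta))**2)   # Eq. (4): cylinder diameter
L0 = math.sqrt(h**2 + D**2)                      # Eq. (5): diagonal cross section

# VDI/VDE 2634 size limits derived from L0
sphere_d_min, sphere_d_max = 0.02 * L0, 0.2 * L0
ball_bar_min = 0.3 * L0
```

With these limits, the 2 mm spheres and the 4.5 mm centre-to-centre distance of the ball-bar used later both fall inside the admissible ranges.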

The computation of W and H can be performed exploiting the magnification ratio for the 36 mm extension tube, which was experimentally estimated by the authors and found equal to 1.873. Since the sensor size of the camera is 22.2 mm × 14.8 mm, the frame size with the 36 mm extension tube is: x side W = 22.2/1.873 = 11.856 mm and y side H = 14.8/1.873 = 7.904 mm. The estimated frame size allows computing, through Eqs. (3)-(5): D = 12.562 mm, h = 5.589 mm and L0 = 13.749 mm. Once the measurement volume and the diagonal of the cylinder have been estimated, [8] describes the size limits for the specimens, shown in Fig. 7, to be used in the validation phase. In particular, the diameter of the sphere must be between 0.02 and 0.2 times L0, and the distance between the centres of the spheres of the ball bar must be greater than or equal to 0.3 × L0.

2.3.3. Magnification ratio estimation
The magnification ratio M has been estimated exploiting a simple sheet of graph paper, following a low-cost procedure performed by capturing the sheet from the zenithal position. On the resulting image, two generic points are considered and their distance, Dgr, is measured with the aid of the graph paper.

Table 1
Coaxial cylinder sizes.

Cylinder   Nominal Diameter [mm]   Nominal Height [mm]
1          1                       0.450
2          2                       0.750
3          3                       1.5
4          4                       3

Fig. 6. Zenithal image of graph paper to estimate M.

Fig. 7. Specimens manufactured with WireEDM: a) ball bar; b) coaxial cylinders.

Afterwards, Dgr is compared, with the aid of a third-party freeware image processing software, to its corresponding distance on the sensor, expressed in millimetres, Dsw (Fig. 6). In more detail, the steps are: (i) capture of one graph paper sheet from the zenithal position; (ii) through a freeware software for on-screen measurement and image capture, such as IC Measure, two points of the graph paper at a known distance are considered; their distance in millimetres, Dgr, is used as 2D calibration input for the software; (iii) through the calibrated 2D software, the diagonal of the image, Dsw, is measured in millimetres; (iv) the magnification ratio is computed as M = Dsen/Dsw, where Dsen is the diagonal size of the sensor as specified by the manufacturer: for the Canon 400D, Dsen = 26.681 mm. This procedure was applied to the photogrammetric system proposed in this paper, resulting in M = 1.8725.

2.3.4. Specimen fabrication
In order to evaluate the measurement performance of the close-range Photogrammetry equipment when the new scale method is applied, one specimen was fabricated following [8]: a ball-bar specimen presenting two spheres with a diameter of 2 mm, with the distance between their centres equal to 4.5 mm, connected by a cylindrical rod with a diameter of 0.7 mm (Fig. 7a). An additional sample was fabricated (Fig. 7b) exploiting a workpiece adopted in the literature [16,39–41] for evaluating the accuracy of measuring systems, consisting of four coaxial cylinders with the sizes shown in Table 1. The specimens were fabricated via Micro Electro Discharge Machining (µEDM), adopting the wire-EDM approach. The µEDM process was adopted because it combines low-reflectivity surfaces and high geometric accuracy. In fact, the process erodes material from the workpiece by a sequence of discharges that generates a random superimposition of craters.
The crater size strongly influences the surface roughness produced; adopting a very small discharge energy, of a few nanojoules, the crater size can be

Fig. 8. Ball-bar manufactured with “WireEDM”.

smaller than 10 µm, generating a surface roughness that can achieve values lower than Ra = 0.1 µm. The homogeneous distribution of the craters makes the eroded surface very suitable for digital close-range Photogrammetry, strongly reducing the picture noise. A state-of-the-art micro-EDM machine (Sarix µEDM SX 200 – Fig. 8) has been used to realize the two specimens. In particular, the machine is equipped with a wire-micro-EDM unit (Arianne, on Sarix SX 200). The tool is a brass wire with a diameter of 0.2 mm. For both specimens, the final shape has been obtained starting from a hardened steel rod with a diameter of 5 mm: the rod was first eroded to an approximation of the final shape using a roughing regime, leaving a stock allowance of 10 µm, while the final shape was obtained in a finishing regime adopting a nanojoule energy level.

2.3.5. Specimen certification
The specimen sizes were certified by Sitec, a certification laboratory permanently accredited by Accredia, the official Italian accreditation body, with licence no. 196 for length measurements. Tables 2 and 3 show the certified diameters and centre-to-centre distance for the ball-bar and the certified diameters and heights for the coaxial cylinders, respectively. The laboratory certified the expanded uncertainty U, with a confidence level equal to 95%, as:

\[ U = (3 + 2L)\ \mu\mathrm{m} \tag{6} \]

where L is the nominal length expressed in metres. Eq. (6) and the small sizes involved allow the contribution of L to be neglected, approximating the uncertainty to 3 µm. The measurement
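As a quick check of Eq. (6), the length term is indeed negligible for millimetre-scale specimens (L is expressed in metres):

```python
def expanded_uncertainty_um(L_m):
    """Eq. (6): certified expanded uncertainty in micrometres, L in metres."""
    return 3 + 2 * L_m

# Largest certified size here: the 4.5 mm ball-bar centre-to-centre distance.
# The length term contributes only 0.009 µm, justifying U ~= 3 µm.
U = expanded_uncertainty_um(0.0045)
```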



Table 2
Certified diameters and center-to-center distance for the ball-bar.

uncertainties stated in the certificates are expanded uncertainties with k = 2, i.e. twice the standard deviation, according to ISO 14025 (corresponding to a confidence level of about 95% in the case of a normal distribution).

3. Results

The validation of the measurement method has been performed according to the following workflow: (i) image capturing with smaller and larger apertures; (ii) image processing with the Sobel method using third-party free software; (iii) distance computation (pixels to mm); (iv) exploitation of Agisoft PhotoScan to process the images and obtain a 3D point cloud; (v) post-processing in Geomagic Control. Photogrammetric capturing is performed with a regular step α covering 360°, keeping the camera distance and tilt angle unchanged. The experimental setup was: tilt angle θ = 45°, approximate camera distance z = 230 mm and step α = 10°. The software Agisoft PhotoScan Pro version 1.2.3 has been exploited to align and compute the dense point cloud. After calibration, the parameters required by Agisoft PhotoScan have been set in the following way: (i) accuracy set to high, meaning that the software works with the images at the original size (the medium accuracy setting causes image downscaling by a factor of 4); (ii) image pair preselection disabled: with this setting, no predetermined couples of images are selected and the software explores point matching between all the possible image couples, because a ring strategy is used in capturing the images; (iii) no keypoint limit per photo: in image processing, keypoints are scale- and rotation-invariant interest points whose correspondence in more than one image allows the subsequent elaboration of 3D features; setting no limit maximizes the number of estimated corresponding points among the images, making as much data as possible available; (iv) no use of masks, since arbitrary shapes must be retrieved: masks specify the areas on the images which can otherwise be confusing to the software or lead to incorrect reconstruction results, and the texture of the specimens and the high contrast with the background avoid this need; (v) point cloud quality set to high, meaning that the images are downscaled by four times, to speed up elaboration times; (vi) no depth filtering while computing the depth maps, in order to keep the information included in the images unaltered. These parameters do not depend on the shape of the workpiece and their input can be fully automated by a batch procedure, resulting in a method that can be fully exploited in industrial applications. However, the high elaboration time does not allow real-time applications, limiting the method to off-line use. Measurements according to the standard [8] are to be made in different orientations and positions of the specimen in the measurement volume; therefore, seven measurements [8] of the ball-bar were performed (Fig. 9). Moreover, in order to deepen the investigation, the authors applied the same procedure to the cylinders, despite the fact that this is not prescribed by [8]. Measurements have been performed in a thermally stable lab [14,16,17,39,41–47], and the repeatability of the measurement process conditions, as defined in [9,42–46], has been guaranteed through the same procedure, same operator, same measuring system, same specimen and same conditions of use of the instrument and of the specimen in a short period of time. In Fig. 10, two examples of textured point clouds obtained with the proposed method are shown.

Fig. 9. Ball-bar and cylinder positions into the measurement volume.
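The capture geometry (tilt θ = 45°, camera distance z ≈ 230 mm, step α = 10°) is equivalent to a ring of relative camera viewpoints around the table's rotation axis. The helper below is one simple parameterization of that ring, purely illustrative and not part of the authors' pipeline:

```python
import math

def ring_viewpoints(step_deg=10, tilt_deg=45, distance_mm=230):
    """Relative camera viewpoints equivalent to rotating the table by a
    constant step: (x, y, z) positions on a ring around the rotation axis."""
    tilt = math.radians(tilt_deg)
    r = distance_mm * math.cos(tilt)   # ring radius in the table plane
    z = distance_mm * math.sin(tilt)   # height above the table plane
    views = []
    for k in range(0, 360, step_deg):
        a = math.radians(k)
        views.append((r * math.cos(a), r * math.sin(a), z))
    return views

views = ring_viewpoints()   # 36 viewpoints for a 10-degree step
```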

Table 3
Certified diameters and heights for the coaxial cylinders.



Table 4
Ball-bar point cloud results according to the photogrammetric technique.

SUT                   pos 1    pos 2    pos 3    pos 4    pos 5    pos 6    pos 7    Average   St. dev.   Certified
S1 Diameter (mm)      1.9994   1.9991   1.9956   1.9939   1.9985   1.9976   2.0026   1.9981    0.0028     2.0100
S2 Diameter (mm)      2.0103   2.0029   2.0001   1.9914   2.0038   2.0049   2.0140   2.0039    0.0073     2.0140
Distance S1-S2 (mm)   4.4990   4.4948   4.4859   4.4578   4.4917   4.4903   4.5098   4.4899    0.0161     4.4990
Number of points      236324   191031   232547   167151   214934   233784   232021   215399    –          –
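The sphere diameters in Table 4 come from least-squares best-fit spheres (computed in the paper with Geomagic Control). A generic algebraic sphere fit of the same kind can be sketched as follows; this is an illustration of the standard linearized formulation, not Geomagic's implementation:

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit.

    Uses the identity |p|^2 = 2 p.c + (r^2 - |c|^2), which is linear in the
    centre c and in the scalar d = r^2 - |c|^2, so an (N x 4) linear system
    can be solved directly.
    """
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre, d = x[:3], x[3]
    radius = float(np.sqrt(d + centre @ centre))
    return centre, radius
```

The probing error in size is then the fitted diameter minus the certified one, and the sphere spacing error follows from the distance between the two fitted centres.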

D3, the standard deviation has been indicated as ND due to the low number of samples involved in the computation. In Table 7 the errors on the diameter (PES) and on the heights (UPSE) are shown. 4. Discussion of results

Fig. 10. Example of tessellated and textured point clouds of the ball-bar and cylinders.

In order to reduce the effect of outliers, the guideline [8] states that for each surface fit it is possible to remove up to the 0.3% of the points. The maximum amount of points was removed by the authors from each surface to minimize the effect of outliers. Subsequently, two best-fit spheres were generated by the least squares method, for each point cloud belonging to each position, using the commercial software Geomagic Control. The results of the sphere dimensions measured with our methodology are presented in Table 4. In this table, for each different position of the specimen into the measurement volume, the S1 diameter, the S2 diameter, the Euclidean distance between the centres of S1-S2 and the number of points constituting the point cloud, are listed together with their average values and standard deviations. Afterwards, according to [8], the S1 and S2 diameters computed with the SUT were subtracted from the reference values of the calibrated ball-bar (Table 2) and shown in Table 5 as Probing Errors in Size for S1 and S2 (respectively PES S1 and PES S2) in the several positions. At the same time and with the same methods, the centres of S1 and S2 have been computed and their distance estimated (Table 5-SSE, Sphere Spacing Error). With the aid of the coaxial cylinders specimen, shown in Fig. 10b, the authors further investigated the performance of the scaling method. The concept of PES can be considered as corresponding to the cylinder diameter, while the SSE has been considered as corresponding to the Unidirectional Plane Spacing Error (UPSE), [16,39–41], computed as the difference in the height of each cylinder, measured with the SUT and the reference system. The heights are measured by fitting a plane to the point cloud of the two bases. Since the two planes have a geometric tolerance on parallelism, the distance between one random point on one basis and the other basis is computed. 
This procedure is repeated ten times, and the average of the distances is the value reported in Table 6. Table 6 also shows that in some cases it was not possible to fit the best-fit cylinders, owing to the high level of noise in some peripheral areas of the point cloud. The label ND indicates that the point cloud of the corresponding cylinder and position was too poorly defined to allow cylinder fitting.
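The height-measurement step described above (fit a plane to one base, then average point-to-plane distances taken from random points on the other base) can be sketched as follows; the function names and synthetic data are illustrative, not taken from the authors' toolchain:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point cloud.

    Returns (centroid, unit normal); the normal is the right singular
    vector associated with the smallest singular value of the centred set.
    """
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    _, _, vt = np.linalg.svd(P - centroid)
    return centroid, vt[-1]

def cylinder_height(base_pts, top_pts, n_samples=10, seed=0):
    """Average point-to-plane distance from random points on the top
    base to the plane fitted on the bottom base, as in the procedure
    described in the text."""
    rng = np.random.default_rng(seed)
    c, n = fit_plane(base_pts)
    sample = top_pts[rng.integers(0, len(top_pts), size=n_samples)]
    return np.abs((sample - c) @ n).mean()

# Synthetic example: two parallel bases 0.75 mm apart (an H2-like height)
rng = np.random.default_rng(1)
base = np.c_[rng.uniform(-1, 1, (500, 2)), np.zeros(500)]
top = np.c_[rng.uniform(-1, 1, (500, 2)), np.full(500, 0.75)]
h = cylinder_height(base, top)
```

Averaging several random samples, as the authors do, damps the effect of the parallelism tolerance between the two base planes.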

Analysing Table 4, the standard deviation is 2.8 μm for S1 and 7.3 μm for S2, indicating a more stable measurement on S1 than on S2. This may be associated with the location of S2, which always lies in more peripheral portions of the measurement volume than S1: S2 is always near the lower limit of the measurement volume and occupies the external regions of each frame, where lighting conditions are more difficult, yielding a slightly noisier point cloud than for S1. The lower stability of the measurement is confirmed by the S1–S2 distance, whose standard deviation is 16.1 μm. This larger value arises because the distance is computed from four estimates rather than one, namely the two sphere diameters and the two sphere-centre positions. The results can be considered satisfactory, also considering that position 4 behaves anomalously compared with the other positions: the values of S1, S2 and the S1–S2 distance are outliers at a confidence level of 99%. It is worth underlining that the number of points in the point cloud is lower in position 4 than in the other cases. The results in Table 5 show the PES and SSE for the ball-bar, obtained by subtracting the values measured with the reference system, used for calibration, from the corresponding values of the SUT. The worst values are again in position 4: 16.1 μm for the S1 diameter, 22.6 μm for the S2 diameter and 41.2 μm for the S1–S2 distance (absolute values). Another important consideration concerns the sign of the errors, which are negative in all cases except the SSE in position 7. This could suggest a systematic error induced by the scale: with a larger scale factor, the error should be smaller.
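The paper does not name the statistical test behind the 99% confidence claim for position 4; a two-sided Grubbs test is one standard choice for flagging a single outlier, sketched here on illustrative data (not the paper's measurements):

```python
import math
import numpy as np
from scipy import stats

def grubbs_outlier(values, alpha=0.01):
    """Two-sided Grubbs test for a single outlier.

    Returns the index of the flagged value, or None if the largest
    deviation from the mean is not significant at level alpha."""
    x = np.asarray(values, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    i = int(np.argmax(np.abs(x - mean)))
    g = abs(x[i] - mean) / sd
    # Critical value from the Student t distribution (Grubbs, 1969)
    t = stats.t.ppf(1.0 - alpha / (2.0 * n), n - 2)
    g_crit = (n - 1) / math.sqrt(n) * math.sqrt(t**2 / (n - 2 + t**2))
    return i if g > g_crit else None

# Seven illustrative 'positions', the last one clearly anomalous
flagged = grubbs_outlier([10.0, 10.1, 9.9, 10.05, 9.95, 10.02, 25.0])
clean = grubbs_outlier([10.0, 10.1, 9.9, 10.05, 9.95, 10.02, 10.03])
```

With only seven positions, as here, the test has limited power: Grubbs' statistic cannot exceed (n − 1)/√n ≈ 2.27 for n = 7, so only a pronounced anomaly is flagged at the 99% level.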
This hypothesis is not confirmed by the coaxial cylinders, which yield smaller errors in modulus but with both positive and negative signs. The errors for the cylinders are shown in Table 7: all values are below 10 μm, the maximum absolute value being 8.8 μm for the UPSE. The reference system has a standard uncertainty of 3/2 = 1.5 μm, approximately ten times lower than the worst standard uncertainty of the SUT, estimated as the worst standard deviation of the measurements, namely the SSE of 16.1 μm. On the other hand, all the photogrammetric measurements on the coaxial cylinders lead to uncertainties very close to those of the calibration certificate. Consequently, the SUT and the reference system have similar uncertainties, confirming the validity of the proposed method.

5. Conclusions

In this paper, a hybrid 3D image processing method for measuring micro-features has been presented. The method couples Digital Close Range Photogrammetry with Depth from

Please cite this article in press as: Gianluca P, et al. ‘Image analysis for 3D micro-features: A new hybrid measurement method. Precis Eng (2016), http://dx.doi.org/10.1016/j.precisioneng.2016.11.012


Table 5
Ball-bar errors compared to the scaled photogrammetric model.

Errors (μm) | pos 1 | pos 2 | pos 3 | pos 4 | pos 5 | pos 6 | pos 7 | Average | Std. Dev. | ABS MAX
PES S1      | −10.6 | −10.9 | −14.4 | −16.1 | −11.5 | −12.4 | −7.4  | −11.9   | 2.8       | 16.1
PES S2      | −3.7  | −11.1 | −13.9 | −22.6 | −10.2 | −9.1  | 0.0   | −10.1   | 7.3       | 22.6
SSE         | 0.0   | −4.2  | −13.1 | −41.2 | −7.3  | −8.7  | 10.8  | −9.1    | 16.1      | 41.2

Table 6
Point clouds related to the cylinders with the SUT (ND: the software could not fit the feature).

SUT          | pos 1    | pos 2    | pos 3    | pos 4    | pos 5    | pos 6    | pos 7    | Average  | Certified
D1 (mm)      | 1.0028   | ND       | 1.0042   | ND       | ND       | ND       | ND       | 1.0035   | 1.006
D2 (mm)      | 2.0026   | ND       | 2.0028   | ND       | ND       | ND       | ND       | 2.0027   | 2.010
D3 (mm)      | 3.0013   | ND       | 3.0020   | ND       | ND       | 3.0002   | 3.0050   | 3.0021   | 3.004
D4 (mm)      | 4.0018   | 4.0077   | 4.0026   | 4.0076   | 3.9982   | 4.0041   | 4.0055   | 4.0039   | 4.003
H1 (mm)      | 0.4501   | 0.4512   | 0.4494   | 0.4470   | 0.4479   | 0.4505   | 0.4493   | 0.4493   | 0.450
H2 (mm)      | 0.7517   | 0.7502   | 0.7515   | 0.7527   | 0.7452   | 0.7499   | 0.7485   | 0.7500   | 0.748
H3 (mm)      | 1.4758   | 1.4780   | 1.4748   | 1.4761   | 1.4702   | 1.4775   | 1.4783   | 1.4729   | 1.479
H4 (mm)      | 3.0003   | 3.0058   | 3.0019   | 3.0020   | 2.9973   | 3.0054   | 3.0049   | 3.0025   | 3.004
No of points | 2.8×10⁶  | 3.0×10⁶  | 3.0×10⁶  | 2.7×10⁶  | 2.7×10⁶  | 3.0×10⁶  | 2.9×10⁶  | 2.9×10⁶  | –

Table 7
Cylinder errors compared to the scaled photogrammetric model.

Errors (μm) | pos 1 | pos 2 | pos 3 | pos 4 | pos 5 | pos 6 | pos 7 | Average | Std. Dev. | ABS MAX
PES D1      | −3.2  | ND    | −1.8  | ND    | ND    | ND    | ND    | −2.5    | ND        | 3.2
PES D2      | −7.4  | ND    | −7.2  | ND    | ND    | ND    | ND    | −7.3    | ND        | 7.4
PES D3      | −2.7  | ND    | −2.0  | ND    | ND    | −3.8  | 1.0   | 1.9     | ND        | 3.8
PES D4      | −1.2  | 4.7   | −0.4  | 4.6   | −4.8  | 1.1   | 2.5   | 0.9     | 3.4       | 4.8
UPSE H1     | 0.1   | 1.2   | −0.6  | −3.0  | −2.1  | 0.5   | −0.7  | −0.7    | 1.5       | 3.0
UPSE H2     | 3.7   | 2.2   | 3.5   | 4.7   | −2.8  | 1.9   | 0.5   | 2.0     | 2.5       | 2.8
UPSE H3     | −3.2  | −1.0  | −4.2  | −2.9  | −8.8  | −1.5  | −0.7  | −3.2    | 2.8       | 8.8
UPSE H4     | −3.7  | 1.8   | −2.1  | −2.0  | −6.7  | 1.4   | 0.9   | −1.5    | 3.1       | 6.7

Focus. Photogrammetry is applied using commercial software, while the authors focused their attention on a new scaling method based on Depth from Focus. The guideline VDI/VDE 2634 was then adapted and applied to validate the method, exploiting specifically-designed workpieces manufactured by means of wire electro-discharge machining. The guideline-based tests gave sphere Probing Errors in Size up to 22.6 μm (abs) and Sphere Spacing Errors up to 41.2 μm (abs). To add further evidence, the authors also measured cylinders rather than spheres, to further verify the proposed method. These results improved the rating of the SUT, since the maximum error found was 8.8 μm (abs), very close to the uncertainty declared by the calibration certificate of the specimens. In conclusion, the proposed system is very promising; its uncertainty must be further investigated considering new experimental configurations, such as different lenses, extension tubes and higher-resolution cameras.

References

[1] Gallo A, Muzzupappa M, Bruno F. 3D reconstruction of small sized objects from a sequence of multi-focused images. J Cult Herit 2014;15(2):173–82.
[2] Brajlih T, Tasic T, Drstvensek I, Valentan B, Hadzistevic M, Pogacar V, et al. Possibilities of using three-dimensional optical scanning in complex geometrical inspection. Strojniški Vestn – J Mech Eng 2011;57(11):826–33.
[3] Robinson A, McCarthy M, Brown S, Evenden A, Zou L. Improving the quality of measurements through the implementation of customised standards. In: Proc. 3rd Int. Conf. 3D Body Scanning Technol.; 2012. p. 235–46.
[4] Percoco G, Sánchez Salmerón AJ. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view. Meas Sci Technol 2015;26(9):095203.
[5] Galantucci LM, Pesce M, Lavecchia F. A powerful scanning methodology for 3D measurements of small parts with complex surfaces and sub millimeter-sized features, based on close range photogrammetry. Precis Eng 2015;43:211–9.
[6] Percoco G, Lavecchia F, Salmeron AJS. Preliminary study on the 3D digitization of millimeter scale products by means of photogrammetry. Procedia CIRP 2015;33:257–62.
[7] Atsushi K, Sueyasu H, Funayama Y, Maekawa T. System for reconstruction of three-dimensional micro objects from multiple photographic images. CAD Comput Aided Des 2011;43(8):1045–55.
[8] VDI/VDE 2634: Optical 3-D measuring systems. VDI/VDE guideline; 2011. p. 1–20.
[9] Guidi G. Metrological characterization of 3D imaging devices. Proc SPIE 2013;8791:87910M.
[10] Beraldin J-A, Gaiani M. Valutazione delle prestazioni di sistemi di acquisizione tipo 3D active vision: alcuni risultati. DDD – Riv Trimest Disegno Digit e Des, Poli Design; 2003. p. 115–28.
[11] Ahmadabadian AH, Robson S, Boehm J, Shortis M, Wenzel K, Fritsch D. A comparison of dense matching algorithms for scaled surface reconstruction using stereo camera rigs. ISPRS J Photogramm Remote Sens 2013;78:157–67.
[12] Grzelka M, Budzik G, Marciniak L, Gapiński B. Accuracy of the photogrammetric measuring system for large size elements. Arch Foundry Eng 2011;11(2):75–80.
[13] Bathow C, Breuckmann B, Scopigno R. Verification and acceptance tests for high definition 3D surface scanners. In: 11th Int. Conf. Virtual Reality, Archaeol. Cult. Herit.; 2010. p. 9–16.
[14] Beraldin J-A. Digital 3D imaging and modeling: a metrological approach. Time Compression Technol Mag 2008:33–5.
[15] Carmignato S, Savio E, Verdi M. Standardizzazione e verifica dell'accuratezza dei sistemi ottici; 2004. p. 1–6.
[16] Guidi G, Russo M, Magrassi G, Bordegoni M. Performance evaluation of triangulation based range sensors. Sensors 2010;10(8):7192–215.
[17] Beraldin J-A, Mackinnon D, Cournoyer L. Metrological characterization of 3D imaging systems: progress report on standards developments. Int Congr Metrol 2015;3:1–21.
[18] Toschi I, Nocerino E, Hess M, Menna F, Sargeant B, Macdonald L, et al. Improving automated 3D reconstruction methods via vision metrology. Proc SPIE 2015;9528:95280H. https://www.scopus.com/inward/record.uri?eid=2-s2.084954178720&partnerID=40&md5=3661c057a41f70876ec96b7bd1fafb97.
[19] Hess M, Robson S. 3D imaging for museum artefacts: a portable test object for heritage and museum documentation of small objects. Int Arch Photogramm Remote Sens Spat Inf Sci – ISPRS Arch 2012;39:103–8.
[20] Hess M, Robson S, Hosseininaveh Ahmadabadian A. A contest of sensors in close range 3D imaging: performance evaluation with a new metric test
object. ISPRS Arch Photogramm Remote Sens Spat Inf Sci 2014;XL-5(June):277–84.
[21] Robson S, Beraldin J-A, Brownhill A. Artefacts for optical surface measurement; 2011.
[22] Iuliano L, Minetola P, Salmi A. Proposal of an innovative benchmark for comparison of the performance of contactless digitizers. Meas Sci Technol 2010;21(10):105102.
[23] Luhmann T, Wendt K. Recommendations for an acceptance and verification test of optical 3-D measurement systems. Int Arch Photogramm 2000;XXXIII:493–500.
[24] Estler WT, Edmundson KL, Peggs GN, Parker DH. Large-scale metrology – an update. CIRP Ann Manuf Technol 2002;51(2):587–609, http://dx.doi.org/10.1016/S0007-8506(07)61702-8.
[25] Laasonen M. Recognition of building parts from measured data. Measurement 2005.
[26] Estler WT, Edmundson KL, Peggs GN, Parker DH. Large-scale metrology – an update, vol. 1, no. 2.
[27] Fiani M, Menna F, Troisi S. Integrazione di tecniche di fotogrammetria e laser scanning per la modellazione 3D della carena di una imbarcazione. Boll Soc Ital Fotogramm Topogr 2008;1:39–58.
[28] Klein L, Li N, Becerik-Gerber B. Imaged-based verification of as-built documentation of operational buildings. Autom Constr 2012;21(1):161–71.
[29] González-Jorge H, Riveiro B, Arias P, Armesto J. Photogrammetry and laser scanner technology applied to length measurements in car testing laboratories. Meas J Int Meas Confed 2012;45(3):354–63.
[30] Luhmann T. Close range photogrammetry for industrial applications. ISPRS J Photogramm Remote Sens 2010;65(6):558–69.
[31] Abbas MA, Setan H, Majid Z, Chong AK, Lichti DD. Improvement in measurement accuracy for hybrid scanner. IOP Conf Ser Earth Environ Sci 2014;18:012066.
[32] Jiang R, Jáuregui DV, White KR. Close-range photogrammetry applications in bridge measurement: literature review. Meas J Int Meas Confed 2008;41(8):823–34.
[33] Russo M, Remondino F. Principali tecniche e strumenti per il rilievo tridimensionale in ambito archeologico; 2011. p. 337–63.
[34] Cultural Heritage Imaging. Guidelines for calibrated scale bar placement and processing; 2015. p. 1–11.
[35] Agisoft LLC. Agisoft PhotoScan user manual; 2011. p. 37.
[36] Lowe DG. Object recognition from local scale-invariant features. Proc Seventh IEEE Int Conf Comput Vis 1999;2(8):1150–7.
[37] Hartley R, Zisserman A. Multiple view geometry in computer vision; 2004.
[38] Koik BT, Ibrahim H. A literature survey on blur detection algorithms for digital imaging. In: 2013 1st Int. Conf. Artificial Intelligence, Modelling and Simulation; 2013. p. 272–7.
[39] Russo M, Magrassi G, Guidi G, Bordegoni M. Characterization and evaluation of range cameras. In: 9th Conf. Opt. 3-D Meas. Tech.; 2009. p. 149–58.
[40] Beraldin J-A, El-Hakim S, Blais F. Performance evaluation of three active vision systems built at the National Research Council of Canada; 1995. p. 352–61.
[41] Vagovsky J, Buransky I, Gorog A. Evaluation of measuring capability of the optical 3D scanner. Energy Procedia 2015;100(C):1198–206.
[42] Beraldin J-A, Blais F, El-Hakim S, Cournoyer L, Picard M. Traceable 3D imaging metrology: evaluation of 3D digitizing techniques in a dedicated metrology laboratory. In: Proc. 8th Conf. Opt. 3-D Meas. Tech.; 2007. p. 310–8.
[43] Beraldin J-A, Carrier B, Mackinnon D, Cournoyer L. Characterization of triangulation-based 3D imaging systems using certified artifacts. NCSL Int Meas J Meas Sci 2012;7(4):50–60.
[44] Carrier B, Mackinnon D, Cournoyer L, Beraldin J-A. Proposed NRC portable target case for short-range triangulation-based 3-D imaging systems characterization; 2011.
[45] MacKinnon D, Carrier B, Beraldin J-A, Cournoyer L. GD&T-based characterization of short-range non-contact 3D imaging systems. Int J Comput Vis 2013;102(1–3):56–72.
[46] Carrier B, MacKinnon DK, Cournoyer L. Performance evaluation of 3D imaging systems based on GD&T. Manuf Lett 2013;1(1):9–12.
[47] Paakkari J, Moring I. Method for evaluating the performance of range imaging devices. Proc SPIE 1993;1821:350–6.
