Robotics and Computer-Integrated Manufacturing 37 (2016) 49–56
Contents lists available at ScienceDirect
Robotics and Computer-Integrated Manufacturing journal homepage: www.elsevier.com/locate/rcim
Review
A performance evaluation methodology for robotic machine tools used in large volume manufacturing

J.D. Barnfather, M.J. Goodfellow, T. Abram

Rolls-Royce Plc. Civil Nuclear, PO Box 2000, Raynesway, Derby DE23 7XX, UK
The Department of Mechanical, Aerospace and Civil Engineering, The University of Manchester, Pariser Building, Sackville Street, M13 9PL, UK
Article history: Received 24 November 2014; Received in revised form 14 June 2015; Accepted 17 June 2015

Abstract
The manufacture of large components in various industries can create health, safety and economic challenges as a result of machining operations. A potential solution is the replacement of conventional large machine tools with low-cost and portable robotic machine tools, although these lack accuracy and precision in comparison. Effort is therefore required to develop robotic machining technology to a state where it can be implemented in high tolerance applications using a variety of materials. A barrier to implementation is that no standardised procedure is available to robustly assess current robotic machining performance, which makes it challenging to assess the impact of technological developments. This paper develops a methodology to determine robotic machining performance based upon reviews of the standards that currently specify such guidelines for robotics and machine tools independently. It is found that useful elements from each theme can be combined and applied to robotic machine tool performance evaluation. These are presented here with the aim of forming a conceptual foundation on which the technology can be developed for large volume manufacturing. © 2015 Elsevier Ltd. All rights reserved.
Keywords: Machining; Robot; Standards; Performance
Contents
1. Introduction
2. Case study
3. Robot performance evaluation standards
4. Machine tool performance evaluation standards
5. Proposed methodology
   5.1. Static performance evaluation methodology
        5.1.1. Test geometry
        5.1.2. Experiment format
        5.1.3. Statistical analysis
   5.2. Dynamic performance evaluation methodology
        5.2.1. Test geometry
        5.2.2. Statistical analysis
6. Summary and conclusions
Acknowledgments
References
Corresponding author at: The Nuclear Advanced Manufacturing Research Centre, The University of Sheffield, The Advanced Manufacturing Park, Brunel Way, Catcliffe, Rotherham S60 5WG, UK.
http://dx.doi.org/10.1016/j.rcim.2015.06.002
0736-5845/© 2015 Elsevier Ltd. All rights reserved.

1. Introduction
Production of large components in the oil and gas, aerospace, defence, marine, rail and energy sectors is highly capital intensive due to the requirement for large machine tools. For example,
pressure vessels in nuclear power plants can reach 20 m in height and 5 m in diameter, weighing up to 560 tons [1]. It is beneficial to develop low cost "process-to-part" robotic machining technology as an alternative to conventional machine tools. With robotic machine tools, there is potentially less capital investment, shorter lead times, no need to remount parts after inspection for defect correction and less dependence on heavy lifting. A barrier to implementation is the dimensional errors associated with robotic machining, partly due to the relatively low dynamic stiffness and low resistance to machining forces [2,3]. Joint stiffness is specifically highlighted as an influence on part quality by Dumas et al.; this metric is difficult to obtain from robot manufacturers and user evaluation is recommended [4]. Geometrical errors in robot links and joints cause assembly misalignments and also influence positional accuracy, justifying robot-specific kinematic model development for compensation, as discussed by Weill et al. [5]. Kinematic modelling challenges are faced for unconventional parallel robot structures as they are often complex and have many joints, despite their stiffness benefits [6–8]. Gong et al. suggest that non-geometric robot errors should also be offset by accounting for thermal variations and joint flexibility under load [9]. Thermal concerns are supported by Kamrani et al. [10]. Olabi et al. highlight that trajectory planning is a key non-geometric contributor to path error [11]. This issue, and robot feed rate accuracy as assessed by Young and Pickin [12], are key research areas for improving machined surface quality. Conventional machine tool issues, including tool deflection, gear backlash and wear [11], are exacerbated in robotic machining due to structural differences [13]. Robot machining difficulties are widely covered in the literature, with further notable research documented in [14–28].
Adopting robotic machining technology therefore depends on these issues being overcome through research and development of methods to offset the dimensional errors. A standardised performance evaluation procedure is desirable to facilitate a universally credible and repeatable assessment of technology improvements [29]. A challenge faced in robotic machining is that it is not supported by international codes and standards, although robotics and machine tools are supported independently. This issue is highlighted in [30], where it is noted that machine tool standards are typically not applicable to robotic kinematic structures. This paper therefore reviews the performance evaluation standards available in the two fields to determine which elements can best be applied to the combination of technologies to assess errors unique to robotic machining. For example, instead of linear guideways and gantries there are unique arrangements of joints and actuators whose sensitivity to error is not necessarily exposed using non-specific test geometry. The outcome of this paper is a robotic machining performance assessment methodology, which will reduce the barrier to usage by providing a means of quantifying the machining tolerance range of a particular system. A performance assessment methodology is developed by reviewing robotics and machining standards, covering evaluation under both static and dynamic conditions. Static performance, i.e. robot positional accuracy and precision without the effect of machining dynamics, is considered because it allows a basic idea of performance to be gained at low cost. Static assessment allows application suitability to be quickly identified: if static errors consume a large amount of the tolerance budget, it is unlikely that tolerances would be met once additional dynamic error sources are introduced.
Static experiments also allow an insight to be gained into robot-specific error sensitivity, which can aid the selection of test geometry in machining performance studies.
This paper initially presents a case study in Section 2 to highlight the potential application of robotic machine tools in large component manufacturing. Robotics and machining standards are then reviewed in terms of procedures, theory and test geometry in Sections 3 and 4, respectively. A robotic machine tool performance evaluation methodology is then proposed by combining relevant standards in Section 5, with a final summary and conclusions being given in Section 6.
2. Case study

To set the context for robotic machine tool performance evaluation methodology development, two hypothetical feature-level machining operations are considered on a large vessel. Firstly, instrument penetrations must be machined in the vessel. Secondly, in preparation for hydrostatic pressure testing, these penetrations are capped with a welded plate, which must be machined off after the test [31]. Conventionally, instrument penetration and cap machining would be done on a horizontal milling machine sized to mount the entire vessel, according to the following steps [32]:

1. Vessel lifting and orientation.
2. Vessel mounting to machine tool.
3. Execute machining operation.
4. Reiterate Steps 1–3.
Alternatively, machining operations could be done using a robotic machine tool. This would only be sized to have a working envelope large enough to cover the individual features being machined and would not need to be built into a larger structure supporting the entire component. In this case, the following steps would be taken:

1. Lift vessel into work zone.
2. Secure vessel in place.
3. Position robot in feature region.
4. Execute machining operation.
5. Reiterate Steps 3 and 4.
Conventionally, the component is moved and reoriented on the machine tool in situations where multiple vessel features are inaccessible from one direction. This requires multiple machine set-ups, which demand heavy lifting, with associated health and safety risks that increase in severity with vessel scale. In robotic machining, only the robot is repositioned around the vessel to machine individual features. This can be achieved using large volume measurement systems, and the use of additional programmable axes is not necessary. Robotic machine tools can be used in this way regardless of overall component scale, as they are simply repositioned to the feature region of interest. Overall, robotic machining offers an opportunity for cost reduction, although structural differences mean that the standards used to assess machine tool performance are not applicable [30].
3. Robot performance evaluation standards A range of standardisation organisations offer robotics-based guidelines [33–39], covering issues surrounding applications far beyond those associated with industrial usage. Those that focus on industrial robotics address issues concerning the generic subsystems and components that could be applicable to a range of engineered systems as well as the following:
- Vocabulary
- Programming
- Health and safety
- Coordinate systems
- Mechanical interfaces
- Automation
- Communication
- Object handling
- User interfaces
- Control
- Integrated manufacturing systems
- Intelligence
- Design for modularisation
- Electrical equipment

The standards most applicable to performance evaluation are those that address static positional error analysis [40]:

- ISO 9283: Manipulating Industrial Robots – Performance Criteria and Related Test Methods [41]
- ISO 9946: Manipulating Industrial Robots [42]
- VDI 2861 Parts 1–3 [43–45]
- ANSI/RIA R15.05 Parts 1–3 [46–48]
The relevant standards from ANSI specify methodologies similar to those in the primary robot performance evaluation standard offered by ISO, i.e. ISO 9283, in terms of statistical techniques and test geometry. The same can be said for the most relevant VDI standards. Overlap between standards organisations means that it is sufficient for a methodology to be developed from the key elements of ISO 9283. Note that there are also regional variations and translations of ISO standards, although they do not require individual consideration as they are fundamentally the same, given that ISO represents over 160 member countries [49]. ISO 9283 specifies procedures, theory and geometry for conducting error analysis experiments and computing values for
Fig. 1. ISO 9283 test geometry.
accuracy and repeatability, as well as other metrics. The test methods most relevant to low cost static error indication are those concerning positional error analysis. These methods involve directing the robot's tool centre point to various positions for 30 cycles, at various percentages of maximum feed rate, at 100% of the rated load, in a temperature controlled environment, and taking measurements at each position. However, for the purpose of robotic machining performance assessment, the feed rates selected should be representative of usage to better understand real world performance. Loading to 100% of the rated load may not represent machining conditions, so it is unnecessary to apply any load greater than that experienced during machining. The standard also specifies the tool orientation to be perpendicular to the diagonal planes in a cuboid, although this is unnecessary for static proof of concept testing as the results required are achieved regardless. The geometry specified for these positions is shown in Fig. 1 and covers the maximum possible cuboid in the robot's working volume, with 10% subtracted from its diagonal distances to determine test position coordinates. However, only 5 poses are required, which is extremely simplistic considering that robot error may be position dependent. This suggests that the ISO 9283 experimental methodology is unlikely to capture performance reliably (Fig. 2). To gain a more complete initial insight into errors across the working volume, the geometry should be adapted to test all cuboid corners as well as the centre point. High density error mapping would also give a more thorough study of configuration dependent error. It has also been observed by Agheli and Nategh [8] that dynamic models break down at the limits of a robot's working envelope, where the maximum errors are observed, suggesting that best accuracy may be gained working in a smaller volume than is available.
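The adapted test geometry suggested above (all eight cuboid corners plus the centre point) can be sketched as follows; the per-axis 10% inset and the coordinate handling are illustrative assumptions rather than the exact ISO 9283 prescription, which defines the reduction along the cuboid diagonals:

```python
from itertools import product

def cuboid_test_poses(x_range, y_range, z_range, inset=0.10):
    """Generate 9 test poses: the 8 corners of the largest available
    cuboid in the working volume, shrunk by `inset` about its centre
    (loosely mirroring ISO 9283's 10% reduction), plus the centre
    point itself."""
    ranges = (x_range, y_range, z_range)
    centre = [(lo + hi) / 2 for lo, hi in ranges]
    half = [(hi - lo) / 2 * (1 - inset) for lo, hi in ranges]
    corners = [
        tuple(c + s * h for c, s, h in zip(centre, signs, half))
        for signs in product((-1, 1), repeat=3)
    ]
    return corners + [tuple(centre)]

# Hypothetical working-volume limits in mm.
poses = cuboid_test_poses((0, 1000), (0, 800), (200, 900))
```

For higher-density error mapping, the same pattern extends naturally to a regular grid of poses between the cuboid limits.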
The threshold for accuracy breakdown is not quantified, but the observation suggests that test geometry so close to the edge of the robot's working envelope may yield unrepresentatively poor results. Further error mapping exercises are therefore justified to investigate errors across the working volume using a similar overall methodology but with a higher density of pose coordinates, to more robustly understand performance. The statistical methods prescribed by ISO 9283 allow accuracy
and repeatability to be computed from experimental data, although guidelines are not given for studying performance drift. In ISO 9283, accuracy is defined as the difference between the mean measured and commanded coordinates and quantifies the systematic errors, which are constant and can be calibrated against. Systematic errors are classified into span errors, which vary linearly over distance; zero errors, which originate from differences between the true and real zero position of encoders; and non-linear errors, which have a non-linear relationship with a particular variable. However, there is no provision for identifying error types in ISO 9283. Repeatability is defined in ISO 9283 as the variance between measured coordinates and accounts for random errors, which are less easily calibrated against because the influencing factors are not constant. Statistically, the theory specified in ISO 9283 is similar to that of gage repeatability and reproducibility studies [50], which assess variance of measurement results between multiple operators and workpieces. Whilst it could be argued that gage repeatability and reproducibility theory may be adaptable to robot performance analysis, it is more applicable to assessing whole measurement procedures used in production rather than the equipment specifically. The ISO 9283 method of computing accuracy and repeatability is common across many areas of metrology and manufacturing performance evaluation, all of which are traceable to JCGM 100:2008 Evaluation of Measurement Data – Guide to the Expression of Uncertainty in Measurement [51]. Traceability to JCGM 100:2008 means that adaptations are not necessary, although outlier detection and removal may be beneficial. Nevertheless, the equations used to calculate these metrics will be given in Section 5 when proposing the final performance evaluation methodology.
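Where outlier removal is applied before computing these metrics, a robust filter is preferable, since a mean-and-standard-deviation rule can be masked by the very outlier being screened. A minimal sketch follows; the modified z-score method and its 3.5 threshold are a common convention, not something ISO 9283 or JCGM 100:2008 prescribes:

```python
from statistics import median

def remove_outliers(values, threshold=3.5):
    """Drop readings whose modified z-score (based on the median and
    the median absolute deviation, MAD) exceeds `threshold`. Using the
    median makes the filter robust to the outliers being screened."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:  # readings (nearly) identical: nothing to flag
        return list(values)
    return [v for v in values if abs(0.6745 * (v - med) / mad) <= threshold]

# One spurious tracker reading among repeated pose measurements (mm).
filtered = remove_outliers([10.01, 10.02, 9.99, 10.00, 10.01, 14.7])
```

The surviving readings then feed the accuracy and repeatability calculations unchanged.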
In summary, this section gives an overview of the most relevant standard procedure for evaluating a robot's static systematic and random errors. Static error analysis is useful for indicating performance at low cost and highlights areas where specific error sensitivities may be exposed with optimised test geometry. General test and analysis concepts from ISO 9283 could also be adapted and applied to investigations exploring the effect of a range of other factors influencing error in a non-resource-intensive manner, e.g. robot configuration, performance drift and feed rate. Findings from the review of ISO 9283 therefore form a key element of the overall methodology developed.
4. Machine tool performance evaluation standards
Fig. 2. NAS 979 artefact.
To allow a full robotic machine tool performance evaluation methodology to be developed that takes into account dynamic errors, the standardised guidelines available for conventional machine tools are reviewed. With machine tool standards, the situation is similar to robotics in that a range are available covering general design issues and specific details of components and subsystems, but fewer are specifically related to performance evaluation, i.e. [33–39,52]. Of those that are related to performance evaluation, a large proportion are machine specific or equivalents of ISO standards. Bearing in mind that this review is motivated by the need to adapt performance evaluation methodologies to robotic machine tools, it is unnecessary to review variations from ISO in great detail; it is more important to apply the generic concepts to robotic machining. The standards from ISO are therefore used for this purpose. Initially considered is ISO 10791 – Test Conditions for Machining Centres. Parts 1–3 [53–55] prescribe test methods to evaluate the geometrical conformance of conventional milling machines to design specifications and do not propose methods for
evaluating machining error. Concepts here may be able to determine a robot's build quality but are largely irrelevant for performance evaluation methodology development. ISO 10791 Parts 1–3 are therefore not directly transferable to robotic applications. Other error analysis themed parts of ISO 10791 are closely based on those in ISO 230 – Test Code for Machine Tools [56–60], making further consideration unjustified. However, when examining relevant parts of ISO 230, it is found that the procedures documented are primarily focused on assessing the linear and rotary axes of machine tools. Whilst it could be argued that the methods presented could be modified to suit robotic machining applications, ISO 9283 effectively does the same. The ISO 230 series and the standards closely related to it therefore do not add value, as they do not consider dynamics or how axis errors relate to part errors. The procedures specified are therefore unable to be used to determine the base case performance of a robotic machine tool or to identify other challenges, for example chatter, as highlighted in the work of Pan et al. [23]. Evaluation based on test piece machining using geometry relevant to robotics is therefore necessary. Standard artefacts available for machining performance evaluation include those specified in ISO 10791-7, although their geometry is focused on axis error exposure in small scale machining centres with an additional rotary axis. Another option is from the AIA/NAS 979 – Uniform Cutting Tests – NAS Series Metal Cutting Test Specifications standard for machine tools [61], shown in Fig. 2. NAS 979 geometry is similar to that of ISO 10791-7 but has more features and may expose errors more effectively. This artefact can be used to assess performance when machining various prismatic features in sub-regions of large components, as it is made up of fundamental geometrical elements.
It cannot be used to directly calibrate axes due to structural differences between machine tools and robots. Difficulty will be faced when using the NAS 979 artefact to assess the ability to cut specific shapes, as there would not be enough data to gain statistical confidence, meaning that error sensitivity for some applications can only be indicated at a basic level. Nevertheless, any artefact can be scaled to suit the working envelope of the robot under test to allow errors to be exposed across the working volume, although this may be undesirable economically. A more robust solution is to design an artefact to expose the robot's specific errors, i.e. the span, zero or non-linear errors identified during static error mapping, and use it to perform experiments over the full positional and angular range to allow a more thorough understanding of machinable tolerance ranges. Using a custom designed artefact is beneficial because those specified in any standard favour non-machine-specific geometry as a means of comparing machines. Also, they do not evaluate performance robustly, as the performance metrics computed only relate to that particular artefact and are not representative of the tolerance ranges to which other parts could be machined. If the aim of robotic machining trials is to fully understand performance and compute a general performance metric that encompasses all machining variables, e.g. position in working envelope, feed rate, cut depth, etc., then it would be necessary to employ a Design of Experiments approach when machining a custom artefact [62]. This would facilitate systematic testing of each possible combination of variables to achieve statistical robustness and reliability in the performance metric computed. Testing to this extent would require significant time and monetary resource investment.
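The scale of that investment is easy to see by enumerating a full-factorial design; the factor names and level counts below are hypothetical:

```python
from itertools import product

# Hypothetical factor levels for a full-factorial machining study.
factors = {
    "envelope_position": ["near", "mid", "far"],
    "feed_rate_mm_min": [500, 1000, 2000],
    "cut_depth_mm": [0.5, 1.0, 2.0],
    "spindle_speed_rpm": [8000, 12000],
}

# Every combination of levels must be machined and measured at least
# once, so the run count grows multiplicatively with each factor added.
runs = list(product(*factors.values()))
print(len(runs))  # 3 * 3 * 3 * 2 = 54 test cuts before any replication
```

Adding a single extra factor or level multiplies the number of required cuts, which is why the full-factorial route is recommended against below.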
Given that industrial interest in robotic machining is likely to originate from known, specific applications, it is recommended that the Design of Experiments approach is not taken. For a less resource-intensive assessment, technology should be assessed for particular tasks, i.e. large component features, rather than for general purpose. Geometry specific testing would also eliminate
the need for custom artefact design, whilst still providing the understanding necessary to assess technology developments that aim to counteract the inherent inaccuracies of robotic machining. Nevertheless, general indications of performance could be gained using the NAS 979 artefact, which is practically beneficial in screening industrial applications. Robotic machine tool performance evaluation should therefore be done using both standard and application specific geometry. Once machined, the artefacts are measured and generic statistical techniques can be used to determine performance metrics for each, to assess any difference and set a base case. Note that statistical robustness is gained here from the large sample size of final cuts, meaning that it is not necessary for a large batch of artefacts to be produced to conduct the study. A suitable standard for statistical analysis, which is independent of machine structure, is ISO 22514-3:2008 – Statistical Methods in Process Management – Capability and Performance: Machine Performance Studies for Measured Data on Discrete Parts [63]. This standard is focused on short term performance assessment for discrete parts rather than a characteristic monitored over an extended period of time, which is why process capability indices are not applicable. This section reviews relevant machine tool standards and highlights key elements applicable to the development of a performance evaluation methodology for robotic machine tools. This supplements the previous section by identifying ways to assess the sum of the dynamic and static errors present during robotic machining processes, which ultimately accumulate as deviation between a part's nominal and machined dimensions. The benefits of bespoke test piece design have also been highlighted, as well as the associated practical limitations of using such designs to understand a robotic machine tool's general performance.
Using a standard artefact and real feature geometry is therefore suggested for machining trials. These observations will be used in the following section to compile a final robotic machine tool evaluation methodology.
5. Proposed methodology

From the review and practical experience with the equipment, a robotic machine tool performance evaluation methodology is proposed to assess error accumulation in machined parts. This methodology ultimately serves to determine the base case performance of a robotic machine tool to facilitate the assessment of solutions developed to improve the machinable tolerance range. Sections 5.1 and 5.2 therefore discuss the procedural and statistical steps for performance assessment, aided by a laser tracker whose coordinate system has been aligned to the robot's, in terms of static and dynamic tests, respectively. A prerequisite to conducting the recommended study is the definition of the tool centre point for each cutting tool used. The methodologies follow the procedures summarised in Figs. 3 and 4.

5.1. Static performance evaluation methodology

The proposed methodology for static performance evaluation is documented in the following sub-sections and covers test
Fig. 3. Static evaluation procedure flowchart.
geometry, experimental format and statistical analysis, and serves as a basis for conducting static performance experiments to gain an initial idea of performance at low cost. For further reference, the annex of ISO 9283 provides practical guidance relevant to the method proposed and examples of a test report.

5.1.1. Test geometry

The geometry in Fig. 5 should be used for robot programming. This geometry is adapted from ISO 9283 to move to 9 poses, P1–P9, in the largest available cuboid in the working volume. For error mapping, the geometry can be expanded by adding equally spaced poses in each axis. For experimentation studying the effect of feed rate and performance drift, fewer poses can be used to minimise configuration-dependent systematic error and therefore reduce noise.

5.1.2. Experiment format

To conduct initial positional performance experiments, the program should be run in a temperature controlled environment for 30 cycles using several feed rates, with measurements taken at each pose. However, for error mapping only one cycle is necessary, as the pose density would be higher, and for performance drift experiments there would be many more cycles to fully capture changes over time. For studying the effect of feed rate, 30 cycles should be run at many feed rate intervals spread over the full feed rate range of the robot to acquire a meaningful dataset.

5.1.3. Statistical analysis

In terms of statistical processing, pose accuracy, $AP_P$, is calculated using Eq. (1), where $\bar{x}$, $\bar{y}$ and $\bar{z}$ are the mean coordinates of the measured poses, i.e. the barycentres, given by Eqs. (2)–(4), with $n$ being the number of measurement cycles and $x_j$, $y_j$ and $z_j$ being the measured coordinates. These are compared with the commanded pose coordinates, $x_c$, $y_c$ and $z_c$:

$$AP_P = \sqrt{(\bar{x} - x_c)^2 + (\bar{y} - y_c)^2 + (\bar{z} - z_c)^2} \qquad (1)$$

$$\bar{x} = \frac{1}{n}\sum_{j=1}^{n} x_j \qquad (2)$$

$$\bar{y} = \frac{1}{n}\sum_{j=1}^{n} y_j \qquad (3)$$

$$\bar{z} = \frac{1}{n}\sum_{j=1}^{n} z_j \qquad (4)$$

Pose repeatability, $RP_l$, is expressed as the radius of a sphere with the barycentre as its centre point, given by Eqs. (5)–(8). Here $\bar{l}$ is the mean radius, i.e. the mean distance between the barycentre and the measured coordinates, $l_j$ is this distance for the $j$th measured coordinate, and $S_l$ is the standard deviation of the radii, giving the spread about the mean value and predicting that $3\sigma$ (99.7%) of measurements fall within the value computed under the same conditions:

$$RP_l = \bar{l} + 3S_l \qquad (5)$$

$$\bar{l} = \frac{1}{n}\sum_{j=1}^{n} l_j \qquad (6)$$

$$l_j = \sqrt{(x_j - \bar{x})^2 + (y_j - \bar{y})^2 + (z_j - \bar{z})^2} \qquad (7)$$

$$S_l = \sqrt{\frac{\sum_{j=1}^{n} (l_j - \bar{l})^2}{n-1}} \qquad (8)$$

Fig. 4. Dynamic evaluation procedure flowchart.
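Eqs. (1)–(8) translate directly into code; the following sketch (our own function and variable names) computes pose accuracy and repeatability from a list of measured poses:

```python
from math import sqrt

def pose_accuracy_repeatability(measured, commanded):
    """Compute ISO 9283 pose accuracy AP (Eq. 1) and repeatability
    RP = l_bar + 3*S_l (Eqs. 5-8) from n measured (x, y, z) poses
    and one commanded pose."""
    n = len(measured)
    # Barycentre of the measured poses, Eqs. (2)-(4).
    bary = tuple(sum(p[i] for p in measured) / n for i in range(3))
    # Accuracy: distance from barycentre to commanded pose, Eq. (1).
    ap = sqrt(sum((b - c) ** 2 for b, c in zip(bary, commanded)))
    # Radii from barycentre to each measured pose, Eq. (7).
    radii = [sqrt(sum((p[i] - bary[i]) ** 2 for i in range(3))) for p in measured]
    l_bar = sum(radii) / n                                      # Eq. (6)
    s_l = sqrt(sum((l - l_bar) ** 2 for l in radii) / (n - 1))  # Eq. (8)
    return ap, l_bar + 3 * s_l                                  # Eq. (5)

# Hypothetical laser-tracker measurements (mm) of one commanded pose.
measured = [(100.02, 199.98, 300.01),
            (99.97, 200.03, 299.98),
            (100.01, 200.00, 300.02)]
ap, rp = pose_accuracy_repeatability(measured, (100.0, 200.0, 300.0))
```

In practice $n$ would be 30 cycles per pose rather than the three measurements shown, and the same function applies per pose across the test geometry.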
These equations can also be applied to distance accuracy and repeatability calculation. This is done using the same data as for pose performance evaluation, but using the differences between pose coordinates to specify the commanded and measured distances, which are substituted for the commanded and measured pose terms.

5.2. Dynamic performance evaluation methodology

This sub-section documents the methodology developed for performing a full evaluation of performance that considers the sum of static and dynamic errors. In this case, only test geometry and statistical analysis are covered as, in machining tests, only one of each artefact is produced. Note that key dimensional data is acquired from machined artefacts using a CMM and that the procedures specified assume that the robot coordinate system has been aligned to the artefact stock material. A case study of the statistical analysis procedure can be found in the annex of ISO 22514-3 for further guidance.
Fig. 5. Alternative test geometry.
5.2.1. Test geometry

As described in the previous section, it is beneficial to investigate the usage of several artefacts, although further insight could be gained by replicating these on a conventional machine tool to help determine the state of the technology. These should be the NAS
979 artefact and real application geometry machined to produce a minimum of 30 dimensions for analysis. Programs can be written to perform machining using conventional CAM software to create an NC file, which can be post-processed to convert it into robot code. Such features are available as plug-ins on several commercial software packages. 5.2.2. Statistical analysis Processing data to compute a performance index in accordance with ISO 22514-3 begins with plotting the errors between nominal and measured dimensions over time with a run chart to assess stability, which is defined as being in a state of statistical control and subject only to random errors. Instability is therefore identified when stepped variations are seen on the run chart and corrected by rerunning the trial ensuring that all conditions are constant. Another consideration when reviewing the run chart is the presence of outliers, which are checked through individuals and moving range charts. A histogram is then plotted to assess the distribution class. If the error data is normally distributed, then the mean and standard deviation are plotted on the same chart, although if bimodal this would be shown on the control chart as steps and may justify rerunning the trial. If the distribution is skewed, then it must be transformed. A skewed distribution is checked for by constructing a probability plot, which indicates a non-normal distribution if a line cannot be tightly fitted to the plotted data. Additionally, conformance to specifications can be assessed from the fitted line by observing whether it crosses the upper and lower limits. If it does cross the limits and the data is normally distributed then the estimated percentage out of specification can be determined by looking up ^ the estimated lower and upper machine performance indices PmkU ^ and PmkL on Table A.1, Annex A, ISO 22514-3, which are summed to estimate total percentage out of specification. 
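The individuals and moving range check described above can be sketched as follows. This is a generic SPC implementation using the standard n = 2 moving-range chart constants (2.66 and 3.267), offered as an illustration rather than a procedure prescribed verbatim by ISO 22514-3:

```python
import statistics

def imr_limits(errors):
    """Control limits for individuals (I) and moving range (MR) charts,
    used to flag instability or outliers in the dimensional-error run.

    Uses the standard SPC constants for moving ranges of size 2:
    E2 = 2.66 and D4 = 3.267.
    """
    mr = [abs(b - a) for a, b in zip(errors, errors[1:])]
    mr_bar = statistics.mean(mr)
    x_bar = statistics.mean(errors)
    return {
        "I": (x_bar - 2.66 * mr_bar, x_bar + 2.66 * mr_bar),
        "MR": (0.0, 3.267 * mr_bar),
    }

def out_of_control(errors):
    """Indices of points falling outside the individuals limits."""
    lo, hi = imr_limits(errors)["I"]
    return [i for i, e in enumerate(errors) if e < lo or e > hi]
```

A point flagged by `out_of_control` would warrant investigation and, as noted above, potentially rerunning the trial under constant conditions.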
These are calculated using the following equations:

\hat{P}_{mkU} = \frac{U - \bar{X}}{3S}    (9)

\hat{P}_{mkL} = \frac{\bar{X} - L}{3S}    (10)

Here, U and L are the upper and lower specification limits, \bar{X} is the measurement mean and S is the standard deviation. If the data is non-normally distributed then these are calculated with

\hat{P}_{mkU} = \frac{U - \hat{X}_{50\%}}{\hat{X}_{99.865\%} - \hat{X}_{50\%}}    (11)

\hat{P}_{mkL} = \frac{\hat{X}_{50\%} - L}{\hat{X}_{50\%} - \hat{X}_{0.135\%}}    (12)
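A minimal sketch of the index calculations in Eqs. (9)–(12) follows. The function names are illustrative, and the percentile estimates for the non-normal case are assumed to come from a distribution fitted to the measurement data:

```python
import statistics

def pmk_normal(data, U, L):
    """Estimated indices P_mkU and P_mkL for normally distributed
    data (Eqs. 9-10): distance of the mean from each specification
    limit in units of three standard deviations."""
    xbar = statistics.mean(data)
    s = statistics.stdev(data)  # sample standard deviation S
    return (U - xbar) / (3 * s), (xbar - L) / (3 * s)

def pmk_percentile(x50, x99865, x0135, U, L):
    """Percentile-based indices for non-normal data (Eqs. 11-12).
    x50, x99865 and x0135 are the estimated 50%, 99.865% and 0.135%
    percentiles of the fitted distribution."""
    return (U - x50) / (x99865 - x50), (x50 - L) / (x50 - x0135)
```

In both cases the reported index is then the minimum of the upper and lower values, as defined later in the section.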
When the data is not normally distributed, the \hat{X} terms are the estimated measurement values at the corresponding distribution percentiles. The next stage of the statistical analysis is to calculate the performance indices. If the measurement data is normally distributed, the estimated machine performance index, \hat{P}_m, is given by

\hat{P}_m = \frac{U - L}{6S}    (13)
In each case, the minimum machine performance index, \hat{P}_{mk}, is defined as

\hat{P}_{mk} = \min\left\{ \hat{P}_{mkU}, \hat{P}_{mkL} \right\}    (15)
Finally, approximate confidence intervals are calculated as follows for normally distributed data, where z is the standardised deviate for the normal distribution and N is the total sample size of measurements. If the distribution is not normal, it needs to be classified and the confidence intervals calculated with reference to PD ISO/TR 22514-4:2007 Statistical Methods in Process Management – Capability and Performance – Part 4: Process Capability Estimates and Performance Measures [64] and BS ISO 21747:2006 Statistical Methods – Process Performance and Capability Statistics for Measured Quality Characteristics [65]. In addition, the method documented requires CMM uncertainty to be specified with results.
\left[ \hat{P}_m \sqrt{\frac{\chi^2_{\alpha/2}}{N - 1}},\ \hat{P}_m \sqrt{\frac{\chi^2_{1-\alpha/2}}{N - 1}} \right]    (16)

\hat{P}_{mkL} \pm z_{1-\alpha/2} \sqrt{\frac{1}{9N} + \frac{\hat{P}_{mkL}^2}{2N - 2}}    (17)

\hat{P}_{mkU} \pm z_{1-\alpha/2} \sqrt{\frac{1}{9N} + \frac{\hat{P}_{mkU}^2}{2N - 2}}    (18)

\hat{P}_{mk} \pm z_{1-\alpha/2} \sqrt{\frac{1}{9N} + \frac{\hat{P}_{mk}^2}{2N - 2}}    (19)
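The z-based intervals of Eqs. (17)–(19) can be sketched with the standard library as below (the chi-squared interval of Eq. (16) would additionally need a χ² quantile function, e.g. from scipy.stats; the function name here is illustrative):

```python
from statistics import NormalDist

def pmk_confidence_interval(pmk_hat, N, alpha=0.05):
    """Approximate two-sided (1 - alpha) confidence interval for an
    estimated index P_mkU, P_mkL or P_mk (Eqs. 17-19):

        estimate +/- z_{1-alpha/2} * sqrt(1/(9N) + estimate^2/(2N - 2))
    """
    # Standardised normal deviate z_{1-alpha/2}.
    z = NormalDist().inv_cdf(1 - alpha / 2)
    half_width = z * (1 / (9 * N) + pmk_hat ** 2 / (2 * N - 2)) ** 0.5
    return pmk_hat - half_width, pmk_hat + half_width
```

As expected from the 1/(9N) and 1/(2N − 2) terms, the interval narrows as the number of measured dimensions N grows, which motivates the minimum of 30 dimensions recommended for the machining trials.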
If measurements prove to be non-normally distributed, the machine performance index is given by

\hat{P}_m = \frac{U - L}{\hat{X}_{99.865\%} - \hat{X}_{0.135\%}}    (14)

6. Summary and conclusions

In summary, relevant standardised guidelines have been reviewed for robot and machine tool performance evaluation in an effort to design a methodology for assessing the accuracy and precision of a robotic machine tool from a procedural and statistical analysis perspective. This review has found that, although there is no standard available for robotic machine tool performance evaluation, those with most potential for adaptation to this application are ISO 9283, AIA/NAS 979 and ISO 22514-3, which would allow the benefits of both static and dynamic assessment to be exploited. Furthermore, issues surrounding the full exposure of robot-specific errors have been discussed and the practical limitations of doing this effectively using a custom-designed artefact have been noted, leading to a methodology being developed utilising a standard artefact and actual application geometry. Overall, the observations made and the methodology proposed allow robust conclusions to be drawn on the performance of robots used in machining applications, and the methodology can therefore serve as a reference point for experimental work in this area in the absence of specific standards. An attempt has been made to promote the development of robotic machining techniques through the understanding of base case performance and subsequent performance improvements resulting from technology development efforts, which is a precursor to realising the associated benefits.

Acknowledgments

The authors of this paper would like to acknowledge Rolls-Royce Civil Nuclear and the Engineering and Physical Sciences Research
Council for the provision of funding and to the Nuclear AMRC for access to equipment and technical support. The views expressed in this paper are those of the authors and not necessarily those of the funding bodies or other organisations mentioned. References [1] Westinghouse, The Westinghouse Pressurized Water Reactor Nuclear Power Plant, Technical Report, Westinghouse Electric Corporation, Pittsburgh, 1984. [2] J. Pandremenos, C. Doukas, P. Stavropoulos, G. Chryssolouris, Machining with Robots: A Critical Review, in: (DET2011), 7th International Conference on Digital Enterprise Technology, Athens, Greece, 2011, pp. 614–621, ISBN 978-96088104-2-6. [3] C. Doukas, J. Pandremenos, G. Chryssolouris, D. Mourtzis, P. Stavropoulos, P. Foteinopoulos, G. Chryssolouris, On an empirical investigation of the structural behavior of robots, in: 45th CIRP Conference on Manufacturing Systems, vol. 3, 2012, pp. 501–506. http://dx.doi.org/10.1016/j.procir.2012.07.086. [4] C. Dumas, S. Caro, S. Garnier, B. Furet, Joint stiffness identification of six-revolute industrial serial robots, Robot. Comput. Integr. Manuf. 27 (4) (2011) 881–888, http://dx.doi.org/10.1016/j.rcim.2011.02.003. [5] R. Weill, B. Shani, Assessment of accuracy in relation with geometrical tolerances in robot links, CIRP Ann. Manuf. Technol. 40 (1) (1991) 395–399, http: //dx.doi.org/10.1016/S0007-8506(07)62015-0. [6] D. Kanaan, P. Wenger, D. Chablat, Kinematics analysis of the parallel module of the VERNE machine, in: 12th ITFoMM World Congress, 2007, pp. 1–6. [7] Z. Bi, Y. Jin, Kinematic modeling of Exechon parallel kinematic machine, Robot. Comput. Integr. Manuf. 27 (1) (2011) 186–193, http://dx.doi.org/10.1016/j. rcim.2010.07.006. [8] M. Agheli, M. Nategh, Identifying the kinematic parameters of hexapod machine tool, World Acad. Sci. Eng. Technol. 3 (52) (2009) 380–385. [9] C. Gong, J. Yuan, J. Ni, Nongeometric error identification and compensation for robotic system by inverse calibration, Int. J. Mach. 
Tools Manuf. 40 (14) (2000) 2119–2137, http://dx.doi.org/10.1016/S0890-6955(00)00023-7. [10] A.K. Kamrani, C.-C. Wei, H. Wiebe, Animated simulation of robot process capability, Integr. Manuf. Syst. 28 (1) (1995) 23–41, http://dx.doi.org/10.1108/ 09576069410056723. [11] A. Olabi, R. Béarée, O. Gibaru, M. Damak, Feedrate planning for machining with industrial six-axis robots, Control Eng. Pract. 18 (5) (2010) 471–482, http://dx. doi.org/10.1016/j.conengprac.2010.01.004. [12] K. Young, C.G. Pickin, Speed accuracy of the modern industrial robot, Ind. Robot: Int. J. 28 (3) (2001) 203–212, http://dx.doi.org/10.1108/ 01439910110389362. [13] P. Turek, J. Jedrzejewski, W. Modrzycki, Methods of machine tool error compensation, J. Mach. Eng. 10 (4), 2010, 5–25. 10.1016/j.procir.2013.06.078. [14] H. Wu, H. Handroos, J. Kovanen, A. Rouvinen, P. Hannukainen, T. Saira, L. Jones, Design of parallel intersector weld/cut robot for machining processes in ITER vacuum vessel, Fusion Eng. Des. 69 (1-4) (2003) 327–331, http://dx.doi.org/ 10.1016/S0920-3796(03)00066-8. [15] H. Wu, H. Handroos, P. Pessi, J. Kilkki, L. Jones, Development and control towards a parallel water hydraulic weld/cut robot for machining processes in ITER vacuum vessel, Fusion Eng. Des. 75–79 (2005) 625–631, http://dx.doi.org/ 10.1016/j.fusengdes.2005.06.304. [16] H. Zhang, J. Wang, G. Zhang, Z. Gan, Z. Pan, H. Cui, Z. Zhu, Machining with flexible manipulator: Toward improving robotic machining performance, in: Proceedings, 2005 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2005, pp. 1127–1132, http://dx.doi.org/10.1109/AIM. 2005.1511161. [17] A. Nubiola, I.a. Bonev, Absolute calibration of an ABB IRB 1600 robot using a laser tracker, Robot. Comput. Integr. Manuf. 29 (1) (2013) 236–245, http://dx. doi.org/10.1016/j.rcim.2012.06.004. [18] S. Zargarbashi, W. Khan, J. Angeles, Posture optimization in robot-assisted machining operations, Mech. Mach. 
Theory 51 (2012) 74–86, http://dx.doi.org/ 10.1016/j.mechmachtheory.2011.11.017. [19] S. Zargarbashi, W. Khan, J. Angeles, The Jacobian condition number as a dexterity index in 6R machining robots, Robot. Comput. Integr. Manuf. 28 (6) (2012) 694–699, http://dx.doi.org/10.1016/j.rcim.2012.04.004. [20] E. Abele, M. Weigold, S. Rothenbücher, Modeling and identification of an industrial robot for machining applications, CIRP Ann.: Manuf. Technol. 56 (1) (2007) 387–390, http://dx.doi.org/10.1016/j.cirp.2007.05.090. [21] P. Pessi, H. Wu, H. Handroos, L. Jones, A mobile robot with parallel kinematics to meet the requirements for assembling and machining the ITER vacuum vessel, Fusion Eng. Des. 82 (15–24) (2007) 2047–2054, http://dx.doi.org/ 10.1016/j.fusengdes.2007.06.012. [22] Z. Bi, L. Wang, Optimal design of reconfigurable parallel machining systems, Robot. Comput. Integr. Manuf. 25 (6) (2009) 951–961, http://dx.doi.org/ 10.1016/j.rcim.2009.04.004. [23] Z. Pan, H. Zhang, Z. Zhu, J. Wang, Chatter analysis of robotic machining process, J. Mater. Process. Technol. 173 (3) (2006) 301–309, http://dx.doi.org/ 10.1016/j.jmatprotec.2005.11.033. [24] S. Matsuoka, K. Shimizu, N. Yamazaki, Y. Oki, High-speed end milling of an articulated robot and its characteristics, J. Mater. Process. Technol. 95 (1–3) (1999) 83–89, http://dx.doi.org/10.1016/S0924-0136(99)00315-5. [25] J.S. Chen, W.Y. Hsu, Design and analysis of a tripod machine tool with an integrated Cartesian guiding and metrology mechanism, Precis. Eng. 28 (1) (2004) 46–57, http://dx.doi.org/10.1016/S0141-6359(03)00073-4. [26] Y. Li, H. Liu, X. Zhao, T. Huang, D.G. Chetwynd, Design of a 3-DOF PKM module for large structural component machining, Mech. Mach. Theory 45 (6) (2010) 941–954, http://dx.doi.org/10.1016/j.mechmachtheory.2010.01.008.
[27] D.A. Axinte, S. Abdul Shukor, A.T. Bozdana, An analysis of the functional capability of an in-house developed miniature 4-axis machine tool, Int. J. Mach. Tools Manuf. 50 (2) (2010) 191–203, http://dx.doi.org/10.1016/j.ijmachtools.2009.10.005. [28] S. Liu, D. Sun, C. Zhu, A dynamic priority based path planning for cooperation of multiple mobile robots in formation forming, Robot. Comput. Integr. Manuf. 30 (6) (2014) 589–596, http://dx.doi.org/10.1016/j.rcim.2014.04.002. [29] P. Vichare, R. Mcnair, T. Lawrie, J. Thompson, A. Nassehi, Machine tool capability profiles for representing machine tool health, Robot. Comput. Integr. Manuf. 34 (0) (2015) 70–78, http://dx.doi.org/10.1016/j.rcim.2014.11.002. [30] M. Halaj, E. Kureková, Positioning accuracy of non-conventional production machines—an introduction, In: Proceedings of XIX IMEKO World Congress, 2009, pp. 2099–2102. [31] H. Al-Gahtani, A. Khathlan, M. Sunar, M. Naffa'a, Local pressure testing of spherical vessels, Int. J. Press. Vessels Piping 114–115 (1) (2014) 61–68, http://dx.doi.org/10.1016/j.ijpvp.2013.12.004. [32] C. Freeman, R. Scott, S. Reddish, Discrete event simulation in immersive virtual reality, in: Proceedings of the Fourth Joint Virtual Reality Conference ICATEGVE-EuroVR, Eurographics Association, Madrid, Spain, 2012. [33] ISO Standards Catalogue, 2014, URL 〈http://www.iso.org/iso/home/store/catalogue_ics.htm〉. [34] ANSI Standards Store, 2014, URL 〈http://webstore.ansi.org〉. [35] BSI Standards Catalogue, 2014, URL 〈http://shop.bsigroup.com〉. [36] ASME Standards Catalogue, 2014, URL 〈https://www.asme.org/shop/standards?cm_re=EngineeringTopics-_-GlobalHeader-_-Standards〉. [37] ASTM International—Standards and Publications, 2014, URL 〈http://www.astm.org/Standard/standards-and-publications.html〉. [38] VDI Standard Database, 2014, 〈http://www.vdi.eu/engineering/vdi-standards/〉. [39] JSA Web Store, 2014, URL 〈http://www.webstore.jsa.or.jp/webstore/Top/indexEn.jsp〉. [40] H.V.
Brussel, Evaluation and testing of robots, CIRP Ann.: Manuf. Technol. 39 (1990) 657–664, http://dx.doi.org/10.1016/S0007-8506(07)63002-9. [41] BS EN ISO 9283:1998 Manipulating Industrial Robots—Performance Criteria and Related Test Methods, 1998. [42] BS EN ISO 9946:1999 Manipulating Industrial Robots, 1999. [43] VDI 2861 Blatt 1:1988-06 Assembling and Handling; Characteristics of Industrial Robots; Designation of Coordinates, 1988. [44] VDI 2861 Blatt 2:1988-05 Assembling and Handling; Characteristics of Industrial Robots; Application-Related Characteristics, 1988. [45] VDI 2861 Blatt 3:1988-05 Assembling and Handling; Characteristics of Industrial Robots; Testing of the Characteristics, 1988. [46] ANSI/RIA R15.05-1-1990 (R1999) Evaluation of Point-to-Point and Static Performance Characteristics of Industrial Robots and Robot Systems, 1999. [47] ANSI/RIA R15.05-2-1992 (R1999) Industrial Robots and Robot Systems—PathRelated and Dynamic Performance Characteristics—Evaluation, 1999. [48] ANSI/RIA R15.05-3-1992 (R1999) Industrial Robots and Robot Systems—Reliability Acceptance Testing—Guidelines, 1999. [49] ISO, ISO Membership Manual, 2013. [50] A. Sahay, Measurement system analysis, gage repeatability and reproducability study, in: Six Sigma Quality: Concepts and Cases, 2010 (Chapter 7). [51] BIPM, JCGM 100:2008 Evaluation of Measurement Data—Guide to the Expression of Uncertainty in Measurement, 2008. [52] GOST Standards Catalogue, 2014, URL 〈http://www.gost.ru/wps/portal/pages. en.StandartCatalog〉. [53] BS ISO 10791-1:2012 Test Conditions for Machining Centres—Part 1: Geometric Tests for Machines with Horizontal Spindle, 2012. [54] BS ISO 10791-2:2001 Test Conditions for Machining Centres—Part 2: Geometric Tests for Machines with Vertical Spindle or Universal Heads with Vertical Primary Rotary Axis, 2001. [55] BS ISO 10791-3:1998 Test Conditions for Machining Centres—Part 3: Geometric Tests for Machines with Integral Indexable or Continuous Universal Heads, 1998. 
[56] BS ISO 230-1:2012 Test Code for Machine Tools Part 1: Geometric Accuracy of Machines Under No-Load or Quasi-static Conditions, 2012. [57] BS ISO 230-2:2012 Test Code for Machine Tools Part 2: Determination of Accuracy and Repeatability of Positioning of Numerically Controlled Axes, 2012. [58] BS ISO 230-4:2005 Test Code for Machine Tools Part 4: Circular Tests for Numerically Controlled Machine Tools, 2005. [59] BS ISO 230-6:2002 Test Code for Machine Tools Part 6: Determination of Positioning Accuracy on Body and Face Diagonals (Diagonal Displacement Tests), 2002. [60] PD ISO/TR 230-9:2005 Test Code for machine Tools Part 9: Estimation of Measurement Uncertainty for Machine Tool Tests According to Series ISO 230, Basic Equations, 2005. [61] AiA/NAS NAS 979 1969 Uniform Cutting Tests—NAS Series Metal Cutting Equipment Specifications, 1969. [62] J. Antunes Simões, T. Coole, D. Cheshire, A.R. Pires, Analysis of multi-axis milling in an anthropomorphic robot, using the design of experiments methodology, J. Mater. Process. Technol. 135 (2–3) (2003) 235–241, http://dx. doi.org/10.1016/S0924-0136(02)00908-1. [63] BS ISO 22514-3:2008 Statistical Methods in Process Management Capability and Performance—Part 3: Machine Performance Studies for Measured Data on Discrete Parts, 2008. [64] PD ISO/TR 22514-4:2007 Statistical Methods in Process Management—Capability and Performance—Part 4: Process Capability Estimates and Performance Measures, 2007. [65] BS ISO 21747:2006 Statistical Methods Process Performance and Capability Statistics for Measured Quality Characteristics, 2006.