Title
A fuzzy temporal approach to the Overall Equipment Effectiveness measurement

Authors
Laurent Foulloy, Vincent Clivillé, Lamia Berrah
Laboratoire d'Informatique, Systèmes, Traitement de l'Information et de la Connaissance, LISTIC, University Savoie Mont Blanc, BP 80439, 74944 Annecy, France.

Author details
Laurent Foulloy graduated from the Ecole Normale Supérieure de Cachan in 1980. He received Ph.D. and D.Sc. degrees, both from Paris XI University, France, in 1982 and 1990, respectively. He is currently Professor of Electrical Engineering at the University Savoie Mont Blanc in the Computer Science, Systems, Information and Knowledge Processing Laboratory (LISTIC). At the University Savoie Mont Blanc, he was the Vice-President for Research from 1999 to 2004 and the head of Polytech Annecy-Chambéry, the graduate school of Engineering, from 2006 to 2017. His research interests include fuzzy logic and possibility theory for information processing in instrumentation and control.

Vincent Clivillé graduated from the Ecole Normale Supérieure de Cachan (France) in 1982. He received a PhD degree from the University of Savoie (France) in 2004. He is currently Associate Professor of Industrial Engineering at the University Savoie Mont Blanc, in the Computer Science, Systems, Information and Knowledge Processing Laboratory (LISTIC). His research interests include performance expression, performance measurement systems, industrial improvement approaches and information fusion, namely the MultiCriteria Decision Aiding methods using MAUT.

Lamia Berrah graduated from the Ecole Nationale Polytechnique d'Alger (Algeria) in 1991. She received a PhD degree from the Institut National Polytechnique de Grenoble (France) in 1997. She is currently Associate Professor of Industrial Engineering at the University Savoie Mont Blanc, in the Computer Science, Systems, Information and Knowledge Processing Laboratory (LISTIC). Her research concerns objectives declaration, performance expression frameworks, industrial improvement approaches and information fusion using fuzzy techniques and MultiCriteria Decision Aiding methods.

Highlights
Overall Equipment Effectiveness performance indicators are recalled.
A temporal model for expressing instantaneous, trend and predictive performances is proposed.
The computation of numeric expressions and their visual representation are detailed.
A French SME is used as a test case to emphasise industrial insights.
A fuzzy temporal approach to the Overall Equipment Effectiveness measurement
Abstract - This study deals with the Overall Equipment Effectiveness (OEE) and its associated performance expressions. The aim of the developed research is to provide useful pieces of information for decision-making when looking for the improvement of such a productivity indicator. As an extension of the literature and industrial practice of the OEE, we propose here to focus on a temporal performance expression model that provides the achievement degree of the considered objective in, respectively, an instantaneous way, a trend way and a predictive way. The basic assumption of this work is that the way of expressing performance depends on the nature of the actions that are associated with the considered objective achievement, be they short-term reactive actions or long-term planned actions, which particularly characterises the OEE. Thus, short-term reactive control is based on instantaneous performance expressions, while trend performance expressions are proposed for observations, adjustments and planning, leading moreover to some elements concerning the predictive performance expressions in the long term. A fuzzy processing, whose purpose, according to the visual management principles, is the expression of both numerical and symbolic information, is developed. The proposed model is illustrated by the Production stops reduction case of the Fournier Company, by considering the daily, weekly and semi-annual OEE practice of the company. Some concluding remarks finally allow the relevance of the proposed model and the extensions to bring to it to be analysed.
Keywords - Industrial improvement, Temporal performance expression, Overall Equipment Effectiveness (OEE), Visual management, Fuzzy processing, Production stops reduction in the Fournier Company.
1 Introduction
Continuously looking for the achievement of the assigned objectives of the considered productive system, industrial control consists of a permanent iteration of the "Plan - Do - Check - Act" cycle. Dealing thus with the Deming wheel philosophy (Imai1986; Ohno1988), such a control adopts the improvement concept and is permanently based on the planning of the relevant actions, knowing the objectives to be achieved and the already reached results. That is, in our opinion, the most intuitive and essential way of defining control.
From this perception, one can deduce the fundamental pieces of information that are required for the control purpose of any productive system: the objective, that is the expected state, on the one hand, and the measurement, that is the reached state according to the launched actions, on the other hand. By subscribing to the control loop principle, Performance Indicators (PI’s)1 (Fortuin1988) as well as Performance Measurement Systems (PMS’s)2 (Neely et al.1995; Ghalayini et al.1997) are the methodological tools for linking together objectives, actions and measurements, thus ensuring the so-called “controlability” condition (Ducq et al.2001). Such a condition can be modeled by the automatic control loop (Jelali2013) (Fig. 1).
Fig. 1. The control loop as a way of defining the controlability principle (Berrah2002).

PI's and PMS's are the frameworks introduced for providing different forms of the so-called "performance expressions". While PI's provide elementary performance expressions, PMS's provide aggregated overall ones. Whatever its form (measurement, evaluation, satisfaction, numeric, linguistic, symbolic...), a performance expression is associated with an objective and informs about the objective achievement (Berrah et al.2000; Bitton1990; ISO90002000), leading thus to present the right information for deciding on the actions to launch (Browne et al.1997; Mari2003). Under the Taylorian assumptions, the aim of the control was the maximisation of both the equipment effectiveness and the Direct Labour productivity (Taylor1911; Anthony1965; Johnson1975). Given the data of this context:
1 Let us recall that a PI is "a variable indicating the effectiveness and/or the efficiency of a part or whole of the process or system against a given norm/target or plan" (Fortuin1988).
2 Let us recall that a PMS can be seen as "a multi-criteria instrument, made of a set of performance expressions (also referred to as metrics by some authors (Melnyk et al.2004)), i.e. physical measures as well as performance evaluations, to be consistently organized with respect to the objectives of the company" (Clivillé et al.2007).
the monocriterion dimension of the performance;
the simplicity of the involved bills of materials;
the repetitive and well-handled aspect of the manufacturing sequences;
the mechanical specificity of the equipment;
the specialisation of the Direct Labour…
improvement actions were structural, easily defined and executed. Only checking the results was thus required to verify that everything was going well (Cosmetatos and Eilon1983; Kanigel2005). In this context, the PI's providing the right information were essentially the productivity numerical ratios that are expressed a posteriori of the execution of the processes. In the well-known current context, the aim of the control is always related to the maximisation of economic benefits but, overall:
The multicriteria expression of performance leads to diversified and changing objectives (Cross and Lynch1988; Diakoulaki et al.1992; Bititci et al.2001; Nudurupati et al.2011).
Industrial systems as well as product structures are complex (Vernadat1996; Schlick and Demissie2016), inducing numerous analyses before defining relevant improving actions.
The associated temporal horizons are handling interrelated and diversified actions (Kaplan and Norton1992; Noble and Lahay1994).
The actions are of a different nature, being either structural, relating to a verification semantic, or conjunctural, relating to a reactive semantic.
Visual management (Steenkamp et al.2017) and advanced technologies (Hwang et al.2017) lead to diversified forms for providing performance expressions.
Then, PI’s providing the right information are plural and interrelated and temporally-dependent, operating as PMS’s, a priori as well as a posteriori of the execution of the processes (Kaplan and Norton1992; Ghalayini et al.1997). While a posteriori performance expressions concern the instantaneous verification of the achievement of the objectives, a priori expressions handle trend and potentially predictive information about such an achievement (Ducq et al.2001; Foulloy and Berrah2015). Moreover, performance expressions that are provided as the right information at the right moment adopt more than the numerical conventional Taylorian format. They can also be, according to the management requirements, linguistically or symbolically described (Clivillé et al.2014; Micheli and Mari2014). The Manufacturing Execution Systems (MES) functionalities (Blokdyk2017)
improving the availability of the measurements, such “temporal” and “multi-format” dimensions of the performance expression are becoming simpler and almost natural.
Hence, beyond the verification purpose, one major information requirement of contemporary control is related to proactivity, leading thus to getting the right information at the right moment in the right format. Dealing with this aspect, this paper concerns a temporal and multi-format vision of the performance expression and its application to the famous Overall Equipment Effectiveness (OEE) PI (Nakajima1988). Illustrating the proposed approach by the daily, weekly and semi-annual practice of the OEE in the Fournier Company, the developed idea subscribes to the definition of a temporal performance expression model that, according to the decision level OEE control requirements, mixes together a posteriori and a priori information, and numerical and symbolic formats. Beyond the classical and numerous data associated with the OEE, such a framework offers to managers not only an overall complete vision of the OEE evolution, but also a few brief pieces of information, with different semantics depending on whether they are of an instantaneous, trend or predictive nature. This leads them to a quick and easy analysis of what is happening with regard to the efficiency and thus to more reactivity. Indeed, being associated with a permanent overall objective related to the productivity increase, the OEE is devoted to the effectiveness numerical measurement of one or several pieces of equipment of the considered productive system and involves a set of interacting criteria. Managers periodically inform themselves about the returned performance expressions, in terms of effectiveness as well as in terms of stops. They thus react quickly to the stop occurrences, on the one hand, and slowly plan equipment effectiveness improvement on the other hand. Moreover, if the interest of instantaneous and trend information is well appreciated in the famous Total Productive Maintenance (TPM) approaches (Nakajima1988; Muchiri and Pintelon2008; Alsyouf2007), the question of predictive maintenance is addressed in less detail. However, predicting the OEE evolution allows managers to better anticipate the capacity planning. By its semantics and scope, the OEE remains a decision-aiding tool for short-term reactive control as well as for long-term proactive planned improvement (Jonsson and Lesshammar1999; McCarthy and Rich2004). For all these reasons, it can be interesting to go beyond its conventional numerical measurement and to enrich the latter by the temporal dimension previously mentioned, by distinguishing, moreover, performance expression formats according to whether they are used for short-term or long-term control.
This paper is organised as follows. Section 2 is dedicated to the OEE notion, its definition and practice. Following this brief analysis, the relevance of using a multi-format temporal performance expression, i.e. a performance expression built on an instantaneous expression, a trend expression and a predictive expression, and that can be expressed numerically as well as symbolically, is argued. The major principles of such a model are addressed in Section 3 as a reminder of what has been previously developed. The fuzzy visual display formal model is presented in Section 4. The focus is then made on its application to the Fournier Company case in Section 5. Some prospects finally conclude this rather new industrial vision and its application to the OEE.

2 The Overall Equipment Effectiveness (OEE)
2.1 Basics
In the 1970’s, S. Nakajima introduced the Overall Equipment Effectiveness (OEE) in the Japanese Industry, in the context of the Total Productive Maintenance program mentioned above. The purpose of this PI was to inform the Maintenance_department about the effectiveness of the equipment under consideration, leading it to diagnosis the result and then to plan and execute the adequate corrective actions (Steinbacher and Steinbacher1993). In this sense, the OEE is defined, for a considered equipment and a reference period.
The equipment that is concerned by the OEE can be either a “piece of equipment” of the observed productive system (machine, production cell, line…) or its “overall equipment”.
The information provided by the OEE can be given at variable periods (shift time of labour, a day, a week or a longer duration), according to the TPM or any other associated improvement approach requirements.
Implicitly handling the objective, the OEE thus provides a numerical measurement, which is obtained by comparing (AFNOR2002; Ahuja2009; Shirose1997):
the Useful time which is the time devoted to the Actual production, namely the production of the compliant outputs,
to the Planned production time which is the time theoretically devoted to the Theoretical production, namely the production under optimal conditions.
Thus, the OEE measurement is computed by the following ratio:
\[ OEE = \frac{\text{Useful time}}{\text{Planned production time}} \tag{1} \]
Moreover, if the Theoretical cycle time identifies the product processing duration (McCarthy and Rich2004), then:
\[ OEE = \frac{\text{Useful time}}{\text{Planned production time}} = \frac{\text{Actual production} \times \text{Theoretical cycle time}}{\text{Theoretical production} \times \text{Theoretical cycle time}} = \frac{\text{Actual production}}{\text{Theoretical production}} \tag{2} \]
According to one side or the other of the equation, the information pointed out focuses on the equipment yield in one case and on the productivity dimension in the other. Given its semantics, the OEE indirectly informs on the considered equipment stops. Several approaches are proposed for detecting these stops or losses and analysing their causes (Shen2015; Relkar and Nandurkar2012; Stadnicka and Antosz2018), before classically ranking them according to a Pareto hierarchisation. Initially centred on the Maintenance function, the TPM recommended detecting the famous "six big losses", which later evolved towards more detailed detections3. Stops are thus categorised according to respectively: "1-Equipment failure, 2-Setup and adjustment loss, 3-Idling and minor stoppage, 4-Reduced speed, 5-Defects in process, 6-Reduced yield" (Badiger and Gandhinathan2008; Tajiri and Goto1992).
In the same logic, the French Standardisation Association AFNOR proposes four main loss categories (AFNOR2002). The first category deals with the Planned stops, which identify the controlled stops that come back systematically (preventive maintenance for instance). The second category concerns the Unplanned stops, namely the different random or non-controlled stops (breakdown downtime for instance). The third category, qualified as Losses of performance, deals with both the deviations of the Theoretical cycle time and the minor stoppages (the stops whose duration is less than a few minutes). The last category, qualified as Losses of Quality,
3 In this spirit, the Kaizen philosophy has introduced the "16 major losses": "Breakdown loss, Setup loss, Minor interruption loss, Speed loss, Defects and rework loss, Start up loss, Tool changeover loss, Shutdown loss, Production stoppage loss, Line organization, Measuring and adjust loss, Management loss, Operations motion loss, Yield loss, Consumables loss, Energy loss". Overcoming the simple Maintenance point of view, this vision includes more aspects of the company than the TPM, also concerning the Development management, the Energy consumption, and the Education and training (Rodrigues and Hatakeyama2006).
In his TPM synthesis, I. Ahuja proposes a more detailed vision identifying 23 types of losses: “Equipment Failure Loss (Parts Failure, Parts Adjustment), Setup and Adjustment loss, Tool Change Loss, Startup Loss, Minor Stoppage Loss, Reduced Speed Loss, Defect and Rework Loss, Scheduled Downtime (Cleaning and Checking, Planned Maintenance, Meetings), Management Loss (Waiting For Spares and Tools, Waiting For Instruction, Waiting For Material, Waiting For Men, Waiting For Power, Waiting For Inspection), Output Achieved, Quantity under RT” (Ahuja2009). These types of losses are categorised according to 11 main stop causes.
includes all the waste times due to the non-compliant production (the time spent producing scraps for instance). Then the AFNOR respectively associates with each of these categories some major stop causes. "13 production stop causes" - which are in reality 12, the Own stops being only an aggregation of six other stop causes - are then globally proposed, also expressed in time, as stops (Fig. 2).
Fig. 2. Production times, Stop times and Stop cause times involved in the OEE [extracted from (AFNOR2002)].
According to this deployment, (1) can be detailed into respectively:
\[ OEE = \frac{\text{Planned production time} - \text{Unplanned stops} - \text{Losses of performance} - \text{Losses of quality}}{\text{Planned production time}} \tag{3} \]
and (naturally, all the stop causes are not necessarily involved in the detected stops):
\[ OEE = \frac{\text{Planned production time} - \text{Sum of the 13 stop causes}}{\text{Planned production time}} \tag{4} \]
With time, the use of the OEE has widely expanded to several departments of the company (Production_department, Quality_department, Engineering_department…), each department having its own actions for improving and possibly optimising it. Looking for improvement of the OEE has become today a permanent objective that is reached by reducing losses and minimising time stops (Lycke and Akersten2000; Muchiri and Pintelon2008; Mâinea et al.2010), often with the automotive industry progress in this field as a model (da Silva et al.2017).
Besides, according to the multicriteria specificity of the OEE, another vision of it is proposed, always dealing with causal diagnosis deployment and loss identification. The OEE relates, this time, to the product of respectively the Availability, the Performance and the Rate of Quality of the considered equipment (Nakajima1988; Dal et al.2000):

\[ OEE = \text{Availability} \times \text{Performance} \times \text{Rate of Quality} \tag{5} \]
According to the AFNOR typology:

\[ \text{Availability} = \frac{\text{Run time}}{\text{Planned production time}}, \quad \text{Performance} = \frac{\text{Net time}}{\text{Run time}}, \quad \text{Rate of Quality} = \frac{\text{Useful time}}{\text{Net time}} \tag{6} \]
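To make the equivalence between the direct ratio (1) and the decomposition (5)-(6) concrete, the following minimal sketch computes the OEE both ways from illustrative time data (the durations and variable names are not taken from the paper).

```python
# Minimal sketch: computing the OEE from time data, following (1) and (5)-(6).
# All durations are in minutes and purely illustrative, not company data.

planned_production_time = 960.0  # e.g. 16 h of planned production per day
run_time = 800.0                 # planned production time minus stop times
net_time = 720.0                 # run time minus losses of performance
useful_time = 690.0              # net time minus losses of quality

# Direct ratio, as in (1)
oee_direct = useful_time / planned_production_time

# Decomposition (5)-(6): Availability x Performance x Rate of Quality
availability = run_time / planned_production_time
performance = net_time / run_time
rate_of_quality = useful_time / net_time
oee_decomposed = availability * performance * rate_of_quality

assert abs(oee_direct - oee_decomposed) < 1e-9  # both formulations give the same value
print(f"OEE = {oee_direct:.1%}")                # 71.9% with these placeholder values
```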
Naturally, and to conclude the OEE definition, let us observe that all these approaches are equivalent and provide the same measurement of the OEE. The use of one or another of them only reflects the considered management point of view, the culture of the company and the nature of the practiced improvement method. As far as we are concerned, we subscribe to the AFNOR proposition, leading us to have an OEE analysis that is sufficiently detailed on the one hand, and to be coherent with what is practiced by French SMEs on the other hand.
From the information processing point of view, one can observe that, until the 2000s, the operators were manually collecting time and quantity data, while the managers were entering the data using spreadsheet software such as Excel to compute and display the OEE. One major advance today in the computation of the OEE is the automation of the collection of the involved data as well as of the ratio computations (Singh et al.2013; Saha et al.2016; Hedman et al.2016). Such an automation has been widely facilitated by the MES's, the OEE being one of the Key PI's that are followed by these software applications, next to the equipment availability, the non-compliance rate and the order fulfillment (D'Antonio et al.2017). OEE measurements are provided in numerical format, with more or less precision, according to the will of the managers and to the available sensors. Let us mention in this context some investigations that have considered the uncertainty handling in the stops (Sonmez et al.2018), or alternative ways of computing the OEE with fuzzy aggregation models (Maran et al.2013). Besides, only a few works have explicitly focused on the temporal analysis of the OEE measurement, namely trend and predictive expressions of it4. Even if the OEE informs on the achievement of the objective related to the equipment effectiveness improvement and is a long-time-period PI, only instantaneous measurements, a posteriori of the successive associated action executions, remain explicitly provided, leading the managers to make their analysis by themselves. Moreover, in the TPM philosophy, the OEE being initially a diagnosis tool of the different occurring stops, it has been "thought" as a quasi-real-time PI that deals with instantaneous corrective actions. The OEE today becomes not only a short-term diagnosis tool but also simultaneously a mid-term and long-term improvement tool. Widely enhanced by the MES's potentialities, another vision of the OEE measurement could then subscribe to this point of view, explicitly associating, as mentioned before, past and predictive studies with this measurement. That is what we will propose for the Fournier Company OEE.
2.2 The Fournier Company practice of the OEE
The Fournier Company (Thônes, France) is an SME that produces kitchens, bathrooms and storing closets. The company is organised into four manufacturing sites, which are located in the Haute-Savoie region, within a perimeter of 20 kilometers. Such sites have been defined according to the different product families that are manufactured by the company. More than 850 000 items were manufactured in 2015 and the weekly production is approximately 5000 pieces of furniture. The Fournier Company offers a wide range (more than a billion) of products with a high degree of customisation and operates in a "make-to-order" format. The plant considered in this study is the one specifically dedicated to producing bathroom furniture. Continuously growing, this activity reached in 2016 a production of more than 120 000 pieces of furniture. The main production line processes wood or fiberboards (Fig. 3). The line consists of 10 pieces of equipment, and gathers in its first part numerous automatised operations of machining, cutting and drilling, and, in its second part, manual assembly machinery. The lot size is reduced (only a few pieces). The Open time is 24 hours a day, five days a week, and the Planned production time is 16 hours a day, five days a week.
4 Let us mention that, in the predictive maintenance field, works dealing with the temporal analysis of the OEE have however been tackled (Kurscheidt Netto et al.2017).
Fig. 3. The Fournier Company Bathroom furniture line5.

The Production_department controls the Bathroom furniture line in a conventional way, dealing with the visual management, the Kaizen and the Lean principles. The department considers the PMS that covers the following overall PI's:
the Service rate, for improving customer satisfaction;
the Compliance rate, for verifying product quality;
a Safety index that deals with employee security;
and the OEE, for respectively reducing the time stops and increasing the effectiveness of the equipment.
In particular, the OEE is associated with the whole Bathroom furniture line, combining the 10 involved pieces of equipment. Practically, for each stop occurrence, the operators specify the concerned piece of equipment as well as the observed stop according to the four AFNOR stop categories (see Fig. 2). In its turn, each detected stop is associated with the set of corresponding stop causes, always according to the OEE stop causes model of the line (Fig. 4). This model is a personalisation of the AFNOR definition, 10 stop causes being retained from the 12 stop causes proposed by the standard. Indeed, the company respectively regroups the Periodic setup and the Periodic maintenance into the Periodic setup and Maintenance, and the Production changeover and the Tool changeover into the Changeover.
(4) becomes in this particular case:
\[ OEE = \frac{\text{Planned production time} - \text{Sum of the 10 stop causes}}{\text{Planned production time}} \tag{7} \]

where:

\[ \text{Sum of the 10 stop causes} = \text{Induced stops} + \text{Breakdown} + \text{Changeover} + \text{Measure} + \text{Periodic setup and Maintenance} + \text{Exploitation} + \text{Idling and minor stops} + \text{Reduced speed} + \text{Defects} + \text{Reworks} \]

5 Fig. 3 is extracted from internal reports of the Fournier Company.
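As a complement to (7), the sketch below illustrates how the OEE can be evaluated from the stop-cause times recorded by the MES; the minute values are illustrative placeholders, not data from the Fournier Company.

```python
# Sketch of (7): OEE of the line computed from the 10 stop-cause times.
# The values are illustrative placeholders, not data from the Fournier Company.

stop_cause_times_min = {
    "Induced stops": 25, "Breakdown": 40, "Changeover": 30, "Measure": 10,
    "Periodic setup and Maintenance": 35, "Exploitation": 15,
    "Idling and minor stops": 45, "Reduced speed": 20, "Defects": 12, "Reworks": 8,
}
planned_production_time_min = 960  # 16 h per day

oee = (planned_production_time_min - sum(stop_cause_times_min.values())) / planned_production_time_min
print(f"OEE = {oee:.1%}")  # 75.0% with these placeholder values
```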
Then, for each stop cause, several potential stop root-causes, i.e. the elementary original stop causes, are identified, always according to the OEE stop causes model of the line. Fig. 4 describes for example the stop root-causes according to the Idling and minor stops.
Fig. 4. Some elements of the OEE stop causes model of the Bathroom furniture line.

All the useful data for computing the OEE measurement are available and provided by the MES of the company. At any moment of the Open time, the OEE measurement is available in terms of stop times, stop cause times and stop root-cause times. The OEE definition and updating is the task of the Engineering_and_Maintenance department, while its improvement is the task of the Engineering_and_Maintenance department, the Production_department and the Quality_department. The provided results (measurements) as well as the expected ones (objectives) lead each department to choose and implement, individually or collectively, the corresponding corrective and improvement actions. Moreover, twice a year, a review of the historical trend values of the past semester takes place with the Head managers, in order to verify that the annual objectives are being reached on the one hand, and to predict the future results and determine the new objectives on the other hand.
At the end of 2015, an overall action plan regarding the OEE improvement for the whole company was achieved, allowing a gain of 12.0% from an initial value of "49.5%". In this context, the current OEE strategy subscribes to the achievement of the annual objective that the Head managers, believing in its improvement, declare at the value of "75%". Such an overall objective is thus monthly deployed (Table 1), with managers expecting a significant increase of the OEE in the first month, in the continuity of what happened during the previous year.
Month          Jan    Feb    Mar    Apr    May    Jun    Jul    Aug    Sep    Oct    Nov    Dec
OEE objective  64.0%  65.0%  66.0%  67.0%  68.0%  69.0%  70.0%  71.0%  72.0%  73.0%  74.0%  75.0%

Table 1. The OEE objective deployment for 2016

Controlling the Bathroom furniture line with the OEE is doubly challenging, coherently dealing with daily reactive actions, weekly planned improvement actions and semi-annual checking and reporting of the provided results. As a summary, let us retain the following aspects.
Methodologically:
daily, the Engineering_and_Maintenance manager diagnoses, prioritises and reacts to the stop causes that occurred;

weekly, the Engineering_and_Maintenance department manager, the Production_department manager and the Quality_department manager discuss reducing the time stops related to the detected priority stop cause or root-cause and improving the OEE with regard to the declared strategic objective;

semi-annually, the Head managers and the three involved department managers analyse the considered semester trend of the OEE with regard to the expected annual objective, leading them, on the one hand, to appreciate the obtained benefit and, on the other hand, to make forecasts on what will happen, either if nothing is done or if such an evolution or another occurs.
From the information processing point of view:
the Engineering_and_Maintenance department manager deals daily with the numerical instantaneous root-cause stop measurements for quick and often corrective actions;

the Engineering_and_Maintenance department manager, the Production_department manager and the Quality_department manager analyse what happened during the week by observing the weekly numerical measurements, respectively in terms of short-term stop root-cause reduction and long-term OEE improvement;

the Engineering_and_Maintenance department manager, the Production_department manager and the Quality_department manager report monthly the OEE measurement according to the annual objective.
Enriching the scorecards used at these different levels by providing temporal performance expressions is the object of the next sections, by suggesting the use of both a numerical crisp format and a symbolic fuzzy format for the multiple introduced performance expressions of the OEE.

3 The temporal performance expression model
From a temporal point of view, one can say that managers have then three different practices of the OEE that can be recalled and turned into three questions.
What can be said each day?
What can be said each week?
What can be forecast each semester?
Therefore, taking time into consideration leads to reformulate the questions in more general terms:
At the present time, what can be said?
At the present time, what can be learnt from the past?
At the present time, what can be expected in the future?
The sequel of this paper addresses the questions in terms of performance expressions, leading to distinguish respectively the instantaneous performance expression, the trend performance expression and the predictive performance expression. Before introducing these nuances, let us go back to the performance expression concept.
3.1 General concepts
By assuming that an objective is “an expected value associated with a variable, to be reached at the end of a temporal horizon” (Berrah and Foulloy2013; Foulloy and Berrah2015), three parameters are involved in the performance expression computation:
the variable that handles the physical or decisional element on which improvement, control and computation are carried out;
the objective that identifies the expected state;
the measurement that identifies the reached state.
The performance expression is, in this context, the general concept which consists of comparing, for a given variable, the measurement and the objective. Three different semantics of the performance expression are moreover distinguished (Berrah et al.2000).
When the control of the achievement of the objective is local or operational, i.e. clear enough for anyone concerned with it, the measurement can be the performance expression itself. The performance expression is the physical measurement.
When the control of the achievement of the objective is overall, implying multiple managers, avoiding ambiguity and mistaken interpretations leads to explicitly comparing the measurement to the objective. The comparison provides a new piece of information which is called the performance measurement. Ratio and distance operators are the functions generally used in this case for making such a comparison, on numerical universes, expressing the performance in physical units or in an absolute manner.
Finally, when a specific semantic is given to the comparison, be it a judgment or a satisfaction, the performance expression is said to be a performance evaluation. In this particular case, two values are highlighted, the one which characterises the total satisfaction with regard to the achievement of the objective, and the other which characterises the total non-achievement of this objective. Such values correspond in particular to the cases of a total achievement and a total non-achievement of the objective respectively, and are typically 1 and 0, Good and Bad, or a smiling and a frowning symbol. The performance expression is then defined on a totally ordered set whose bounds are the previous values, e.g. the interval [0,1], the set {Bad, Quite-Bad, Medium, Quite-Good, Good} or a set of ordered symbols such as emoticons.
Let us now consider the temporal deployment of the performance expression, leaving aside its physical measurement semantic, and separately dealing with its measurement and its evaluation semantics. For the sake of conciseness, we choose to only develop here the numerical format of such expressions; more details concerning the linguistic format can be found in (Clivillé et al.2014).
3.2 Instantaneous performance expression
The instantaneous performance expression is the computation, at a given time, of the performance expression. Let $v$ be a variable associated with an objective and $t_i$ a timestamp with $i \in \{1...n\}$.

$o_i(v) \in O$ is the objective associated with $v$ at $t_i$.

$m_i(v) \in M$ is the measurement associated with $v$ at $t_i$.

$\pi_i^{inst}(v) = \pi^{inst}(o_i(v), m_i(v)) \in F^{inst}$ is the performance measurement at $t_i$, where $\pi^{inst}$ is the comparison function from $O \times M$ to $F^{inst}$ for the instantaneous performance measurement.

$p_i^{inst}(v) = f^{inst}(o_i(v), m_i(v)) \in P^{inst}$ is the performance evaluation at $t_i$, where $f^{inst}$ is the comparison function from $O \times M$ to $P^{inst}$ for the instantaneous performance evaluation.
For the sake of clarity, the industrial practice that relabels the timestamps $t_i$ using either days, weeks or months is adopted.
Example 1. By considering the Bathroom furniture line and its OEE, one can see that the retained Captured part root-cause stop has evolved (see Fig. 4), from Monday 03/07/16 to Friday 03/11/16, according to the data given in Fig. 5.
Fig. 5. Objectives and measurements of the Captured part stop root-cause from Monday 03/07/16 to Friday 03/11/16.
$v = \text{Captured part}$, $m_{Tue}(\text{Captured part}) = 32$ min and $o_{Tue}(\text{Captured part}) = 25$ min.

Let us define $\pi^{inst}$ such that $\forall (x, y) \in O \times M$, $\pi^{inst}(x, y) = x - y$. Then:

$\pi_{Tue}^{inst}(\text{Captured part}) = o_{Tue}(\text{Captured part}) - m_{Tue}(\text{Captured part}) = -7$ min.

Let us define $f^{inst}$ such that $\forall (x, y) \in O \times M$, $f^{inst}(x, y) = 1$ if $\pi^{inst}(x, y) \geq -5$ and $0$ otherwise. Then: $p_{Tue}^{inst}(\text{Captured part}) = 0$.

Let us remark that the performance evaluation is Boolean: the Engineering_and_Maintenance department manager, looking to radically reduce this stop, judges himself totally unsatisfied when this stop exceeds the objective by more than 5 min.
3.3 Variation, trend and trend performance expression
As soon as past values are available, one may compute the variation of these values. Let $t_i$ be a timestamp and $k$ a positive integer. Let $u_i$ and $u_{i-k}$ respectively be the values of $u$ at $t_i$ and $t_{i-k}$. The variation of $u$ at $t_i$, over $k$, is simply defined by $\Delta_k(u_i) = u_i - u_{i-k}$. Possibly the variation can be computed after some processing such as a linear regression; in this case, the processed value is denoted $\tilde{u}_i$.
Trend is an intuitive concept: it is "a general direction in which something is developing or changing" (Trend2018). When time is considered, trends are one way to express what can be learnt from the past. It is interesting to note that trends can be expressed very roughly, e.g. the OEE is increasing. A kind of weighting can also be introduced, e.g. the OEE increased a lot or the OEE was almost stable. Trends can also be expressed more precisely by the coefficients of linear regressions. The trend of $u$ at $t_i$, over $k$, will be denoted $Tr_k(u_i) \in Tr$.
The instantaneous trend performance expression, or for short, the trend performance expression, is obtained by comparing the trend of the objective to the trend of the measurement. Thus, the trend performance measurement and the trend performance evaluation are respectively defined in (8):

\[ \forall i \in \{k+1...n\}, \quad \pi_i^{trend}(v) = \pi^{trend}\big(Tr_k(o_i(v)), Tr_k(m_i(v))\big), \quad p_i^{trend}(v) = f^{trend}\big(Tr_k(o_i(v)), Tr_k(m_i(v))\big) \tag{8} \]
Example 2. Considering the week 10 objectives and measurements related to the Captured part stop root-cause reduction (see Fig. 5), a linear regression on both data over five days (Fig. 6) leads to:

\[ Tr_4(o_{Fri}(\text{Captured part})) = \frac{\Delta_4(o_{Fri}(\text{Captured part}))}{t_{Fri} - t_{Mon}} = -4.5 \]

\[ Tr_4(m_{Fri}(\text{Captured part})) = \frac{\Delta_4(m_{Fri}(\text{Captured part}))}{t_{Fri} - t_{Mon}} = -3.8 \]
Fig. 6. Linear regression used to compute the variation of the Captured part stop root-cause trend.
A negative angle between the regression line associated with the objective and the one associated with the measurement indicates that the observed variation is lower than the expected one. In other words, the measurement has not decreased enough. The angle between the two lines is the difference of the arc tangents of their respective slopes. However, since the arctan function is strictly increasing, the sign of the angle is the same as the sign of the difference of the trends. The performance measurement, for a given integer $k$, is given by the following comparison function:

\[ \forall (x, y) \in Tr \times Tr, \quad \pi^{trend}(x, y) = x - y \tag{9} \]
It leads to $\pi_{Fri}^{trend}(\text{Captured part}) = Tr_4(o_{Fri}(\text{Captured part})) - Tr_4(m_{Fri}(\text{Captured part})) = -0.7$. This value summarises the observation of the week trajectories. The managers note that the stop reduction related to the Captured part has not evolved as expected.
As seen in Section 3.1, the evaluation ranges from 0 to 1, where 1 indicates a total satisfaction of the managers, and in this case, a full achievement of the objective in terms of trends. It is the case when the absolute value of the measurement trend is greater than or equal to the absolute value of the objective trend. A total non-satisfaction, in the case of a total non-achievement of the trend objective, corresponds to a measurement trend equal to 0 or having an opposite sign to the objective trend. Between those two bounds, the evaluation can be computed as the ratio between the trends. In this case, the comparison function is:

\[ \forall (x, y) \in Tr \times Tr, \quad f^{trend}(x, y) = \max(0, \min(1, y/x)) \tag{10} \]

It gives $p_{Fri}^{trend}(\text{Captured part}) = \max(0, \min(1, Tr_4(m_{Fri}(\text{Captured part})) / Tr_4(o_{Fri}(\text{Captured part})))) = 0.84$.
Such a value hence expresses a quasi-equality between the objective trend and the measurement trend, leading to a quasi-total satisfaction of the Engineering_and_Maintenance department manager.
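A minimal sketch of the trend computation follows. The daily series are illustrative placeholders, not the actual Fig. 5 values; only the structure of (9) and (10), with a least-squares slope playing the role of the trend, is reproduced.

```python
# Trend performance expressions of Section 3.3, using a least-squares slope as Tr_k.

def slope(values):
    """Least-squares slope of a series sampled at t = 0, 1, ..., n-1."""
    n = len(values)
    t_mean = (n - 1) / 2.0
    v_mean = sum(values) / n
    num = sum((t - t_mean) * (v - v_mean) for t, v in enumerate(values))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def pi_trend(obj_series, meas_series):
    """Trend performance measurement (9): difference of the two regression slopes."""
    return slope(obj_series) - slope(meas_series)

def f_trend(obj_series, meas_series):
    """Trend performance evaluation (10): ratio of the slopes, clipped to [0, 1]."""
    return max(0.0, min(1.0, slope(meas_series) / slope(obj_series)))

objectives = [35, 25, 22, 20, 17]    # placeholder daily objectives (min), Mon to Fri
measurements = [36, 32, 25, 23, 21]  # placeholder daily measurements (min)
print(pi_trend(objectives, measurements))           # negative: the stop has not decreased enough
print(round(f_trend(objectives, measurements), 2))  # between 0 and 1
```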
Let us note that another possibility is to build the performance evaluation from the performance measurement using a normalisation function $g$ from $[0, \infty[$ to $[0,1]$ such that $g(0) = 1$ and $\lim_{x \to \infty} g(x) = 0$. Among the many functions $g$ satisfying the given properties, one can take the sigmoid-like function defined by (11):

\[ \forall x \in [0, \infty[, \quad g(x) = \frac{1 + e^{-\lambda c}}{1 + e^{\lambda (x - c)}} \tag{11} \]

This function has the interesting property of making it possible to tune the sharpness of the function with the parameter $\lambda$ around the point $x = c$. In the considered case, the comparison function $f^{trend}$ is defined as:

\[ \forall (x, y) \in Tr \times Tr, \quad f^{trend}(x, y) = g(\pi^{trend}(x, y)) \tag{12} \]
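A minimal numerical sketch of (11) follows; the sharpness parameter (written lam in the code) and the value of c are illustrative choices.

```python
import math

# Sketch of the normalisation function g of (11): g(0) = 1 and g(x) tends to 0
# as x grows. The sharpness parameter (lam) and the point c are illustrative choices.

def g(x: float, c: float = 1.0, lam: float = 5.0) -> float:
    return (1 + math.exp(-lam * c)) / (1 + math.exp(lam * (x - c)))

for x in (0.0, 0.5, 1.0, 2.0):
    print(x, round(g(x), 3))  # 1.0, then decreasing towards 0; g(c) is about 0.5
```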
3.4 Predictive performance expression
As noticed in Section 3.3, trends can be understood in the sense of a linear regression, just as in the well-known TREND function of Excel6. Applied to time-related variables, the linear regression makes it possible to predict the value at a given time from the past values, fitted in the least mean square sense. Thus, the idea is to compute, at $t_i$, the expected performance expression at $t_j$ with $j > i$ based on the past measurements, for example from a linear regression on these values. Let us denote $\hat{m}_{i,j}(v)$ the predicted measurement value for $t_j$, computed at $t_i$. The predictive performance measurement and the predictive performance evaluation are respectively defined in (13):

\[ \forall i, j \in \{k+1...n\} \text{ and } j > i, \quad \pi_{i,j}^{pred}(v) = \pi^{pred}\big(o_j(v), \hat{m}_{i,j}(v)\big), \quad p_{i,j}^{pred}(v) = f^{pred}\big(o_j(v), \hat{m}_{i,j}(v)\big) \tag{13} \]

6 The TREND formula is entered directly in Excel's formula bar.
Once again, if the predicted measurement is computed from $k$ past values, the first predicted performance expression can be computed at $t_{k+1}$.

Example 3. By observing the OEE evolution during the first four months of 2016, managers report the following data (Table 2) and begin their analyses. They observe that the January result is not the one expected.

$t_i$          $t_{Jan}$   $t_{Feb}$   $t_{Mar}$   $t_{Apr}$
$o_i(OEE)$     64.0%       65.0%       66.0%       67.0%
$m_i(OEE)$     61.8%       62.7%       63.4%       65.0%

Table 2. The monthly OEE objectives and measurements for the beginning of the year 2016
The predicted measurement for $t_{Jun}$, using a linear regression with $k = 3$, is $\hat{m}_{Apr,Jun}(OEE) = 67.9\%$. Using also the familiar performance measurement ratio operator for evaluating the predicted performance:

\[ p_{Apr,Jun}^{pred}(OEE) = \frac{\hat{m}_{Apr,Jun}(OEE)}{o_{Jun}(OEE)} = \frac{67.9\%}{69.0\%} = 0.98 \]

Namely, this result means that if the improvements provide the same quantified results as those obtained during the first four months, the semi-annual objective will be quasi totally achieved.
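A minimal sketch of this mechanism is given below. The regression convention (window and extrapolation step) is an assumption, so the obtained figures may differ slightly from those of Example 3; only the structure of (13) with the ratio operator is reproduced.

```python
# Predictive performance expression of Section 3.4: a least-squares line is fitted
# on the k last measurements and extrapolated to a future month, then compared to
# the objective with the ratio operator. The fitting convention is an assumption,
# so the result may differ slightly from the 67.9% reported in Example 3.

def predict(measurements, steps_ahead: int) -> float:
    """Extrapolate a least-squares line `steps_ahead` periods after the last value."""
    n = len(measurements)
    t_mean = (n - 1) / 2.0
    v_mean = sum(measurements) / n
    slope = sum((t - t_mean) * (v - v_mean) for t, v in enumerate(measurements)) \
            / sum((t - t_mean) ** 2 for t in range(n))
    return v_mean + slope * (n - 1 + steps_ahead - t_mean)

def p_pred(objective: float, predicted_measurement: float) -> float:
    """Predictive performance evaluation with the ratio operator, clipped to [0, 1]."""
    return max(0.0, min(1.0, predicted_measurement / objective))

past = [62.7, 63.4, 65.0]                 # k = 3 last monthly OEE measurements, Feb to Apr (%)
m_hat_jun = predict(past, steps_ahead=2)  # June is two months after April
print(round(m_hat_jun, 1))                # predicted June OEE (%)
print(round(p_pred(69.0, m_hat_jun), 2))  # compared to the June objective of 69.0%
```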
4 Fuzzy approach for visual representation of performance expressions
In the current visual management context, visual representations of information are often associated with rough numbers. It is quite conventional to find coloured emoticons, coloured arrows, sun/cloud pictures, even traffic lights or coloured stickers. Such representations rely on J. Bertin's visual variables developed in the 1960's (Bertin2005). The book published by E. Tufte in the 1980's also contributed to the development of data visualisation (Tufte2001). Building parameterised visual representations is not a new concept. Indeed, in 1973, H. Chernoff proposed a system for representing points in a multidimensional space by using faces depending on 18 parameters (Chernoff1973). An extension to 36 parameters was then proposed (Flury and Riedwyl1981): the face is split into the right-hand side and the left-hand side, each side relying on 18 parameters. However, dependencies between parameters make it quite difficult to use for the representation of performance
evaluations, which are mainly monodimensional pieces of information. In their recent paper concerning the development of dashboards for SMEs, Vilarinho et al. focused on the dashboard layout, but very little is said about the visual features: "information presented through trend charts and others graphics; quality tools; tables; images; white space; among others" (Vilarinho et al.2018). Besides the conventional visual items used in dashboards, companies are also using other visual expressions such as emoticons or suns/clouds, widely used in emails or instant messaging, to share information at a glance. Surprisingly, "there are no economic studies exploring the impact of non verbal cues, despite their prominence in communication" (Brook and Servátka2016) and, to the best of our knowledge, these visual performance expressions are limited to their basic display, including the use of the Wingdings font for emoticons. In their paper, Schultz et al. illustrate the use of emoticons to send households an "injunctive message conveying that their energy consumption was either approved or disapproved" (Schultz et al.2007). In the home energy reports, the linguistic terms "Great", "Good", "Below average" are associated with emoticons. For example, the smiling face emoticon is doubled for the term "Great" (Allcott2011). In the same field of energy saving, Chiang et al. have investigated the influence of simple coloured emoticons, such as the ones used to try to slow down speeding drivers, for energy display interfaces. They showed that the interaction, evaluated as the response time, is better with coloured emoticons than with black-on-white ones (Chiang et al.2012). In the same spirit, the performance evaluation should be strongly related to the manager's satisfaction or judgement. It should represent to which extent he is satisfied by the observed state with respect to the expected one, defined by the objective. Indeed, the performance evaluation represents to which degree the manager's satisfaction is, for example, "high", "medium" or "low". In other words, it characterises the grade of membership to the classes of respectively high, medium or low satisfaction. The classes are understood with gradual bounds, and hence the satisfaction changes progressively from low to medium and from medium to high. Thus, the performance evaluation is a question of membership degree to a class. The fuzzy subsets theory is a convenient approach to address such a question. Its related concepts and the difference with regard to the probability theory have been widely explained since L. Zadeh's seminal paper in 1965 (Zadeh1965; Zadeh1973; Zadeh1978; Dubois et al.2000).
The aim of this section is to introduce a quite general method to build visual representations of performance expressions, by handling the considered control on the one hand, and the performance expression semantic on the other hand. From a general point of view, a visual representation can be defined as the result of a mapping $visual(type, \theta)$ where $type$ is the type of the representation, e.g. arrow, emoticon…, and $\theta$ is a vector of parameters associated with the given type. For example, a coloured arrow has a colour, a size, an angle… Thus, the visual representation of a performance expression is, for a given type, the transformation of the performance expression, i.e. either the physical measurement, the performance measurement or the performance evaluation, into the vector $\theta$ for this type. Let us explain this point on the visual representation of a performance evaluation expressed in $[0,1]$ by an emoticon whose colour and smile depend on the performance evaluation expression. Assume this figure has two parameters $\theta_{colour}$ and $\theta_{smile}$ representing respectively the colour, defined for example as a vector in the RGB space, and the smile, defined for example in $[-1,1]$ where $-1$ gives a grimace and $1$ a smile. Then, the performance evaluation has to be transformed by two mappings, the first one from $[0,1]$ to the RGB space and the second one from $[0,1]$ to $[-1,1]$. Assume that $x$ is a numerical value to be converted into the $k$th parameter of the representation, denoted $\theta_k$. The method consists of using an intermediate space where it is simple, for a human being, to define prototype parameters of the representation, e.g. the value of $\theta$ for which the emoticon should be fully green and smiling. Thus, a fuzzy numeric-to-linguistic transformation is first performed, then follows a fuzzy linguistic-to-numeric transformation defined on the parameter spaces (Clivillé et al.2014).
The fuzzy numeric-to-linguistic transformation, also called fuzzification, is related to the concept of descriptor set (Zadeh1971) introduced from the relation between numbers and linguistic terms. Let $X$ be a set of numbers and $L$ a set of linguistic terms. The meaning of a term $l \in L$, denoted $M(l)$, is defined by its membership function $\mu_{M(l)}$, where $M$ is a mapping from $L$ to $F(X)$ and $F(X)$ is the set of all fuzzy subsets on $X$. Thus, $\mu_{M(l)}(x)$ is the grade of membership of $x$ to the meaning of $l$. Conversely, the fuzzy description of a number $x$, denoted $D(x)$, is defined by its membership function $\mu_{D(x)}$, where $D$ is a mapping from $X$ to $F(L)$ and $F(L)$ is the set of all fuzzy subsets of $L$. Thus, $\mu_{D(x)}(l)$ represents to which extent the number $x$ can be described by the linguistic term $l$. Because the fuzzy meaning and the fuzzy description are two different ways of representing the same relation between numbers and linguistic terms, the following equation holds:

\[ \forall l \in L, \forall x \in X, \quad \mu_{D(x)}(l) = \mu_{M(l)}(x) \tag{14} \]
The linguistic-to-numeric transformation, also called defuzzification, relies on more ad hoc approaches. One possibility is a weighted mean. Let $x \in [0,1]$ be a performance expression to be represented visually and $\theta_k$ the $k$th parameter of the visual representation. Let $\alpha_l$ be the grades of membership of $D(x)$. For the sake of simplicity, it is assumed that $\sum_l \alpha_l = 1$. It means that the membership functions used for the fuzzification define a Ruspini's fuzzy partition (Ruspini1969). Let $y_{k,l}$ be the $k$th prototype parameter for the linguistic term $l$, i.e. the numerical value that $\theta_k$ should take when $\alpha_l = 1$ (under the fuzzy partition assumption, $\alpha_j = 0$ for $j \neq l$). Then, denoting $\delta_k$ the $k$th defuzzification, the parameter $\theta_k$ is given by (15):

\[ \theta_k = \delta_k(D(x)) = \frac{\sum_l \alpha_l \, y_{k,l}}{\sum_l \alpha_l} = \sum_l \alpha_l \, y_{k,l} \tag{15} \]
If the membership functions are triangular or trapezoidal, this method provides a simple means to build a multilinear transformation of the space of the performance expression to the space of visual parameters.
Example 4. Let $x \in [0,1]$ be the performance evaluation to be represented by an emoticon and $L = \{Low, Medium, High\}$ a set of linguistic terms whose fuzzy meanings are represented in Fig. 7.
Fig. 7. Fuzzy meaning of the linguistic terms Low, Medium and High.

Using Zadeh's representation of discrete fuzzy subsets, when $x = 0.85$ the fuzzification is the linguistic fuzzy subset given by $D(0.85) = 0/Low + 0.75/Medium + 0.25/High$.
Considering the coloured emoticon with its two parameters colour and smile and the fuzzy partition represented in Fig. 7, the visual representation of a performance evaluation of 0.85 is given in Fig. 8, based on the following prototype parameters.
\[ y_{colour,High} = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \quad y_{colour,Medium} = \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \quad y_{colour,Low} = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} \quad \text{and} \quad y_{smile,High} = 1, \quad y_{smile,Medium} = 0, \quad y_{smile,Low} = -1 \]
Fig. 8. Coloured emoticon associated with a performance evaluation of 0.85.
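The following sketch chains the fuzzification and the defuzzification of Example 4. The kernel points of the triangular partition used in the fuzzification are illustrative assumptions (they do not exactly reproduce Fig. 7); the description D(0.85) and the prototype parameters are taken from the example.

```python
# Fuzzification / defuzzification sketch for the coloured emoticon of Example 4.
# The kernel points used in `fuzzify` are illustrative and do not exactly reproduce
# the partition of Fig. 7; the description of 0.85 and the prototypes come from the text.

def fuzzify(x, kernels):
    """Membership degrees of x in a triangular Ruspini partition given by its kernels."""
    terms = sorted(kernels, key=kernels.get)
    degrees = {t: 0.0 for t in terms}
    for t1, t2 in zip(terms, terms[1:]):
        a, b = kernels[t1], kernels[t2]
        if a <= x <= b:
            degrees[t2] = (x - a) / (b - a)
            degrees[t1] = 1.0 - degrees[t2]
    if x <= kernels[terms[0]]:
        degrees[terms[0]] = 1.0
    if x >= kernels[terms[-1]]:
        degrees[terms[-1]] = 1.0
    return degrees

def defuzzify(degrees, prototypes):
    """Weighted mean (15) of the prototypes (degrees sum to 1 on a Ruspini partition)."""
    return sum(degrees[t] * prototypes[t] for t in degrees)

print(fuzzify(0.85, {"Low": 0.0, "Medium": 0.5, "High": 1.0}))  # illustrative partition

description = {"Low": 0.0, "Medium": 0.75, "High": 0.25}  # D(0.85) as given in Example 4
smile = {"Low": -1.0, "Medium": 0.0, "High": 1.0}
red = {"Low": 1.0, "Medium": 1.0, "High": 0.0}            # red channel of the RGB prototypes
green = {"Low": 0.0, "Medium": 1.0, "High": 1.0}          # green channel (blue channel is 0)

print(defuzzify(description, smile))   # 0.25 -> slight smile
print(defuzzify(description, red),
      defuzzify(description, green))   # (0.75, 1.0) -> yellow-green colour
```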
Let us now go back to the Fournier Company OEE practice and show how the formal developments proposed in Sections 3 and 4 are applied to enrich the scorecards of the managers.

5 Temporal performance expressions for the OEE control

5.1 Daily performance expressions
The daily control of the OEE consists of reducing the different stop causes and root-causes (see Section 2.2). According to the monthly updated root-cause stops Pareto diagram of the Bathroom furniture line (see Fig. 4), one or a few priority root-causes are retained to be continuously reduced "as much as possible". From Fig. 9, which shows the March reference Pareto diagram, the Engineering_and_Maintenance department manager retains the Captured part root-cause stop to be reduced.
Fig. 9. Extract of the OEE March 2016 Pareto diagram.
At this short-term level, the objectives are daily declared based on the result of the corrective actions of the previous day. For example, the objective for Tuesday 03/08/16 was $o_{Tue}(\text{Captured part}) = 25$ min. At the end of this day, the measurement was $m_{Tue}(\text{Captured part}) = 32$ min (see Example 1 and Fig. 5). The instantaneous performance measurement was $\pi_{Tue}^{inst}(\text{Captured part}) = -7$ min, from which the Engineering_and_Maintenance department manager decided to keep the same objective for Wednesday, i.e. $o_{Wed}(\text{Captured part}) = 25$ min. Beyond the provided performance measurements, Boolean performance evaluations are added, according to the threshold of 5 min. Such expressions are visually represented by means of sun/cloud images (Fig. 10), in order to provide a quick understanding of the success of the corrective actions. Thus, a sun means that the result of the corrective actions is considered successful while a cloud means the contrary.
Fig. 10. The Tuesday 03/08/2016 and Wednesday 03/09/2016 instantaneous performance measurements and evaluations.
5.2 Weekly performance expressions
Once a week, the three managers have a meeting (see Section 2.2). The goal of such a meeting is twofold. They check, on the one hand, that the daily corrective actions are going in the right direction. They assess, on the other hand, the planned improvement actions, those which are associated with the OEE annual objective.
Thus, on Friday 03/11/16 afternoon, the three managers observed the scorecard related to the Captured part root-cause stop (Fig. 11). Such a scorecard contains the objectives and the measurements (upper graph). For each day, the instantaneous performance measurements and the Boolean performance evaluations, represented as suns and clouds, are provided (middle part). The trend performance measurement is the last piece of information (lower part), provided both in the numerical form, as seen in Example 2 (see Section 3.3), and with a coloured red/green arrow, always for quick understanding.
Fig. 11. The scorecard for the Captured part from Monday 03/07/2016 to Friday 03/11/2016.
As it can be seen, even if the corrective actions applied Monday, Wednesday and Thursday were evaluated as successful, which is not the case for Tuesday and Friday, the measurement values did not decrease enough over the week since the sign of the trend performance measurement is still negative.
Besides, on Friday 03/11/16 afternoon, the three managers also discussed the improvement action plan. To do this, the company uses the “Obeya chart” (Sasha and Siavash2015). This type of chart is a collaborative tool that
is shared by the different actors concerned by the considered action plan. It is defined to gather all the main useful data for the action plan control of the long-term objective.
For the Bathroom furniture line and its OEE annual objective, actions are planned by trimester, detailed for the current month and updated each week of this month. The elementary action is the weekly one, visualised by a post-it with a brief description. The colour of each post-it identifies the action holder. Each post-it is placed in a cell identified by a column, which corresponds to a period, and a row, which corresponds to a department, as shown in the Obeya chart model of Fig. 12. A picture of such an Obeya chart from the Fournier Company for the Bathroom furniture line in March 2016 is shown in Fig. 13.
Fig. 12. Model of Obeya chart for the period beginning the 5th week.
Fig. 13. Picture of the Bathroom furniture line Obeya chart for the period beginning the 5th week (in French).

On this day, the managers wished to have an idea about the OEE performance expression. During the week, numerous stops occurred, leading them to become worried about the PI evolution, even if considering such a PI over such a period led to questioning the accuracy of the interpretations. By considering the temporal deployment of the OEE objective (see Table 1), they interpolated that $o_{W10}(OEE) = 65.4\%$. They computed $p_{W10}^{inst}(OEE) = 0.96$, whose coloured emoticon is shown in Fig. 14. Could the impact of the stop causes have been minor?
Fig. 14. Instantaneous performance evaluations of the OEE for Week 10, 2016.
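As an illustration of how the weekly objective can be obtained from Table 1, the sketch below interpolates linearly between consecutive monthly objectives, the monthly value being assumed to be reached at the end of its month. This convention reproduces the 65.4% mentioned above; the weekly measurement is an assumed value consistent with the reported evaluation of 0.96.

```python
from datetime import date

# Interpolation of the weekly OEE objective from the monthly deployment of Table 1.
# Convention (an assumption): the monthly objective is reached at the end of its month,
# so within a month the objective moves linearly from the previous month's value.

def interpolated_objective(d: date, month_objectives: dict) -> float:
    o_cur = month_objectives[d.month]
    o_prev = month_objectives.get(d.month - 1, o_cur)
    next_month_first = date(d.year + d.month // 12, d.month % 12 + 1, 1)
    days_in_month = (next_month_first - date(d.year, d.month, 1)).days
    return o_prev + (o_cur - o_prev) * d.day / days_in_month

objectives_2016 = {m: 63.0 + m for m in range(1, 13)}  # 64.0% in January up to 75.0% in December
o_w10 = interpolated_objective(date(2016, 3, 11), objectives_2016)  # Friday of week 10
m_w10 = 62.8  # assumed weekly OEE measurement (%), consistent with the reported 0.96
print(round(o_w10, 1))                    # 65.4
print(round(min(1.0, m_w10 / o_w10), 2))  # 0.96
```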
5.3 Semi-annual performance expressions
At the end of 2015, the measurement for the OEE was equal to "61.5%" when the objective was "64.0%" for January 2016 (see Table 1 and Section 2.2). The scorecard shown in Fig. 15 was prepared for the semi-annual meeting to be held on 06/30/2016. The upper subplot recalls the monthly objective deployment for the year, the obtained measurement values until June 2016 and the linear regression on the measurement values, based on the last six months. Both instantaneous and predictive performance evaluations were displayed and represented by means of coloured emoticons.
In the spirit of what has been done in Example 3 (see Section 3.4), the linear regression is used to compute the prediction of the measurement in December on the basis of what is known in June. It leads to $\hat{m}_{Jun,Dec}(OEE) = 74.2\%$. The predictive performance evaluation for $t_{Dec}$ is computed at $t_{Jun}$, always using the ratio as $f^{pred}$, leading to:

\[ p_{Jun,Dec}^{pred}(OEE) = \frac{\hat{m}_{Jun,Dec}(OEE)}{o_{Dec}(OEE)} = 0.99 \]
Fig. 15. Scorecard for the semi-annual meeting on June 30, 2016.
Of course, this is only a prediction: it means that if the execution of the action plan remains the same, the instantaneous performance at the end of 2016 should be very satisfactory; in other words, the objective $o_{Dec}(OEE) = 75\%$ should be reached. However, even though this study is essentially focused on methodology and formalisation, economic benefits have also been obtained. In addition to the well-known benefits of knowing the instantaneous OEE performance, having an OEE trend and a prediction for the following months allows the three managers to better plan the capacity. In order to adjust it, they now level the load using the Master Production Schedule or modify the working time in accordance with the Human Resources practice of the company. They thus obtain the right capacity at the lowest cost, rather than resorting in real time to costly levers such as interim workers or overtime. In addition, the Fournier Company has reduced the delivery delays, avoiding the ensuing delay penalties and additional delivery fees. Knowing the three types of OEE performance should therefore improve the company's profitability.

6 Discussion and conclusion

Planning is not reacting. Observing is not checking. Improving can be more than correcting. Even if all these actions subscribe to the control and Deming-wheel improvement philosophy, each of them involves specific decisions, temporal horizons and information. Achieving the objectives and providing the right information at the right moment in the right format have been the keywords of this study.
Dealing with the PI’s and PMS’s frameworks in the context of visual management and advanced technologies, the developed model has been built on two pillars. The first pillar is related to the three temporal dimensions of the performance expression, namely the instantaneous expression, the trend expression and the predictive expression. The second pillar concerns the three semantics of the performance expression, namely the physical measurement, the performance measurement and the performance evaluation. Based on the MES data and the fuzzy formalism, scorecards that associate a priori as well as a posteriori information, and numerical as well as symbolic information, are proposed, going in the direction of current industrial practice and enriching it by eliciting the links between what happened, how it happened and what can be expected.
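Purely as an illustration of these two pillars, the sketch below encodes a performance expression as a pair (temporal dimension, semantics); the class and field names are assumptions introduced here, not part of the paper's formalism.

from dataclasses import dataclass
from enum import Enum

class TemporalDimension(Enum):
    INSTANTANEOUS = "instantaneous"
    TREND = "trend"
    PREDICTIVE = "predictive"

class Semantics(Enum):
    PHYSICAL_MEASUREMENT = "physical measurement"
    PERFORMANCE_MEASUREMENT = "performance measurement"
    PERFORMANCE_EVALUATION = "performance evaluation"

@dataclass
class PerformanceExpression:
    indicator: str                 # e.g. "OEE"
    dimension: TemporalDimension
    semantics: Semantics
    value: object                  # numeric, linguistic or symbolic content

# Example: the instantaneous performance measurement of the OEE for Week 10.
expr = PerformanceExpression("OEE", TemporalDimension.INSTANTANEOUS,
                             Semantics.PERFORMANCE_MEASUREMENT, 0.96)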
The OEE improvement has been the core of this work. Since the OEE is at the same time a short-term operational PI and a mid- and long-term strategic PI, its specificities indeed call for the proposed development. The Fournier Company has been the partner of this study, and the proposed ideas have been mostly validated. Further works are in progress concerning formal as well as methodological points, in particular:
- the daily objective declaration, which is related to yesterday’s measurement;
- the exploitation of the Obeya chart, and the analysis of the impact of the short-term reactive actions on the planned improvement actions;
- the simultaneous handling of all the root-cause stops;
- the distinction between objectives, whether they are declared for “correcting” or for “improving”;
- the consideration of a continuous trajectory rather than a linear discrete deployment of the annual objective;
- the use of the universal ratio operator for expressing the OEE performance;
- the correspondence between the symbolic expressions of the visual management principle, the performance expression semantics and the nature of the executed actions;
- the integration of the model into the company’s MES;
- and lastly, the extension of this way of expressing the OEE performance to the other PI’s of the line, leading to a joint analysis of them and to an enhancement of the synergies between the actions associated with them.
More generally, the temporal performance model provides a kind of toolbox for expressing the performance. Managers may choose an instantaneous, trend or predictive performance expression to deal respectively with the present, the past and the future. They may also consider performance measurement as well as performance evaluation. Furthermore, numerical, linguistic or symbolic information can be taken into consideration and processed. The axes resulting from the Fournier Company experimentation thus have to be taken into consideration, both for the company and from a theoretical point of view. In particular, the temporal aggregation issue will need to be studied.
7 References

AFNOR (2002). NF E60-182 : Moyens de production - Indicateurs de performances - Taux de rendement synthétique (TRS) - Taux de rendement global (TRG) - Taux de rendement économique (TRE). Ahuja, I. (2009). Total Productive Maintenance. Springer, London. Allcott, H. (2011). Social norms and energy conservation. Journal of Public Economics, 95(9):1082 – 1095. Special Issue: The Role of Firms in Tax Systems. Alsyouf, I. (2007). The role of maintenance in improving companies’ productivity and profitability. International Journal of Production Economics, 105(1):70 – 78. Anthony, R. N. (1965). Planning and control systems: a framework for analysis. Boston, Graduate School of
Business Administration, Harvard University. Badiger, A. S. and Gandhinathan, R. (2008). A proposal: evaluation of OEE and impact of six big losses on equipment earning capacity. International Journal of Process Management and Benchmarking, 2(3):234– 248. Berrah, L. (2002). L’indicateur de performance : concepts et applications. Cépaduès. Berrah, L. and Foulloy, L. (2013). Towards a unified descriptive framework for industrial objective declaration and performance measurement. Computers in Industry, 64(6):650–662. Berrah, L., Mauris, G., Haurat, A., and Foulloy, L. (2000). Global vision and performance indicators for an industrial improvement approach. Computers in Industry, 43(3):211–225. Bertin, J. (2005). Sémiologie graphique - Les diagrammes, les réseaux, les cartes. Ecole Des Hautes Etudes En Sciences Sociales - collection Les Réimpressions. Bititci, U. S., Suwignjo, P., and Carrie, A. S. (2001). Strategy management through quantitative modelling of performance measurement systems. International Journal of Production Economics, 69(1):15–22. Bitton, M. (1990). Ecograi : méthode de conception et d’implantation de systèmes de mesure de performances pour organisations industrielles. Thèse de doctorat en automatique, Université de Bordeaux I. Blokdyk, G. (2017). Manufacturing execution system: A Clear and Concise Reference. CreateSpace Independent Publishing Platform. Brook, R. and Servátka, M. (2016). The anticipatory effect of nonverbal communication. Economics Letters, 144:45 – 48. Browne, J., Devlin, J., Rolstadas, A., and Andersen, B. (1997). Performance measurement: the ENAPS approach. International Journal of Business Transformation, 69(2):73–84. Chernoff, H. (1973). The use of faces to represent points in k-dimensional space graphically. Journal of the American Statistical Association, 68(342):361–368. Chiang, T., Natarajan, S., and Walker, I. (2012). A laboratory test of the efficacy of energy display interface design. Energy and Buildings, 55:471 – 480. Cool Roofs, Cool Pavements, Cool Cities, and Cool World. Clivillé, V., Berrah, L., and Foulloy, L. (2014). Fuzzy symbolic handling of industrial instantaneous and trend performance expressions. In Grabot, B., Vallespir, B., Gomes, S., Bouras, A., and Kiritsis, D., editors, IFIP International Conference on Advances in Production Management Systems (APMS2014), volume 440, pages 68–75, Ajaccio, France. Springer Berlin Heidelberg. Clivillé, V., Berrah, L., and Mauris, G. (2007). Quantitative expression and aggregation of performance
measurements based on the MACBETH multi-criteria method. International Journal of Production Economics, 105(1):171–189. Cosmetatos, G. P. and Eilon, S. (1983). Effects of productivity definition and measurement on performance evaluation. European Journal of Operational Research, 4(1):31–35. Cross, K. and Lynch, R. (1988). The "SMART" way to define and sustain success. National Productivity Review, 1:23–33. da Silva, A. F., Marins, F. A. S., Tamura, P. M., and Dias, E. X. (2017). Bi-objective multiple criteria data envelopment analysis combined with the overall equipment effectiveness: An application in an automotive company. Journal of Cleaner Production, 157:278–288. Dal, B., Tugwell, P., and Greatbanks, R. (2000). Overall equipment effectiveness as a measure of operational improvement – a practical analysis. International Journal of Operations & Production Management, 20(12):1488–1502. Diakoulaki, D., Mavrotas, G., and Papayannakis, L. (1992). A multicriteria approach for evaluating the performance of industrial firms. Omega, 20(4):467–474. Dubois, D., Nguyen, H. T., and Prade, H. (2000). Possibility theory, probability and fuzzy sets: misunderstandings, bridges and gaps. . In Dubois, D. and Prade, H., editors, Fundamentals of Fuzzy Sets , The Handbooks of Fuzzy Sets Series, pages 343–438. Kluwer, Boston, Mass. Ducq, Y., Vallespir, B., and Doumeingts, G. (2001). Coherence analysis methods for production systems by performance aggregation. International Journal of Production Economics, 1:23–37. D’Antonio, G., Sauza Bedolla, J., and Chiabert, P. (2017). A novel methodology to integrate manufacturing execution systems with the lean manufacturing approach. Procedia Manufacturing, 11:2243–2251. 27th International Conference on Flexible Automation and Intelligent Manufacturing, FAIM2017, 27-30 June 2017, Modena, Italy. Flury, B. and Riedwyl, H. (1981). Graphical representation of multivariate data by means of asymmetrical faces. Journal of the American Statistical Association, 76(376):757–765. Fortuin, L. (1988). Performance indicators, why, where and how? European Journal of Operational Research, 34:1–9. Foulloy, L. and Berrah, L. (2015). A fuzzy handling of trend objective declaration and trend performance expression. In IFAC Symposium on Information Control in Manufacturing (INCOM2015), Ottawa, Canada. Ghalayini, A. M., Noble, J. S., and Crowe, T. J. (1997). An integrated dynamic performance measurement
system for improving manufacturing competitiveness. International Journal of Production Economics, 48(3):207–225. Hedman, R., Subramaniyan, M., and Almström, P. (2016). Analysis of critical factors for automatic measurement of OEE. Procedia CIRP, 57:128–133. Factories of the Future in the digital environment Proceedings of the 49th CIRP Conference on Manufacturing Systems. Hwang, G., Lee, J., Park, J., and Chang, T.-W. (2017). Developing performance measurement system for internet of things and smart factory environment. International Journal of Production Research, 55(9):2590– 2602. Imai, M. (1986). Kaizen: The Key to Japan’s Competitive Success. McGraw Hill Higher Education. ISO9000 (2000). Quality management systems – fundamentals and vocabulary. ISO 9000:2000, International Standards Organization. www.iso.org. Jelali, M. (2013). Control Performance Management in Industrial Automation: Assessment, Diagnosis and Improvement of Control Loop Performance. Springer. Johnson, H. T. (1975). Management accounting in early integrated industry - e.i. dupont de nemours powder company 1903-1912. Business History Review, pages 184–204. Jonsson, P. and Lesshammar, M. (1999). Evaluation and improvement of manufacturing performance measurement systems - the role of oee. International Journal of Operations & Production Management, 19(1):55–78. Kanigel, R. (2005). The One Best Way: Frederick Winslow Taylor And The Enigma Of Efficiency. MIT Press; New edition. Kaplan, R. and Norton, D. (1992). The balanced scorecard: Measures that drive performances. Harvard business Review, 70(1):71–79. Kurscheidt Netto, R. J., Santos, E. A. P., de Freitas Rocha Loures, E., and Pierezan, R. (2017). Using overall equipment effectiveness (OEE) to predict shutdown maintenance. In Amorim, M., Ferreira, C., Vieira Junior, M., and Prado, C., editors, Engineering Systems and Networks, pages 13–21, Cham. Springer International Publishing. Lycke, L. and Akersten, P. A. (2000). Experiences of implementing TPM in Swedish industries. International Journal of Reliability and Application, 1(1):1–14. Maran, M., Manikandan, G., and Thiagarajan, K. (2013). Overall Equipment Effectiveness Measurement: Weighted Approach Method and Fuzzy Expert System, pages 231–245. Springer Netherlands, Dordrecht.
Mari, L. (2003). Epistemology of measurement. Measurement, 34(1):17–30. McCarthy, D. and Rich, N. (2004). Lean TPM: A Blueprint for Change. Butterworth-Heinemann. Melnyk, S. A., Stewart, D. M., and Swink, M. (2004). Metrics and performance measurement in operations management: dealing with the metrics maze. Journal of Operations Management, 22(3):209–218. Micheli, P. and Mari, L. (2014). The theory and practice of performance measurement. Management Accounting Research, 25(2):147–156. Muchiri, P. and Pintelon, L. (2008). Performance measurement using overall equipment effectiveness (OEE): literature review and practical application discussion. International Journal of Production Research, 46(13):3517–3535. Mâinea, M., Dutã, L., Patic, P. C., and Cãciulã, I. (2010). A method to optimize the overall equipment effectiveness. IFAC Proceedings Volumes, 43(17):237–241. 5th IFAC Conference on Management and Control of Production Logistics. Nakajima, S. (1988). Introduction to TPM: Total Productive Maintenance. Preventative Maintenance Series. Productivity Press. Neely, A., Gregory, M., and Platts, K. (1995). Performance measurement system design: a literature review and research agenda. International Journal of Operations and Production Management, 48(4):80–116. Noble, J. S. and Lahay, C. W. (1994). Cycle time modeling for process improvement teams. In Proceedings of the 3rd Industrial Engineering Research Conference, pages 372–377, Atlanta, GA. Nudurupati, S. S., Bititci, U. S., Kumar, V., and Chan, F. T. S. (2011). State of the art literature review on performance measurement. Computers & Industrial Engineering, 60(2):279–290. Ohno, T. (1988). Toyota Production System: Beyond Large-scale Production. Production Press. Relkar, A. S. and Nandurkar, K. N. (2012). Optimizing & analysing overall equipment effectiveness (OEE) through design of experiments (doe). Procedia Engineering, 38:2973–2980. Rodrigues, M. and Hatakeyama, K. (2006). Analysis of the fall of TPM in companies. Journal of Materials Processing Technology, 179(1):276–279. 3rd Brazilian Congress on Manufacturing Engineering. Ruspini, E. (1969). A new approach to clustering. Information and Control, 15(1):22–32. Saha, D., Syamsunder, M., and Chakraborty, S. (2016). Manufacturing Performance Management using SAP OEE. Implementing and Configuring Overall Equipment Effectiveness. Apress. Sasha, S. and Siavash, J. (2015). Supporting Production System Development Through Obeya Concept. Lambert Academic Publishing.
Schlick, C. and Demissie, B. (2016). Evaluation of Complexity in Product Development, pages 159–214. Springer International Publishing, Cham. Schultz, P. W., Nolan, J. M., Cialdini, R. B., Goldstein, N. J., and Griskevicius, V. (2007). The constructive, destructive, and reconstructive power of social norms. Psychological Science, 18(5):429–434. PMID: 17576283. Shen, C.-C. (2015). Discussion on key successful factors of TPM in enterprises. Journal of Applied Research and Technology, 13(3):425–427. Shirose, K. (1997). TPM : total productive maintenance : new implementation program in fabrication and assembly industries. Japan Institute of Plant Maintenance, Tokyo, Japan. Singh, R., Shah, D. B., Gohil, A. M., and Shah, M. H. (2013). Overall equipment effectiveness (OEE) calculation - automation through hardware & software development. Procedia Engineering, 51:579–584. Sonmez, V., Testik, M. C., and Testik, O. M. (2018). Overall equipment effectiveness when production speeds and stoppage durations are uncertain. The International Journal of Advanced Manufacturing Technology, 95(1):121–130. Stadnicka, D. and Antosz, K. (2018). Overall equipment effectiveness: Analysis of different ways of calculations and improvements. In Hamrol, A., Ciszak, O., Legutko, S., and Jurczyk, M., editors, Advances in Manufacturing, pages 45–55, Cham. Springer International Publishing. Steenkamp, L. P., Hagedorn-Hansen, D., and Oosthuizen, G. A. (2017). Visual management system to manage manufacturing resources. Procedia Manufacturing, 8:455–462. Steinbacher, H. R. and Steinbacher, N. L. (1993). TPM for America. Productivity Press, Portland, Oregon. Tajiri, M. and Goto, F. (1992). TPM implementation, a Japanese approach. McGraw-Hill. Taylor, F. (1911). Principles of Scientific Management. Harper Bros. Trend (2018). https://en.oxforddictionaries.com/definition/trend. Tufte, E. (2001). The Visual Display of Quantitative Information (2nd Edition). Vernadat, F. (1996). Enterprise Modeling and Integration. Springer. Vilarinho, S., Lopes, I., and Sousa, S. (2018). Developing dashboards for smes to improve performance of productive equipment and processes. Journal of Industrial Information Integration. Zadeh, L. (1965). Fuzzy sets. Information and Control, 8:38–353. Zadeh, L. (1971). Quantitative fuzzy semantics. Information Sciences, 3:159–176. Zadeh, L. (1973). Outline of a new approach to the analysis of complex systems and decision processes. IEEE
Transactions on Systems, Man, and Cybernetics, SMC-3:28–44. Zadeh, L. (1978). Fuzzy sets as a basis for a theory of possibility. Fuzzy Sets and Systems, 1:3–28.