CIRP Annals - Manufacturing Technology 60 (2011) 503–506
Computational mechanics approach to managing complexity in manufacturing systems

R. Vrabič, P. Butala (2)*

Department of Control and Manufacturing Systems, University of Ljubljana, Slovenia
Keywords: Manufacturing system; Management; Complexity

Abstract
Complexity has been identified as a ubiquitous and ever increasing property of manufacturing systems. Conventional theories of management lack the tools to describe, analyse, and manage complexity, and can, in turn, no longer cope with the issues it gives rise to. New approaches are offered by complexity science, namely by computational mechanics. In the paper, a method for complexity assessment is proposed and illustrated on real industrial data. The results of the presented case study suggest a distinct relationship between complexity and throughput, and indicate that the tool used has a major impact on complexity. © 2011 CIRP.
1. Introduction

Modern manufacturing systems are widely perceived as complex; in fact, due to the globalisation of world markets, their complexity is continuously increasing [1]. Today, global competition takes place in an uncertain environment in which manufacturing organisations are subjected to interdependence risks and hard-to-determine causal relations. This influences all levels of organisations, all the way down to their shop floors. Managers are faced with the issues presented by increasing complexity. Their decision making, however, remains predominantly based on rules of thumb because of incomplete information, limited knowledge, and an insufficient understanding of complexity.

Recently, though, complexity science has emerged as an interdisciplinary field that attempts to provide a better understanding of complex systems. One of its branches is computational mechanics, a method for discovering and representing predictive patterns in time series [2].

The aim of the paper is to contribute to a better understanding of complexity in manufacturing systems through the application of computational mechanics. A method for assessing complexity in manufacturing systems is proposed. The method is illustrated on a case of serial production in an automotive supplier company.

* Corresponding author. doi:10.1016/j.cirp.2011.03.050. © 2011 CIRP.

2. Quantitative approaches to manufacturing complexity

The principal motivation for the study of complexity in manufacturing is the search for methods that would allow the degree of complexity to be reduced, making the manufacturing system more productive and predictable [3]. It has been hypothesised that a better understanding of complexity can be achieved through the development of quantitative metrics [1].

Two fundamental types of complexity have been identified: structural and operational (sometimes called dynamic) [4]. The former is concerned with the complexity of structures, e.g. the arrangements of machine tools, the types of products, organisational structures, etc. The latter is temporal, i.e. it focuses on phenomena that appear complex only when observed in time.

Past research has resulted in a number of quantitative approaches, ranging from ad hoc methods and methods based on information theory to, more recently, nonlinear dynamics. The majority of complexity definitions in manufacturing stem from the hypothesis that complexity equals uncertainty. The uncertainty approaches are based on Shannon entropy [5]. Several entropic definitions have been proposed for structural complexity [6–9] as well as for operational complexity [8,10]. However, entropic metrics have been criticised for always involving a certain degree of subjectivity [11]: their value is tied to the level of detail and to the limits of the boundary of the system under investigation [10]. This is partially resolved by choosing a common level of description, which then allows for comparisons, but no methods have been proposed for choosing this level. There are also other approaches towards a quantitative definition of complexity in manufacturing, such as nonlinear dynamics [11], but these are still at a comparatively early stage of development.

Complexity science offers a new approach to the complexity of manufacturing systems. As an interdisciplinary science, it facilitates the transfer of knowledge gained from the study of other systems to the manufacturing domain. In this sense computational mechanics, which was initially motivated by physics, provides an opportunity to apply the insights of complexity science to the manufacturing domain.
The main difference in comparison to other approaches is that complexity is associated with prediction.
3. Computational mechanics

Computational mechanics addresses the issues of pattern, structure, and organisation. It provides an information-theoretic method for finding optimal causal models of stochastic processes. In essence, it shows, from either empirical data or a probabilistic description of behaviour, how to infer a model of the hidden process that generated the observed behaviour [2]. The approach is based on the concept of causal states. Its basics are introduced here according to Ref. [2].
3.1. Causal states and ε-machines

Computational mechanics is concerned with symbolic dynamics: signals of discrete symbols assigned to discrete time steps. Consider a symbolic sequence $\overleftrightarrow{S} = \ldots S_{-1} S_0 S_1 S_2 \ldots$ consisting of random variables $S_i$, where each $S_i$ may take a symbol $s_i$ drawn from an alphabet $\mathcal{A}$, a finite set of size $k$. At any time $t$, the sequence can be divided into a past $\overleftarrow{S}$ and a future $\overrightarrow{S}$. A causal state is then defined as a set of pasts that have the same distribution of conditional probabilities over all possible futures. Formally, causal states are members of the range of the function $\epsilon$, which maps from histories to sets of histories:

$\epsilon(\overleftarrow{s}) = \{\, \overleftarrow{s}\,' \mid P(\overrightarrow{S} = \overrightarrow{s} \mid \overleftarrow{S} = \overleftarrow{s}) = P(\overrightarrow{S} = \overrightarrow{s} \mid \overleftarrow{S} = \overleftarrow{s}\,') \;\text{for all}\; \overrightarrow{s} \in \overrightarrow{S},\ \overleftarrow{s}\,' \in \overleftarrow{S} \,\}$   (1)

Each causal state $S_i$ is defined by its index $i$, a set of pasts $\{\overleftarrow{s} \in S_i\}$, and a conditional distribution over futures $P(\overrightarrow{S} \mid \overleftarrow{s})$, $\overleftarrow{s} \in S_i$. For example, imagine a deterministic process that generates alternating zeroes and ones: ...010101... For this process, all pasts ending with a 0 have the same distribution of futures, and the same holds for all pasts ending with a 1. The process therefore has two causal states, the first containing the pasts 0, 10, 010, ..., and the second the pasts 1, 01, 101, ...

Causal states are connected through transitions, which specify the probabilities of generating a symbol $s_i$ when making the transition between two states. In the example process, the first causal state produces a 1 with probability P(1) = 1, and the second causal state produces a 0, also with probability P(0) = 1.

Causal states are maximally accurate predictors, and their state-to-state transitions are minimally stochastic. Together with the transitions, they form 'ε-machines', which represent the computational model underlying the given symbolic sequence. ε-machines are unique and maximally efficient models [2].

3.2. Statistical complexity and efficiency of prediction

Statistical complexity Cμ is defined as the Shannon entropy over the distribution of causal states, where $P(S_i)$ denotes the probability of the ε-machine being in state $S_i$:

$C_\mu = -\sum_{S_i \in \mathcal{S}} P(S_i) \log_2 P(S_i)$   (2)

Statistical complexity Cμ is the average amount of historical memory stored in the process, in units of bits. In a complex process, more information about the past is stored internally; prediction therefore requires more information and is, in turn, more difficult. Statistical complexity takes low values for sequences generated by constant and by purely random processes. It is argued that, in doing so, it coincides with the intuitive notion of complexity [12].

An interesting property of statistical complexity is that it is an upper bound on the excess entropy E, which is the mutual information between the process's past and future:

$E = I(\overleftarrow{S}; \overrightarrow{S}) \leq C_\mu$   (3)

This allows for another interpretation of statistical complexity: the memory needed to perform an optimal prediction is greater than or equal to the amount of information that the past provides about the future.

Like all quantities in information theory, statistical complexity depends on the level of observation, i.e. on the coarse-graining on both structural and temporal scales. In computational mechanics, a metric is proposed that helps guide the selection of the appropriate observation level. This metric is termed the 'efficiency of prediction' e and is calculated as the ratio between excess entropy and statistical complexity:

$e = E / C_\mu$   (4)

It can be interpreted as the fraction of the historical memory stored in the process which is useful in telling us about the future. Given two possible observation levels, the one with the higher efficiency of prediction should be chosen.

The computational mechanics approach has a number of features that make it suitable for complexity analysis of manufacturing systems. It is prediction-centred, which is itself essential for decision making. It can be used without an underlying model, which is beneficial, as accurate models of complex systems, especially ones where human subjects are involved, are hard to create. Finally, computational mechanics deals with signals in time and is thus concerned with operational complexity, although results from a simulation scenario have shown that structural complexity can sometimes be inferred from the operational one [13].

To apply computational mechanics to manufacturing systems, a method is needed. A complexity assessment method is presented in the following section.

4. Complexity assessment method

The presented method shows how statistical complexity can be calculated from observations of manufacturing processes. As statistical complexity measures the amount of pattern in a time series, the approach is limited to what is observed: causal relations that lead to the observations are described only implicitly, through the patterns those relations produce.

The method encompasses data acquisition, warehousing, symbolisation, and analysis. Data acquisition and warehousing are shown in Fig. 1. The inputs and outputs of the process (SX and SY) are observed in an event-based manner. The data is collected and stored for operational management and control, and the stored data can be used for further analyses. Database records must include timestamps of the events to allow for the subsequent generation of symbolic sequences.

In general, the database stores information about inputs and outputs. The process states, such as the operation state and process parameters, can manifest as either, depending on the individual parameter. Either inputs or outputs can be used as the basis for complexity analysis, depending on the interests of the analyst. The stored data is used to reconstruct ε-machines and calculate statistical complexity (Eq. (2)). This is done through data symbolisation and analysis of the generated symbolic sequence, as presented in Fig. 2. Two parameters must be determined for data symbolisation: a time interval Δt and an encoding function.
Fig. 1. Data acquisition and warehousing.
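As an illustration of Eqs. (1) and (2), causal states can be estimated naively from data by grouping pasts of a fixed length that share the same empirical next-symbol distribution; statistical complexity is then the Shannon entropy of the state occupation. The sketch below (Python, synthetic data) uses exact equality of rounded distributions in place of a statistical test, so it is a toy estimator rather than the CSSR algorithm used later in the paper:

```python
import math
from collections import defaultdict

def causal_states(seq, L=1):
    """Group length-L pasts by their empirical next-symbol distribution (Eq. (1)).
    Exact equality of rounded distributions stands in for a proper statistical test."""
    futures = defaultdict(lambda: defaultdict(int))
    for i in range(L, len(seq)):
        futures[seq[i - L:i]][seq[i]] += 1
    states = defaultdict(list)
    for past, counts in futures.items():
        total = sum(counts.values())
        dist = tuple(sorted((sym, round(c / total, 3)) for sym, c in counts.items()))
        states[dist].append(past)
    return list(states.values())

def statistical_complexity(seq, states, L=1):
    """Shannon entropy over the causal-state occupation, Eq. (2)."""
    state_of = {p: i for i, group in enumerate(states) for p in group}
    occupation = defaultdict(int)
    for i in range(L, len(seq)):
        occupation[state_of[seq[i - L:i]]] += 1
    n = sum(occupation.values())
    return -sum(c / n * math.log2(c / n) for c in occupation.values())

# The paper's example: the alternating process ...010101...
seq = "01" * 500
states = causal_states(seq)
print(len(states))                                    # 2 states: past ends in 0 or 1
print(round(statistical_complexity(seq, states), 3))  # C_mu ~ 1.0 bit
```

A constant sequence yields a single causal state and 0 bits, matching the claim that both constant and purely random processes have low statistical complexity.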
The time interval determines the temporal level of observation; the number of events during the interval is the basis for the encoding. The selection of a suitable Δt is not trivial, as different levels of observation yield different symbolic sequences which, in turn, yield different complexity results. To allow for an objective selection of Δt, the efficiency of prediction must therefore be calculated as per Eq. (4). This can only be done after the symbolic sequence is produced, in the analysis step. Symbolisation and analysis thus form a feedback loop in which the appropriate time interval is determined by analysis and then used for symbolisation in an iterative fashion, as shown in Fig. 2.

During the analysis, the symbolic sequence is transformed into an ε-machine that describes it. ε-machines are generated using the CSSR algorithm [14], which takes two parameters: the maximum history length Lmax and the significance level α of a statistical test that determines whether new causal states must be generated. The parameters are discussed in [14]. Statistical complexity and the efficiency of prediction are then calculated. Data from the database is used to produce metadata that describes the ε-machine and the symbolic sequence; this is important for the interpretation of the results. To provide new insight into the nature of production and to contribute to a better understanding of complexity, the results must be interpreted by an expert who knows the analysed manufacturing system.
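The symbolisation step and the Δt feedback loop can be sketched as follows. The timestamps, the candidate Δt values, and the length-1 estimates of E and Cμ are all illustrative assumptions; the paper's actual analysis uses CSSR-reconstructed ε-machines rather than these single-symbol proxies:

```python
import math
from collections import Counter

def symbolise(timestamps, dt, t_start, t_end):
    """Encode event timestamps (seconds) as per-interval event counts."""
    n_bins = int((t_end - t_start) // dt)
    counts = Counter(int((t - t_start) // dt)
                     for t in timestamps if t_start <= t < t_start + n_bins * dt)
    return [counts.get(b, 0) for b in range(n_bins)]

def entropy(counter):
    n = sum(counter.values())
    return -sum(c / n * math.log2(c / n) for c in counter.values())

def efficiency(seq):
    """Crude length-1 estimate of e = E / C_mu (Eq. (4)): mutual information
    between consecutive symbols over the entropy of the current symbol."""
    E = (entropy(Counter(seq[:-1])) + entropy(Counter(seq[1:]))
         - entropy(Counter(zip(seq, seq[1:]))))
    c_mu = entropy(Counter(seq[:-1]))
    return E / c_mu if c_mu > 0 else 0.0

# hypothetical cycle-completion events: one piece per minute for 10 min, then idle
events = [60.0 * k for k in range(1, 11)]
for dt in (240.0, 480.0):
    seq = symbolise(events, dt, 0.0, 1920.0)
    print(dt, seq, round(efficiency(seq), 2))
```

In the feedback loop, the candidate Δt with the higher efficiency of prediction would be kept and used for the final symbolisation.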
Fig. 3. Relation between complexity and throughput (a) and (b); e-machine for an example lot (c).
Fig. 2. Symbolisation and analysis steps of the method.

5. Case study

The method is applied in a case study on real industrial data. The production is serial, running 24 h a day in three shifts. Data for five work centres (WC) was acquired over a period of 24 months, from October 2007 to October 2009. Four of the centres are identical, while the fifth is slightly different; nevertheless, all the centres are interchangeable with regard to the components they can manufacture. During the period, 350,000 pieces were produced on average per work centre. Production made use of a total of 75 different tools.

The acquired data contains information about each process cycle, including the WC, the tool, the time of production, and several other process parameters. The data is analysed to assess the complexity of the outputs. The inputs, which correspond to work orders, are assumed to mandate continuous production of outputs within a batch. This assumption is the main reason for the chosen encoding, which is constructed as follows. In all cases the chosen time interval Δt is 240 s, and Lmax for the CSSR algorithm is set to 6; these values were suggested by an analysis of the efficiency of prediction for different time intervals. The number of pieces produced during the time interval (between 0 and 5) is the basis for the encoding.

Let us look into some characteristic results of the complexity assessment. The results in Fig. 3 show the dependence between complexity Cμ and throughput in relation to WC, tool, year of production, and lot. Here and throughout, the term 'lot' is used to describe a period of continuous work without interruption for the duration of a work shift (8 h) and within the same batch (same tool); lots are used because complexity drops to 0 bits when nothing is being produced. Graphs are shown for two cases where a tool was used with different WCs at different times. Each marker represents a lot, different marker shapes represent different work centres, and different colours represent different years (see Fig. 3).

The calculated complexity is concentrated around 1 bit, meaning that only the simplest of patterns is discovered: whether the work centre is producing or not producing. The throughput (the number of pieces produced within a time interval) is completely stochastic. Fig. 3(c) shows a typical ε-machine for this scenario: state 0 corresponds to no production, while state 1 corresponds to production. Complexity takes values higher than 1 bit where other patterns are discovered. For example, a complexity of 1.5 bits typically means that there are three distinctive states: not producing, producing in small quantities, and producing in large quantities. Still higher complexities are usually obtained when there are several periods of time alternating between production and no production; this is where complex patterns appear.

The graphs also show a distinct relation between complexity and throughput. Complexity is 0 bits when nothing is being produced, and likewise 0 bits when something is continuously being produced at the maximum possible rate, as shown by the dashed arcs in Fig. 3(a) and (b). The colours in Fig. 3(a) show that complexity is 1 bit for the vast majority of lots from 2009. As shown by a preliminary analysis of the data [13], this may be the result of a significant decline in orders, itself a consequence of the global recession. A comparison of the graphs in Fig. 3(a) and (b) shows that there is a significant difference in the share of highly complex lots.
On the other hand, there are no significant differences between the symbols within each graph. This means that the tool used influences complexity while the work centre does not. Fig. 4 shows complexity of outputs of WC 1 as a function of time, starting from the beginning of data acquisition for the work centre. The complexity is calculated for a time window of 256 sequential time intervals. The figure clearly shows that complexity levels are influenced by the tool used, being 1 bit for tools A, E, and F, 1.5 bit for tool C, and higher for tools B and D.
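A complexity-over-time profile like the one in Fig. 4 can be produced by sliding a window over the symbolic sequence and reconstructing a model per window. The sketch below uses a synthetic sequence and a naive length-1 causal-state estimate in place of the CSSR reconstruction; the window length of 256 intervals is the only detail taken from the paper:

```python
import math
from collections import Counter, defaultdict

def window_complexity(seq, L=1):
    """Entropy over causal states found by grouping length-L pasts with the
    same (rounded) empirical next-symbol distribution -- a naive stand-in
    for the CSSR-based reconstruction used in the paper."""
    futures = defaultdict(Counter)
    for i in range(L, len(seq)):
        futures[tuple(seq[i - L:i])][seq[i]] += 1
    state_of, states = {}, {}
    for past, cnt in futures.items():
        total = sum(cnt.values())
        key = tuple(sorted((s, round(c / total, 2)) for s, c in cnt.items()))
        state_of[past] = states.setdefault(key, len(states))
    occ = Counter(state_of[tuple(seq[i - L:i])] for i in range(L, len(seq)))
    n = sum(occ.values())
    return -sum(v / n * math.log2(v / n) for v in occ.values())

# synthetic shift: alternating production first, then a standstill
seq = [1, 0] * 256 + [0] * 512
profile = [round(window_complexity(seq[i:i + 256]), 2)
           for i in range(0, len(seq) - 255, 128)]
print(profile)   # starts near 1.0 bit, falls to 0.0 during the standstill
```

Plotting such a profile against time, labelled by the tool in use, reproduces the kind of comparison made in Fig. 4.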
Fig. 4. Relation between complexity and time for WC 1.

6. Conclusion and discussion

The computational mechanics approach towards complexity in manufacturing systems is based on the hypothesis that the harder a process is to predict, the more complex it is; this sets the approach apart from the other, entropic approaches. The developed method for complexity assessment in manufacturing systems not only quantifies complexity in a meaningful way, but also elaborates on why a process is complex, through analysis of the generated ε-machine.

The results of the case study show that the prevalent factor in operational complexity is the tool used. This implies that the tool geometry and the consequent process knowledge and management are important factors and should be targeted in order to reduce complexity. As seen from the relation between complexity and throughput, complexity should be lowered, but not in a way that would decrease the throughput. An important result is that a complexity of 1 bit or less can be achieved, signifying that the best predictive description in this type of production is statistical. This can increase management's confidence in statistical forecasts. In contrast to the tool, the work centre was not found to be a factor in this case. As the work centres were identical, this implies that the effect of different operators is best described statistically and thus has no direct influence on complexity.

The presented method is used to assess operational complexity, i.e. the complexity of the temporal behaviour of the observed process. As a further research step, an analysis of the connection between structural and operational complexity needs to be performed, focusing on the influence of tool geometry and process management on complexity.

Acknowledgements

This work was partially supported by the Ministry of Higher Education, Science and Technology of the Republic of Slovenia, Grant No. 1000-08-310127, and by the Slovenian Research Agency, Grant No. P2-0270.

References
[1] Wiendahl H, Scholtissek P (1994) Management and Control of Complexity in Manufacturing. CIRP Annals – Manufacturing Technology 43(2):533–540.
[2] Shalizi CR, Crutchfield JP (2001) Computational Mechanics: Pattern and Prediction, Structure and Simplicity. Journal of Statistical Physics 104(3–4):817–879.
[3] Hon K (2005) Performance and Evaluation of Manufacturing Systems. CIRP Annals – Manufacturing Technology 54(2):139–154.
[4] Peklenik J (1995) Complexity in Manufacturing Systems. CIRP Journal of Manufacturing Systems 24(1):17–25.
[5] Shannon CE, Weaver W (1949) The Mathematical Theory of Communication. University of Illinois Press.
[6] ElMaraghy W, Urbanic R (2003) Modelling of Manufacturing Systems Complexity. CIRP Annals – Manufacturing Technology 52(1):363–366.
[7] Deshmukh AV, Talavage JJ, Barash MM (1998) Complexity in Manufacturing Systems, Part 1: Analysis of Static Complexity. IIE Transactions 30(7):645–655.
[8] Frizelle G, Woodcock E (1995) Measuring Complexity as an Aid to Developing Operational Strategy. International Journal of Operations & Production Management 15(5):26–39.
[9] Suh NP (2005) Complexity in Engineering. CIRP Annals – Manufacturing Technology 54(2):46–63.
[10] Sivadasan S, Efstathiou J, Calinescu A, Huatuco LH (2006) Advances on Measuring the Operational Complexity of Supplier–Customer Systems. European Journal of Operational Research 171(1):208–226.
[11] Papakostas N, Efthymiou K, Mourtzis D, Chryssolouris G (2009) Modelling the Complexity of Manufacturing Systems Using Nonlinear Dynamics Approaches. CIRP Annals – Manufacturing Technology 58(1):437–440.
[12] Prokopenko M, Boschetti F, Ryan AJ (2009) An Information-theoretic Primer on Complexity, Self-organization, and Emergence. Complexity 15(1):11–28.
[13] Vrabič R, Butala P, Assessing Operational Complexity of Manufacturing Systems Based on Statistical Complexity. International Journal of Production Research, in press. doi:10.1080/00207543.2011.575098.
[14] Shalizi CR, Shalizi KL (2004) Blind Construction of Optimal Nonlinear Recursive Predictors for Discrete Sequences. Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence UAI 2004, 504–511.