Int. J. Production Economics 122 (2009) 403–418

Taking operations strategy into practice: Developing a process for defining priorities and performance measures

Edson Pinheiro de Lima, Sérgio Eduardo Gouvêa da Costa, Avides Reis de Faria
Industrial and Systems Engineering Graduate Program, Pontifical Catholic University of Paraná, Imaculada Conceição Street 1155, 80215-901 Curitiba, Brazil
Available online 21 June 2009

Abstract
Enterprises' operations systems and environments, characterized by their complexity and dynamics, are challenging operations strategic management models. The study presented in this paper develops a process to integrate operations strategy content with operations performance measurement system design. Essentially, the developed methodology is based on the Process Approach (Cambridge Approach), a technique that systematizes procedures for generating a set of performance measures coherent with operations strategy objectives and that also produces a consistent strategy implementation process. To illustrate the development and the application of the proposed design methodology, findings from two case studies of telecom engineering services companies are used. Results are discussed with a focus on testing the proposed methodology in terms of its feasibility, usability, and utility. A refined process, organized in phases, steps, and procedures, is the final result of the presented study.
Keywords: Strategic management; Operations strategy; Action plan; Performance measurement
1. Introduction

Enterprises' operations systems and environments, characterized by their complexity and dynamics, are challenging operations strategic management models. New operations systems design requirements are compelling companies to engage in broad and in-depth change processes. The operations system (re)design covers organizational and management processes; in particular, organizations are paying closer attention to the changing nature of operations systems performance. Indeed, in the context of operations strategic management systems, the performance measurement subsystems, processes, and measures used to assess enterprise performance are the main focus of (re)design projects (Brown and Fai, 2006; Neely et al., 2005; Gomes et al., 2004; Munive-Hernandez et al., 2004; Kaplan and Norton, 1992).
Enterprises are promoting several changes in their business systems and processes in order to develop more integrated and responsive operations (Henry, 2006; Chen, 2005). Some of the redesign initiatives are being conducted in the strategic domain, oriented towards developing a strategic fit between operations strategy, represented by its decision areas and performance dimensions, and production planning systems (Díaz Garrido et al., 2007; Olhager and Selldin, 2007; Olhager and Rudberg, 2002). These initiatives also deal with the alignment between operations, manufacturing, or service strategies and competitive strategies, using business performance as the measure of alignment (Amoako-Gyampah and Acquaah, 2008; Brown et al., 2007; Acur and Bititci, 2004; Melnyk et al., 2004; Pun, 2004; Joshi et al., 2003). These strategic design initiatives are the main context of the present paper and guide the whole research design. The main objective of this paper relates to operations strategic management system design, and the paper contributes to a better understanding of operations strategy design, implementation, use, and review. In particular, it shows how to generate action plans based on
operations strategy specifications and, building on these plans, how to design a set of performance measures. This paper presents a design methodology development process whose main purpose is to create an operational procedure for reviewing the operations strategic management system.

The main orientation of this research is based on Neely et al.'s (2005) propositions related to individual, systemic, and environmental aspects. When associated with individual performance measures, it addresses the question 'How can one ensure that the management loop is closed—that corrective action follows measurement?' When approaching the performance measurement system as an entity, it contributes to understanding what the 'definitive' principles of performance measurement system design are, and to identifying what techniques managers can use to reduce their list of 'possible' measures to a meaningful set. Studying issues associated with the system and its environment brings in questions such as 'Why do firms fail to integrate their performance measures into their strategic control systems?' and 'How can we ensure that the performance measurement system matches the firm's strategy and culture?'

It is assumed that the strategic management system, in its performance measurement aspect, is conceived to deploy enterprise strategic performance management rather than performance measurement by itself; to develop dynamic rather than static strategic management systems; and to enhance performance measurement systems' flexibility and improve their capability to cope with organizational changes (Neely, 2005). It is important to point out that the performance measurement system is an 'amalgam' that integrates the strategic management system and guarantees the development of its continuous improvement and learning processes (Neely, 2005; Folan and Browne, 2005). The strategic management system organizes its main function through a set of subsystems: strategic and planning subsystems and a performance measurement subsystem. This system is the object of analysis; the design process, and the approach used to conceive, manage, and operationalize its design task, is based on the Process (Cambridge) Approach developed by Platts (1993). The design process is presented in its entire set of phases and activities, and information from two telecom engineering services companies' case studies is used to illustrate its application. The main focus of this paper is the study of action plans formulation and their required performance measures.

2. Operations strategic management system

This section presents the concepts used in building the operations strategic management system—OSMS—theoretical framework. This theoretical construction is used as a structural framework that guides the procedural framework development (Folan and Browne, 2005).

2.1. Strategic management systems

The management logic of a strategic control system was developed early on, when performance measurement systems were introduced.
The measurement system is part of a wider system, which includes goal setting, feedback, and reward functions (Neely et al., 2005). It is important to formally declare some theoretical assumptions that support the operations strategic management system design:
- Mintzberg (1978) argues that only through a consistent pattern of actions can a strategy be identified; in fact, the strategy exists only if it is realized. It is assumed that there is an interplay between actions' results and their consistency that is established over time; the performance measurement system can mediate that interaction.
- According to Neely et al. (2005), performance measurement is the process of quantifying the efficiency and effectiveness of action, and a performance measurement system is the set of metrics used to quantify both the efficiency and the effectiveness of actions. Central to these definitions is that action leads to performance and that there are internal and external factors that affect the efficiency and effectiveness of this relationship.
- Performance measurement systems should be designed, implemented, and managed as part of a strategic management system. The measures should be derived from strategy and should provide consistency for decision making and action. In particular, the operations function will be managed in terms of its own strategic management system (Díaz Garrido et al., 2007; Neely et al., 2005; Olhager and Rudberg, 2002; Skinner, 1969).
- Strategic management control systems should be used as a means of providing surveillance, motivation, performance monitoring, learning stimulation, signalling, anticipation of events, introduction of constraints, and scenario management for the operations system. It is important to realize that the control function is defined by exploring complementary features of mechanistic and organic behaviour, i.e. not only reacting to and tracking strategy, but also reviewing the system design (Yeung et al., 2007; Henry, 2006; Neely et al., 2005).
- Performance measurement systems should be able to manage the determinants and results of operations systems' outputs, exploring the causalities between them and developing a predictive approach for the whole operations strategic management system (Tan and Platts, 2007; Lu and Botha, 2006; Tan and Platts, 2005; Kaplan and Norton, 1992; Fitzgerald et al., 1991; Keegan et al., 1989).
The present reality that organizations and their managers face in their operations strategic management systems is forcing them to review their assumptions about how to manage operations system performance. An integrated approach based on computational systems (the integrated enterprise), associated with the proliferation of total quality management and lean manufacturing practices, is creating real conditions for integrated performance measurement system implementation (Gomes et al., 2004).
A framework to organize the identified features and functions that an operations strategic management system should develop is proposed next.

2.2. Structural theoretical framework

One of the best known performance measurement frameworks is Kaplan and Norton's (1992) 'balanced scorecard'. The balanced scorecard provides, in the same system, a planning technique and a performance measurement framework. It can be classified as a strategic management framework, as it integrates the strategy mapping process with performance dimensions. The main role of this strategic management system is to create value that is perceived by customers, through the improvement and development of business processes. The balanced scorecard is based on 'innovation action research' and, through this approach, develops a methodology that integrates design, implementation, and operation of a strategic management system (Kaplan, 1998). Based on the evolution of performance measurement frameworks, particularly those founded on a strategic management logic, an expansion of the balanced approach towards a totally integrated one can be traced. There is some evidence of this evolution, or co-evolution, when the following frameworks are analyzed together:

- The performance measurement matrix integrates different dimensions of performance, employing the generic terms 'internal', 'external', 'cost', and 'non-cost'. It can be noted that the matrix extends the perspective to external factors (Keegan et al., 1989).
- The strategic measurement, analysis, and reporting technique—SMART—developed by Lynch and Cross (1991) proposes a performance pyramid, which uses a hierarchic structure to represent the integration between organizational vision and operations actions. There is an interplay between external and internal orientations to improve internal efficiency and external efficacy. Performance is qualified in terms of internal and external aspects.
- The performance measurement model proposed by Fitzgerald et al. (1991) integrates determinants and results of operations systems performance, exploring the causalities between them. Measures are related either to results (competitive position, financial performance) or to the determinants of those results (cost, quality, time, flexibility, and innovation). There is a distinction between 'means' and 'ends'.
- The integrated dynamic performance measurement system—IDPMS—conceived by Ghalayini et al. (1997) adds dynamic features and integrative properties to performance measurement systems. The integrative process involves the management function, process improvement teams, and factory shop floor activities. The management system creates a dynamic behaviour that articulates its specification and a reporting process.
- Dynamic features are also present in Neely et al.'s (2002) performance prism. They develop a scorecard-based system for measuring and managing stakeholder relationships. The framework is conceived to cover strategies, processes, capabilities, stakeholder satisfaction, and stakeholder contribution. The main objective of the strategic management system is to deliver stakeholder value.
Based on the strategic features identified in the studied performance management frameworks, and incorporating Mills et al.'s (2002) strategic logic, the framework presented in Fig. 1 is proposed. Using the structural framework assumptions for defining variables, causal links, and domain, a process for its implementation—the operations strategic management design process—can be developed.
3. The implementation process

The design methodology is conceived using the Process Approach technique. Initially, the design rationale that grounds the applied methodological approach is presented and some premises for the Process Approach development are defined. In effect, a procedural framework is developed.
Fig. 1. Operations strategic management framework: inputs feed the operations strategy formulation, action planning, and action implementation processes, whose outputs are monitored by the performance measurement system; a feedback control loop revises the operations strategy, action planning, and implementation processes.
Fig. 2. Evolutionary life cycle process (Neely, 2005): problem identification, theoretical investigation, proposed frameworks, methods of application, and empirical investigation.
3.1. Defining the methodological approach

This section presents the approach used to study the process rationale underlying the OSMS design. Frohlich and Dixon (2001) comment that the Operations Management (OM) field, particularly in strategy-related themes, has brought forward new ideas, but has been less effective in validating concepts after their introduction. Hence, the rationale underlying the OSMS process design must be related to its knowledge life cycle, and this can employ propositions from three different perspectives: system design, implementation processes, and the role that findings play in relating theory and practice.

The first perspective to be studied is related to the following question: 'How does the OM field build and refresh its knowledge basis?' To address this question, some rationales used in the Operations Management field for producing knowledge that is consolidated in theories, models, frameworks, and processes are presented. For this purpose, theoretical constructions developed by Neely (2005) and Slack et al. (2004) are used to illustrate the knowledge-development cycle. Slack et al. (2004) question whether OM research should in fact produce new ideas. They propose that the OM orientation should be to continually look for a reconciliation point between research and practice. They acknowledge that this is not a trivial task, but argue that OM's principal academic role is to 'conceptualise' practice and 'operationalize' theory. Therefore, OM would be better recognized not as a 'normal' functional management discipline but rather as a knowledge broker in the whole knowledge-producing process (Nonaka and Takeuchi, 1995). OM methods would provide an important contribution to the improvement of enterprises' operational and strategic activities. This assumes that the proposed research positions its main contribution in terms of research and practice reconciliation. The results, or 'design solutions', contribute to the development and testing of practical solutions for operations strategic management system design, implementation, and management. It is appropriate for the purposes of this paper to take the theoretical construction of Neely (2005), shown in Fig. 2, as a meta-framework. The presented discussion can be positioned in the evolutionary life cycle process.
In the early stages of Performance Management—PM—as a discipline, effort was put into identifying the main problems to be studied and solved, followed by a structuring activity based on the proposition of theoretical frameworks, which organized and directed the PM body of knowledge towards solving the identified problems. Based on the proposed frameworks, processes were developed to test them and to verify their robustness and correctness through empirical investigation. This interplay between analysis and synthesis allowed an evolution and consolidation of the theoretical body of the PM discipline. The cycle developed by Neely (2005) identifies a specific context that can be used to explain the approach adopted in this study for producing and testing operations strategic management system models and design methodologies.

The main logic that governs the OSMS can also be explained by the design and engineering of a general management system, as presented in Fig. 3 (Sousa and Groesbeck, 2004). The OSMS (re)design process should be linked to real operations systems demands and to theoretical constructions that were formulated based on previous work and experiences related to the continuous flow of OM knowledge production. Therefore, it should be recognized that the OM field is in continuous, complex, and dynamic evolution. Operations managers and professionals, the 'practitioners', face in their day-to-day decision processes situations that question their mental models, characterizing events that continually restart the redesign process (Slack et al., 2004; Zilbovicius, 1997).

The second perspective employed in the study explains how practical issues can be addressed in designing, implementing, and managing the OSMS. The Process Approach (Cambridge Approach) is used to build all the implementing activities, integrating design and management processes in a participative way (Platts et al., 1996; Platts, 1994, 1993). The approach, developed by Platts (1993), aims to develop a prescriptive procedure, operationalizing a set of concepts through a structured and participative process supported by data collection instruments, a project-based management structure, and evaluation criteria. Use of the Process Approach entails various advantages for OSMS development. The methods and techniques are characterized by the use of worksheets, which organize and document the process; workshops, which constitute the 'locus' for validating the filled worksheets and for communicating results to the various participants; and the facilitator role, defined for coordinating and conducting the whole process and for assuring the execution of all the proposed steps.
Fig. 3. System design and engineering logic: the business mission and business 'needs' (opportunities and challenges) feed the analysis, design (synthesis) produces design solutions, and implementation (realization) closes the cycle.
Table 1. Main characteristics of the Process Approach (adapted from Platts, 1994).

Procedure:
- The process is properly defined in terms of organization and operational procedures, with phases for information searching and scanning, information analysis, and identification of change and/or improvement opportunities.
- The applied techniques and tools should be simple enough to meet the requirements of the operational processes, and their use must be easily understood.
- The results of each phase of the project should be documented and reported.

Participation:
- Individual and team-based activities interrelate all the involved actors.
- The participative character increases enthusiasm, comprehension, and involvement.
- The participation 'spaces' can be run through workshops to achieve agreement on the objectives of the project, to identify and formally declare the main problems, to propose and develop improvement actions, and to create a locus for involvement and participation.
- The participative process creates a decision-making forum that guides the actions.

Project management:
- It is important to check that all the required resources are addressed and available.
- It is important to define a coordinator group, a support group, and the operational or executive group.
- The project plan and schedule should be produced through a participative and consensual process.
- It is a necessary condition for starting the project that the groups are fully involved and identified with their roles.
- The coordinator group, and especially its leader, must receive all the required support from the involved actors.

Point of entry:
- It is important to clearly define the scope, content, and intended results of the project.
- The start and development of the project should have the acknowledgement and agreement of the coordinator group.
Table 1 synthesises the main characteristics of the Process Approach implementation (Platts, 1994). The rationale underpinning the design process addresses the implementation and management processes, creating conditions for the development of a double-loop learning process.

Slack (2000) identifies three main phases in the process of redesigning a manufacturing system: structuring, suppositional, and assimilation activities. The structuring activity is used to construct, in social terms, the design objectives and a common sense of the options. Design options are defined in terms of performance trade-offs, constrained by the system's strategic context. The suppositional activity extends the common language developed to approach performance issues in the structuring activity into a scenario-creating process for design choices. This phase stimulates debate about the resources and capabilities needed and about design process trade-offs. The externalization process developed in the suppositional activity creates the right conditions for identifying knowledge gaps. At this point, an assimilation activity runs as a result of a learning process, which emerges in the suppositional phase and is consolidated in the assimilation phase by the identification of knowledge gaps. The three interrelated activities play a special role in integrating design, implementation, and management of an operations strategic management system.
Fig. 4. Model of the underlying design activity: the structuring activity relates the strategic and competitive context, process design options, and identification of trade-offs; the suppositional activity explores trade-off characteristics, process knowledge, process control, system flexibility, and resource capabilities; and the assimilation activity consolidates the knowledge gaps.
Fig. 4 shows the interrelated design activities proposed by Slack (2000). They follow an interactive knowledge creation process similar to that proposed by Nonaka and Takeuchi (1995): the structuring phase socializes and externalizes knowledge, the suppositional activity combines knowledge, and the assimilation phase internalizes the produced knowledge. The importance of knowledge creation in producing sustainable and reinforced learning processes is noteworthy.

A key research objective is to conceive a methodology for designing operations strategic management systems. The proposed design rationale could follow Slack's (2000) framework as an initial prescription and then employ management and implementation procedures using the Process Approach technique. The presented frameworks were selected to provide specific features for operations strategic management system design, implementation, and use processes, which can be summarized as follows:

- The system structurally establishes organizational learning capability as an important outcome of the design (Slack, 2000), implementation (Platts, 1993), and management processes.
- It develops an understanding of the dynamics of companies' operations processes, helping firms to develop a strategic vision based on dynamic capabilities (Slack, 2000; Teece et al., 1997).
- The learning processes and the enhanced knowledge basis may improve the perception of having the strategic management system under control. This confidence may in turn reinforce a continuous and virtuous cycle of learning and improvement (Slack, 2000).

The third perspective that guides the discussion presented in this paper is defined by theoretical assumptions. These assumptions act as design recommendations, informing the theoretical development and delimiting its scope as a strategic management system (Henry, 2006; Folan and Browne, 2005). The three perspectives are related to strategic management system design, implemented at the operations function level. The design approach is based on a practice versus theory reconciliation logic (Slack et al., 2004), using a process that continuously interplays empirical and theoretical assumptions (Neely, 2005). The practical application is set by the operational and management processes developed by Slack (2000) and Platts (1993), respectively. Once the methodological approach for OSMS design is presented, a process for its implementation can be developed. This process is worked out within the second and third perspectives, as it develops procedures using the Process Approach technique and observes specific design recommendations.

3.2. The implementation process

The Process Approach is essentially founded on an action research technique. It organizes the action research implementation and structures knowledge creation for action.
Table 2. Process phases (source: authors).
- Phase 1 (workshop WSH1; worksheets WS 1–WS 5): Service groups or families are organized and named. Market standards are declared as references for the identified service groups. The present service families' performance is evaluated. Service groups are analyzed through competitive criteria, using market references, in order to identify the main problems. Problems are declared as competitive criteria gaps.
- Phase 2 (WSH2; WS 6–WS 10): An exercise to identify opportunities and threats is conducted. The strategic business objectives are related to service groups. Strategy is studied in order to establish a common comprehension of the strategic context. The running actions of each service family are identified and their contributions to the competitive criteria are evaluated, using the categories proposed by the decision areas framework. Gaps can then be clearly defined based on omissions or low levels of contribution.
- Phase 3 (WSH3; WS 11–WS 13): The gaps identified in Phases 1 and 2 guide the new actions formulation process. The proposed new actions are studied in terms of coherence and trade-offs. The new actions are evaluated against present actions, analyzing the consistency and coherence of the whole set, using the competitive criteria and decision areas frameworks simultaneously.
- Phase 4 (WSH4; WS 14–WS 16): Performance indicators are reviewed to follow the development of the new actions. Actions are detailed. Performance measures are (re)designed.
Fig. 5. Logic structure of the proposed process:
- Phase 1 (WSH-1): WS-1 operations function analysis; WS-2 problem identification against market standards; WS-3 problem identification against operations performance; WS-4 problems x competitive criteria; WS-5 gaps (competitive criteria).
- Phase 2 (WSH-2): WS-6 opportunities x threats analysis; WS-7 ordering and priorities of objectives; WS-8 open strategy questions; WS-9 decision areas x competitive criteria (running practices, actions, routines, and procedures); WS-10 gaps (decision areas: low scores, absence of practices, actions, routines, or procedures).
- Phase 3 (WSH-3): WS-11 new actions proposition focused on competitive criteria and decision areas gaps; WS-12 new actions relationship matrix (exclusion analysis of trade-offs); WS-13 decision areas x competitive criteria consistency and coherence analysis (running actions x new actions).
- Phase 4 (WSH-4): WS-14 validated new actions x operational performance indicators; WS-15 new actions operational planning (detailed description); WS-16 performance measurement system redesign (revision and update), leading to new tasks for the operations function and PMS improvement.
Coughlan and Coghlan (2002) argue that the intended results of action research projects are related not only to the solutions developed for the addressed research problem, but also to the learning process that is established while developing those solutions.
The processes for reviewing operations strategy formulation and the related performance indicators are organized in four main phases, and these phases are sequenced in specific steps. Most of the developed procedures, which are the practical instruments for implementing the steps, are designed as worksheets—WSs. Sixteen worksheets are proposed and tested. Data and information consolidation for each of the four phases is obtained through workshops—WSHs. Table 2 describes the proposed phases; each phase corresponds to a specific workshop that is supported by worksheets. The process is designed following the Process Approach technique recommendations (Platts, 1994). The logical structure of the process implementation can be visualized in Fig. 5.

It is important to observe that the whole process logic is developed from external and internal analysis: externally, it is founded on the identification of opportunities and threats and on competitive objectives; internally, it is defined by decision areas that indirectly represent the operations system's strengths and weaknesses. The analysis that generates information for gaps identification is based on the study of priorities and strategic choices (Pun, 2005; Silveira, 2005). The participative process is obtained through workshops that produce a synthesis for each of the proposed phases, consolidating and validating the results. The participative essence of the Process Approach, the mobilization provided by the action research strategy, and the knowledge socialization and externalization processes are clearly developed in the workshop activities, as shown in Table 3 (Coughlan and Coghlan, 2002; Nonaka and Takeuchi, 1995; Platts, 1993; Argyris, 1993).

In practical terms, the people involved in the research (coordination) group individually fill in the required forms for discussing results in the workshops. A facilitator animates the workshops in order to socialize knowledge about the studied company's processes and activities; externalize the required worksheet information and reach consensus on its statements; combine the results from previous phases, guaranteeing consistency along the process; and generate collective plans for the developed concepts, a basis for learning and internalization. After each of the four workshops, the procedures applied in Phases 1–4 are evaluated in terms of feasibility, utility, and usability, providing an assessment of the proposed methodology.
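For readers who wish to instrument the process, the following minimal sketch represents the phase/workshop/worksheet structure summarized in Table 2; the Python representation, class, and field names are our own illustration, not part of the published methodology.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Phase:
    """One phase of the redesign process: a workshop supported by worksheets."""
    number: int
    workshop: str            # consolidating workshop (WSH1..WSH4)
    worksheets: List[str]    # supporting worksheets (WS 1..WS 16)
    objective: str

# The four phases as summarized in Table 2 (objectives abridged).
PROCESS = [
    Phase(1, "WSH1", [f"WS {i}" for i in range(1, 6)],
          "Organize service families, compare with market standards, declare competitive criteria gaps"),
    Phase(2, "WSH2", [f"WS {i}" for i in range(6, 11)],
          "Relate strategy to service groups, evaluate running actions per decision area, declare decision areas gaps"),
    Phase(3, "WSH3", [f"WS {i}" for i in range(11, 14)],
          "Formulate new actions from the gaps, test coherence and trade-offs"),
    Phase(4, "WSH4", [f"WS {i}" for i in range(14, 17)],
          "Detail actions and (re)design the performance measurement system"),
]

# Each workshop is assessed on the three test criteria used in the paper.
EVALUATION_CRITERIA = ("feasibility", "utility", "usability")

if __name__ == "__main__":
    for phase in PROCESS:
        print(f"Phase {phase.number} ({phase.workshop}): {', '.join(phase.worksheets)}")
```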
It is important to comment that the point of entry is defined by the construction of a project agenda, defining the involved actors. Formally, the senior executives declare their support for the initiative in this phase. The elements presented here are a brief summary of the whole set of procedures; some specific steps are shown in detail as the results of the case studies are presented.
4. Process testing

The following illustrative results are obtained from two case studies. The case studies were developed in telecommunication industry companies that provide engineering-based services. The first case study—the XCom company—was developed as a pilot case study to test the operational procedures. A few modifications were introduced, and the results of the first case were used as part of the refinement process. XCom's services are based on telecom laboratory tests and hardware maintenance. It has approximately 100 employees and a net income of US$3 million per year. The case study took 50 hours, organized in 10 meetings with an average duration of 5 hours, involving 12 people in the research/coordination group. Professionals from the marketing, finance, product development, infrastructure, operations management, quality management, and business strategy functions were involved in the process development.

The second case study—the YCom company—contributed effectively to the refinement process. YCom's services concern commissioned telecom installation services and infrastructure development. It has more than 100 employees and a net income of US$4 million per year. The case study took 36 hours, organized in nine meetings of approximately 4 hours each, involving seven people in the research/coordination group. Professionals from the finance, infrastructure, operations management, transmission services, quality management, and business strategy functions were involved in the process application. Both companies are certified to ISO 9001:2000, and both are also reviewing and formalizing their strategic deployment processes. The processes developed by the companies covered all four proposed workshops.
Table 3. Workshops organization (source: authors).
- WSH1: The workshop developed in Phase 1 aims to develop a strategic role for the operations function, based on a market-based approach. Performance gaps are discussed and declared, using the competitive criteria as the main reference. Suggested participation: CEO, COO, operations management (middle managers and front/operational levels).
- WSH2: The operations strategy is studied in terms of its content and its capability to realize the competitive strategy. Implementation strategy gaps are identified, using the decision areas as the main reference. Suggested participation: CEO, COO, operations management (middle managers and front/operational levels).
- WSH3: The gaps identified through the competitive criteria and decision areas models ground the new actions formulation process. The proposed actions are tested, refined, and validated. Suggested participation: CEO, COO, operations management (middle managers and front/operational levels).
- WSH4: The refined set of new actions orients the strategic operations planning development and the performance measurement system review. The final result is the (re)design of the performance measurement system. Suggested participation: operations management (middle managers and front/operational levels).
Some results, selected from both cases, are shown in the following steps. The intention is not to show the whole research in detail, but to illustrate the main results and characteristics of the tested process.

4.1. Phase 1—competitive gaps

Worksheet WS 4 was very useful for consolidating and representing the findings of Phase 1, because it clearly defines the competitive gaps. Fig. 6 illustrates its application to the XCom and YCom cases, whose service (product) families are a hardware maintenance service and a commissioned telecom installation service, respectively.
The developed operational procedures are adapted from the processes developed by Platts et al. (1998). The gaps are defined and declared using the Worksheet WS 5 procedures. Table 4 illustrates these procedures using data from Worksheet WS 4 applied to the XCom and YCom cases. The concurrent gap analysis, based on external and internal perspectives, generates statements that play the role of design recommendations (Folan and Browne, 2005). The main result of Phase 1 is a set of statements that defines the competitive criteria gaps. Phase 2 develops a similar discussion with an internal focus, particularly defined by the decision areas.
Fig. 6. WS 4: the competitive criteria gaps for (a) XCom and (b) YCom. For each competitive criterion (delivery speed, service environment/client interface, consistency, competence, flexibility, credibility/safe image, price/cost, tangibles, and access), present performance and the market requirement are rated on a scale from -2 to +2 between anchors such as low versus high delivery speed, low customer interaction versus high customer contact, low skills versus high expertise, high versus low risk level, cost versus differentiation competition, and low versus high quality contact interfaces. Gaps G1–G3 (XCom) and G1–G4 (YCom) mark the criteria where present performance falls short of the market requirement.
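As an illustration of the WS 4/WS 5 logic, the sketch below derives competitive criteria gaps from present-performance and market-requirement ratings on the -2 to +2 scale of Fig. 6. The criterion names echo the cases, but the function, threshold, and sample ratings are illustrative assumptions rather than part of the published worksheets.

```python
def competitive_gaps(present, required, threshold=1):
    """Return criteria whose market requirement exceeds present performance.

    present, required: dicts mapping criterion name -> rating on the -2..+2 scale.
    threshold: minimum shortfall (required - present) for a gap to be declared.
    """
    gaps = {}
    for criterion, target in required.items():
        shortfall = target - present.get(criterion, 0)
        if shortfall >= threshold:
            gaps[criterion] = shortfall
    # Largest shortfalls first, mirroring the prioritization discussed in the workshops.
    return dict(sorted(gaps.items(), key=lambda item: -item[1]))

# Hypothetical ratings in the spirit of the XCom panel of Fig. 6 (not the published data).
present = {"delivery speed": -1, "competence": 0, "price/cost": -1, "flexibility": 1}
required = {"delivery speed": 2, "competence": 2, "price/cost": 1, "flexibility": 1}
print(competitive_gaps(present, required))
# e.g. {'delivery speed': 3, 'competence': 2, 'price/cost': 2}
```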
Table 4. WS 5: competitive gaps statements (source: authors).

(a) XCom
- Delivery speed (Gap G1): field work urgency; high-speed demands; spare parts availability.
- Competence (Gap G3): few strategic partners, resulting in low levels of relationship competences; inadequate infrastructure (equipment and expertise); demands for training in the new technologies; non-existence of technological monitoring and benchmarking.
- Credibility/safe image (Gap G2): high internal costs; proposed service prices not in accordance with market standards; low bargaining power with clients; low productivity levels.

(b) YCom
- Consistency (Gap G2): customer service processes improvement required (companies and individual customers); delivery time improvement required; field equipment installation quality improvement required (meet specifications).
- Service environment/client interface (Gap G3): customer synergy improvement required; lack of project team integration (external parties and employees); customer service feedback improvement required; customer satisfaction surveys must be optimized.
- Cost (Gap G1): tools and equipment use and maintenance; vehicle use and maintenance.
- Tangibles (Gap G4): low profit margins; unstable service demand and facilities utilization rates; high taxes; supplies prices.
4.2. Phase 2—decision areas gaps

In Phase 2 the operations strategy implementation is studied. Table 5 shows the gaps identification procedure, which builds on the analysis of the relationship between 'decision areas' and 'competitive criteria'. The procedures were developed taking into account the instruments developed by Platts et al. (1998) and Platts and Gregory (1990). The complete list created in Worksheet WS 9 supports the formulation of gaps statements. Table 6 shows examples for the XCom and YCom cases. At this point, a resource-based focus applied to operations strategy formulation can be observed, as capability and competence gaps are identified in the decision areas (Grössler, 2007; Tranfield et al., 2004). These statements can be used as the capabilities required for an operations vision (adapted from Maslen and Platts' (2000) 'manufacturing vision' concept).

Phases 1 and 2 together form the basis for proposing new actions. Low scores, or the absence of present actions, are the main criteria that drive Phase 3. Intensive involvement of the OM team is expected in order to create a rich set of new propositions. Phase 3 provides a new operations strategy action plan.
4.3. Phase 3—new actions proposition

Table 7 shows the set of proposed new actions, which can be used as input to the process of generating design recommendations. The proposed actions are tested to verify their internal coherence and consistency. The developed procedure is a relationship matrix, an adaptation of the correlation matrix for qualitative analysis. An impact analysis is developed and the proposed actions are ranked based on importance-performance strategic criteria (Greasley, 2004; Slack and Lewis, 2002; Crowe and Rolfes, 1998). The cross-analysis of the proposed actions is based on a panel of professionals and experts convened in Workshop WSH3. A consensual process is used to generate the matrix, and the final selection of actions comprises those that effectively guide the redesign process (Faisal et al., 2006; Sage, 1977).
Table 5. WS 9: decision areas gaps identification, with running actions scored against the competitive criteria (source: authors). Scale: -2 high negative impact, -1 negative impact, 0 neutral, 1 positive impact, 2 high positive impact. Criteria order: consistency, competence, delivery speed, service environment/client interface, flexibility, credibility/safe image, tangibles, access, price/cost.

(a) XCom
- 11—Customer management, business meetings attending: 0, 0, 0, 2, 1, 1, 1, 1, 0.
- 11—Customer management, customer business meetings: 0, 1, 0, 1, 1, 1, 1, 1, 0.
- 11—Customer management, Internet portal development: 0, 1, 0, 1, 0, 1, 1, 1, 0.
- 12—Performance measurement, performance measurement system redesign: 0, 0, 0, 0, 0, 0, 0, 0, 0.

(b) YCom
- 1—Service design, sales department creation to support and improve the subscriber services: 2, 2, 1, 1, 1, 1, 0, 1, 1.
- 6—Organization, booking and service scheduling control functions centralization: 1, 2, 1, 1, 0, 2, 0, 1, 0.
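A small sketch of how the WS 9 scores can be screened for decision-area gaps (omissions or weak contributions) is given below; the scoring scale follows Table 5, but the aggregation rule, function name, and sample data are illustrative assumptions rather than the authors' procedure.

```python
def decision_area_gaps(impact, low_score=1):
    """Flag competitive criteria that no running action supports strongly.

    impact: dict mapping (decision_area, running_action) -> list of scores,
            one per competitive criterion, on the -2..+2 scale of Table 5.
    Returns one flag per criterion index: True when the best contribution
    across all running actions is below `low_score` (omission or weak support).
    """
    n_criteria = len(next(iter(impact.values())))
    best = [max(scores[i] for scores in impact.values()) for i in range(n_criteria)]
    return [score < low_score for score in best]

# Hypothetical excerpt in the spirit of the XCom rows of Table 5 (abridged criteria).
criteria = ["consistency", "competence", "delivery speed", "client interface"]
impact = {
    ("customer management", "business meetings"): [0, 0, 0, 2],
    ("performance measurement", "PMS redesign"):  [0, 0, 0, 0],
}
flags = decision_area_gaps(impact)
print([name for name, gap in zip(criteria, flags) if gap])
# -> ['consistency', 'competence', 'delivery speed']
```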
Table 6. WS 10: decision areas gaps statements (source: authors).

(a) XCom. Decision areas analyzed: service design, process technology, facilities, capacity, work force, organization, customer management, and the performance measurement system. Gaps statements: development of strategic alliances with key technology providers; acquisition of equipment and systems licenses for commercial purposes; process technology redesign to meet the clients' new demands; new buildings and facilities prepared for acting as an OEM service dealer; work force adjustment (impact on consistency, competence, delivery speed, service environment/client interface, and flexibility); no developed competence/expertise to meet the clients' new demands; organizational redesign needed to be coherent with the delivery speed and flexibility competitive criteria; the need to develop managerial controls and systems to support the OEM services; performance measurement system redesign.

(b) YCom. Decision areas analyzed: service design, process technology, quality management system, organization, information management systems, supply and materials management, performance measurement system, operations planning and control, and continuous improvement system. Gaps statements: creation of a sales department to support and improve the subscriber services (high impact on consistency and competence); quality management system revision (update and redesign); implantation of quality circles (all organizational units); centralization of the booking and service scheduling control functions (high impact on competence and credibility); decentralization of management processes (all organizational units); and acquisition of an ERP system (MICRO SIGA), which appears against several decision areas (overall impact).
Table 7. WS 11: new actions proposition (source: authors).

(a) XCom
- A1: Increase the strategic alliances, especially with companies that hold critical knowledge (intellectual property agreements).
- A2: Invest in new technologies, hire experts, and develop focused training; the investments cover technical field work and laboratory activities, and the technologies relate to WLL, optical systems, DMA, TDMA, GSM, and VoIP.
- A3: Develop a priority ranking process to identify 'urgency' service requirements.
- A4: Develop actions to improve client awareness about investing in spare modules, in order to improve their systems' reliability (XCom is able to invest in spare modules based on long-term contracts).
- A5: Increase the service capacity through independent professional experts and specialized small companies, and develop a trainee programme to prepare the professional partners.
- A6: Develop recruitment partnerships with technical schools, technical colleges, and universities.
- A7: Develop statistical analyses to identify bottlenecks and to improve operational efficiency (statistical process control).
- A8: Develop long-term contracts with professional experts and small companies, based on service demand (create a service network).

(b) YCom (actions A1–A9): operations process optimization through enterprise engineering design techniques; implantation of OEM services and their respective infrastructures; acquisition and implantation of an integrated enterprise system, integrating the commercial, purchasing, finance, accounting, human resources, fiscal, and operations functions; organizational integration between the commercial and the technical assistance and installation areas, which will use a common system to forecast service demand; processes and procedures to control and manage the end-user complaint index; strategies to increase the profitability of the whole business, reviewing the minimum margin of 10%; increased synergy with clients through social events and meetings; regular meetings to analyze data and information about performance results; improved effectiveness of the total quality management system; improved technical assistance service time (backlog reduction); and operational indicators to manage service delivery times and the quality of facilities and network improvements.
Fig. 7 presents the relationship matrix results for the XCom and YCom cases. An ordering process is started using the scores provided by the relationship matrix analysis. The refinement process also uses information about the running actions to test consistency and coherence with the newly proposed actions. Phase 4 transforms the proposed actions into an operations strategy plan and also sets the references for the performance measurement system review.
4.4. Phase 4—planning process

The resulting set of new actions is transformed into operational actions. The procedure used to generate the action plan is shown in Table 8. The '5W+1H' technique is a self-explanatory process that can be used to generate the first specifications of the action plan (Goh and Xie, 2004; Tam et al., 2001). Measures are then identified to follow the newly proposed actions, and the performance measurement system is reviewed based on the procedures established by Neely et al. (1997). Table 9 shows an example of an indicator description/design. The phases were evaluated by their users during their development and also at the end of the whole redesign process.
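To make the two record formats concrete, the sketch below represents a 5W+1H action description (as in Table 8) and a performance measure record in the spirit of Neely et al.'s (1997) record sheet (as in Table 9). The class and field names paraphrase the worksheets rather than reproduce them, and the sample content is abridged from the XCom case.

```python
from dataclasses import dataclass

@dataclass
class ActionPlan5W1H:
    """Action description fields in the spirit of WS 15 (Table 8)."""
    what: str
    why: str
    where: str
    when: str
    who: str
    how: str

@dataclass
class PerformanceMeasure:
    """Measure record fields in the spirit of WS 16 (Table 9), after Neely et al. (1997)."""
    name: str
    purpose: str
    relates_to: str
    target: str
    formula: str
    frequency: str
    who_measures: str
    data_source: str
    who_acts_on_data: str
    what_they_do: str

# Abridged from the XCom example of Tables 8 and 9.
a2 = ActionPlan5W1H(
    what="Invest in new technologies, hire experts, develop focused training",
    why="Update technological infrastructure and employee competencies",
    where="Laboratory units", when="April 2005",
    who="Hardware maintenance service manager",
    how="Specific projects financed by new revenues")

measure = PerformanceMeasure(
    name="Net income behaviour", purpose="Track certified test services",
    relates_to="Laboratory certification (by client)",
    target="Increase by 20% the economic results produced by tests",
    formula="income from certified test services / income from all tests",
    frequency="Monthly (cumulative)", who_measures="Laboratory administrative staff",
    data_source="Accounting department", who_acts_on_data="Sales manager and main board",
    what_they_do="Monitoring, analysis, and decision making")
```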
4.5. Evaluating the process

All phases are evaluated by the involved actors, using criteria based on feasibility, utility, and usability (Platts et al., 1996; Platts, 1994). Table 10 shows the consolidated process evaluation for the XCom and YCom cases. The overall index consolidates a satisfaction factor of around 80% (Good and Very Good scores combined) for all evaluation criteria. The process is intensive and time consuming; according to users' feedback in the respective workshops, the procedures of worksheets WS 8, WS 13, and WS 15 are suggested for exclusion.

A strategic learning process is intrinsically developed during the application of the procedures. Some components of the entire process can be seen from a knowledge operations strategy perspective, particularly defined by the application of the Process Approach technique and by the strategic management system structure (Yeung et al., 2007; Shaw and Edwards, 2006). The developed process embraces internal and external perspectives of operations strategy design and implementation, integrating a market-based approach with internal strategy deployment; the gaps identification process is based on those assumptions. This process should be seen from a continuous assessment perspective (Acur and Englyst, 2006; Englyst, 2003).
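The satisfaction factor quoted above can be reproduced from Table 10 with a simple aggregation. The sketch below assumes the ratings are stored as percentage distributions per criterion, which is our representation of the table rather than the authors' instrument; the figures used are the overall-process rows of Table 10.

```python
def satisfaction_factor(distribution):
    """Share of 'good' plus 'very good' ratings for one criterion (percentages)."""
    return distribution.get("good", 0.0) + distribution.get("very good", 0.0)

# Overall-process rows of Table 10; ratings below 'good' are grouped as 'lower'.
overall = {
    "feasibility": {"lower": 22.2, "good": 55.6, "very good": 22.2},
    "usability":   {"lower": 16.7, "good": 77.8, "very good": 5.6},
    "utility":     {"lower": 20.0, "good": 46.7, "very good": 33.3},
}
for criterion, dist in overall.items():
    print(criterion, round(satisfaction_factor(dist), 1))
# feasibility 77.8, usability 83.4, utility 80.0 -- around the 80% reported
```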
Fig. 7. WS 12: relationship matrices for the XCom and YCom cases. Each pair of proposed new actions (A1–A8 for XCom, A1–A9 for YCom) is scored on a scale from -2 (high trade-off) and -1 (moderate trade-off), through 0 (neutral), to 1 (moderate support) and 2 (high support), and each action's scores are totalled into points that rank the set and support the exclusion analysis of trade-offs.
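The ranking step behind WS 12 can be sketched as follows: each pair of proposed actions receives a score from -2 (high trade-off) to +2 (high support), and each action is ranked by the sum of its pairwise scores. The function below is our illustration of that rule, using made-up scores rather than the case data.

```python
from itertools import combinations

def rank_actions(actions, pair_scores):
    """Rank actions by the total of their pairwise support/trade-off scores.

    pair_scores: dict mapping frozenset({a, b}) -> score on the -2..+2 scale
                 of Fig. 7 (missing pairs count as neutral, 0).
    """
    totals = {a: 0 for a in actions}
    for a, b in combinations(actions, 2):
        score = pair_scores.get(frozenset({a, b}), 0)
        totals[a] += score
        totals[b] += score
    return sorted(totals.items(), key=lambda item: -item[1])

# Hypothetical example with three actions.
actions = ["A1", "A2", "A3"]
pair_scores = {
    frozenset({"A1", "A2"}): 2,   # A1 and A2 strongly reinforce each other
    frozenset({"A1", "A3"}): -1,  # A1 and A3 involve a moderate trade-off
    frozenset({"A2", "A3"}): 1,
}
print(rank_actions(actions, pair_scores))
# -> [('A2', 3), ('A1', 1), ('A3', 0)]
```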
Table 8. WS 15: new actions description (source: authors).

(a) XCom, validated action A2: invest in new technologies, hire experts, and develop focused training; the investments cover technical field work and laboratory activities, and the technologies relate to WLL, optical systems, DMA, TDMA, GSM, and VoIP. 5W+1H description. What: invest in new technologies, hire experts, and develop focused training, in a proportion representing 10% of the company's net income. Why: keep the technological infrastructure up to date and continuously develop employees' competencies. Where: laboratory units. When: April 2005. Who: hardware maintenance service manager. How: specific projects financed by new revenues.

(b) YCom, validated action A5: increase synergy with clients through social events and meetings. 5W+1H description. What: improve contacts and empathy with clients. Why: identify new opportunities for business and process improvement. Where: venues structured and hired for that purpose. When: meetings organized every 2 months. Who: coordination assigned to the marketing and commercial team. How: prepare an agenda for the whole year, identifying companies and people.
The main characteristic of the developed process is that a theory for action is being developed and tested, and this dialectic approach contributes to theory consolidation. The present work thus responds to Slack et al.'s (2004) considerations about the main role of the OM area, which is the continuous reconciliation between theory and practice. In fact, the action plan proposition and the performance measurement subsystem redesign are practical procedures for testing an operations strategic management theoretical framework.
Table 9. WS 16: performance measures revision (adapted from Neely et al., 1997).

(a) XCom. Performance measure: net income behaviour. Purpose: to track specific test services, chosen for their contribution to economic performance and certified by the clients. Related to: laboratory certification (by client). Target: increase by 20% the economic results produced by tests. Formula: income produced by certified test services / income produced by all tests. Frequency: monthly (cumulative results). Who measures: laboratory administrative staff. Data source: accounting department. Who manages the results: sales manager and the main board. What they do: monitoring, analysis, and decision making.

(b) YCom. Performance measure: client improvement synergy. Purpose: to sustain and identify business opportunities. Related to: business development and portfolio improvement. Target: annual agenda. Formula: number of new opportunities identified / number of social events ≥ 1. Frequency: every 2 months, starting in February 2004. Who measures: sales manager. Data source: sales department. Who manages the results: main board of directors. What they do: prospect the potential of new contracts and clients.

Table 10. (Re)design process evaluation (source: authors). Percentage of participants' ratings per criterion, on a five-point scale from very poor to very good; ratings below good are grouped here as 'lower'.
- WSH1: feasibility 62.9% good, 34.3% very good (2.9% lower); usability 61.1% good, 36.1% very good (2.8% lower); utility 60.4% good, 37.7% very good (1.9% lower).
- WSH2: feasibility 71.8% good, 17.9% very good (10.3% lower); usability 75.0% good, 15.0% very good (10.0% lower); utility 80.9% good, 8.5% very good (10.6% lower).
- WSH3: feasibility 53.8% good, 34.6% very good (11.5% lower); usability 60.9% good, 21.6% very good (13.0% lower); utility 60.0% good, 27.5% very good (12.5% lower).
- WSH4: feasibility 69.7% good, 24.2% very good (6.2% lower); usability 72.7% good, 18.2% very good (9.1% lower); utility 80.0% good, 12.5% very good (7.5% lower).
- Overall process: feasibility 55.6% good, 22.2% very good (22.2% lower); usability 77.8% good, 5.6% very good (16.7% lower); utility 46.7% good, 33.3% very good (20.0% lower).
The present methodology contributes to the virtuous cycle of design, use, and implementation, establishing, through the strategic management system structure and the management procedures, the necessary conditions for the development of a sustainable operation, an operation that can be characterized by an evolutionary perspective (Pandza et al., 2003), by the formal management of market and technological requirements (Minarro-Viseras et al., 2005), and by the development of a multivariable strategic management system (Sahay, 2005; Acur and Bititci, 2004). The redesign of the performance measurement system according to an action plan ensures a strategic management (strategic control) capability for tracking the deliberated operations strategy. The virtuous cycle of reviewing and redesigning the operations strategic management system also contributes to establishing consistency over time; that is, it creates a long-term path of continuous improvement and learning. This summarizes the refinement of the process during the development and testing of the methodology and configures the process for future validation tests.
5. Conclusion

In general terms, it is important to recognize the strategic learning process that is developed during the application of the procedures.
Some components of the entire process can be seen from a knowledge operations strategy perspective. Basically, this can be identified as an interactive knowledge diffusion and sharing process, renewing the operations strategy knowledge basis. Strong communication and learning processes are established, grounding the change process that is started by the redesign initiative. It is important to point out that both companies have ISO 9001:2000 certified processes and information available to be systematized in a strategic OM redesign process. They created a methodology for giving consistency to their strategic planning processes, guaranteeing quality standards in their management systems, particularly those related to operations strategy and performance measurement.

It is important to note that the methodology, process development, and implementation are not directly validated by the studied cases; that is, the case studies were used only for testing and illustrating the proposed procedures. The newly refined process should be tested to construct its internal validation and, after that, to develop its external generalization. These perspectives guide the future development of the methodology.

The main characteristic of the developed process is that a theory for action is being developed and tested, and this dialectical approach contributes to theory consolidation. The present work contributes to a better understanding of the main role of the OM area, which is the continuous reconciliation between theory and practice. A simple procedure was used for ranking the action plan; it is recognized that the developed methodology can be improved, but the procedure satisfied the research needs. The achieved results show how important it is to establish a theoretical framework and a guiding intervening process; both constitute a powerful schema for mobilizing people and resources in reviewing strategy and performance measures. Although the developed procedures are gap-biased, they reveal real conditions for action planning development; that is, they create a real context for 'problem solving'. The developed procedure reinforces some complementary features of manufacturing strategy paradigms, in the sense that it points out the importance of strategic fit, particularly at the operations system level, developing a rationale for strategic choices (the new actions selection procedure) and drawing on best practices implementation for achieving the stated strategic objectives (gap treatment and internal analysis).

References

Acur, N., Bititci, U., 2004. A balanced approach to strategy process. International Journal of Operations and Production Management 24 (4), 388–408.
Acur, N., Englyst, L., 2006. Assessment of strategy formulation: how to ensure quality in process and outcome. International Journal of Operations and Production Management 26 (1), 69–91.
Amoako-Gyampah, K., Acquaah, M., 2008. Manufacturing strategy, competitive strategy and firm performance: an empirical study in a developing economy environment. International Journal of Production Economics 111 (2), 575–592.
References

Acur, N., Bititci, U., 2004. A balanced approach to strategy process. International Journal of Operations and Production Management 24 (4), 388–408.
Acur, N., Englyst, L., 2006. Assessment of strategy formulation: how to ensure quality in process and outcome. International Journal of Operations and Production Management 26 (1), 69–91.
Amoako-Gyampah, K., Acquaah, M., 2008. Manufacturing strategy, competitive strategy and firm performance: an empirical study in a developing economy environment. International Journal of Production Economics 111 (2), 575–592.
Argyris, C., 1993. Knowledge for Action: A Guide to Overcoming Barriers to Organizational Change. Jossey-Bass, San Francisco.
Brown, S., Fai, F., 2006. Strategic resonance between technological and organisational capabilities in the innovation process within firms. Technovation 26 (1), 60–75.
Brown, S., Squire, B., Blackmon, K., 2007. The contribution of manufacturing strategy involvement and alignment to world-class manufacturing performance. International Journal of Operations and Production Management 27 (3), 282–302.
Chen, D., 2005. Enterprise-control system integration: an international standard. International Journal of Production Research 43 (20), 4335–4357.
Coughlan, P., Coghlan, D., 2002. Action research for operations management. International Journal of Operations and Production Management 22 (2), 220–240.
Crowe, T.J., Rolfes, J.D., 1998. Selecting BPR projects based on strategic objectives. Business Process Management Journal 4 (2), 114–136.
Díaz Garrido, E., Martín Peña, M.L., García Muíña, F., 2007. Structural and infrastructural practices as elements of content operations strategy: the effect on a firm's competitiveness. International Journal of Production Research 45 (9), 2119–2140.
Englyst, L., 2003. Operations strategy formation: a continuous process. Integrated Manufacturing Systems 14 (8), 677–685.
Faisal, M.N., Banwet, D.K., Shankar, R., 2006. Supply chain risk mitigation: modelling the enablers. Business Process Management Journal 12 (4), 535–552.
Fitzgerald, L., Johnston, R., Brignall, S., Silvestro, R., Voss, C., 1991. Performance Measurement in Service Business. CIMA, London.
Folan, P., Browne, J., 2005. A review of performance measurement: towards performance management. Computers in Industry 56 (7), 663–680.
Frohlich, M.T., Dixon, J.R., 2001. A taxonomy of manufacturing strategies revisited. Journal of Operations Management 19 (5), 541–558.
Ghalayini, A.M., Noble, J.S., Crowe, T.J., 1997. An integrated dynamic performance measurement system for improving manufacturing competitiveness. International Journal of Production Economics 48 (3), 207–225.
Goh, T.N., Xie, M., 2004. Improving on the six sigma paradigm. The TQM Magazine 16 (4), 235–240.
Gomes, C.F., Yasin, M.M., Lisboa, J.V., 2004. A literature review of manufacturing performance measures and measurement in an organizational context: a framework and direction for future research. Journal of Manufacturing Technology Management 15 (6), 511–530.
Greasley, A., 2004. Process improvement within a HR division at a UK police force. International Journal of Operations and Production Management 24 (3), 230–240.
Grössler, A., 2007. A dynamic view on strategic resources and capabilities applied to an example from the manufacturing strategy literature. Journal of Manufacturing Technology Management 18 (3), 250–266.
Henry, J.F., 2006. Management control systems and strategy: a resource-based perspective. Accounting, Organizations and Society 31 (6), 529–558.
Joshi, M.P., Kathuria, R., Porth, S.J., 2003. Alignment of strategic priorities and performance: an integration of operations and strategic management perspectives. Journal of Operations Management 21 (3), 353–369.
Kaplan, R.S., Norton, D.P., 1992. The balanced scorecard: measures that drive performance. Harvard Business Review 70 (1), 71–79.
Kaplan, R.S., 1998. Innovation action research: creating new management theory and practice. Journal of Management Accounting Research 10 (1), 89–118.
Keegan, D.P., Eiler, R.G., Jones, C.R., 1989. Are your performance measures obsolete? Management Accounting 70 (12), 45–50.
Lu, Q., Botha, B., 2006. Process development: a theoretical framework. International Journal of Production Research 44 (15), 2977–2996.
Lynch, R.L., Cross, K.F., 1991. Measure Up: The Essential Guide to Measuring Business Performance. Mandarin, London.
Maslen, R., Platts, K.W., 2000. Building manufacturing capabilities. International Journal of Manufacturing Technology and Management 1 (4/5), 349–365.
Melnyk, S.A., Stewart, D.M., Swink, M., 2004. Metrics and performance measurement in operations management: dealing with the metrics maze. Journal of Operations Management 23 (3), 209–217.
Mills, J.F., Platts, K.W., Neely, A.D., Richards, A.H., Gregory, M.J., Bourne, M.C.S., 2002. Creating a Winning Business Formula. Cambridge University Press, Cambridge.
Minarro-Viseras, E., Baines, T., Sweeney, M., 2005. Key success factors when implementing strategic manufacturing initiatives. International Journal of Operations and Production Management 25 (2), 151–179.
Mintzberg, H., 1978. Patterns in strategy formulation. Management Science 24 (9), 934–948.
Munive-Hernandez, E.J., Dewhurst, F.W., Pritchard, M.C., Barber, K.D., 2004. Modelling the strategy management process: an initial BPM approach. Business Process Management Journal 10 (6), 691–711.
Neely, A.D., 2005. The evolution of performance measurement research: developments in the last decade and a research agenda for the next. International Journal of Operations and Production Management 25 (12), 1264–1277.
Neely, A.D., Gregory, M.J., Platts, K.W., 2005. Performance measurement system design: a literature review and research agenda. International Journal of Operations and Production Management 25 (12), 1228–1263.
Neely, A.D., Adams, C., Kennerley, M.P., 2002. Performance Prism: The Scorecard for Measuring and Managing Stakeholder Relationships. Financial Times/Prentice Hall, London.
Neely, A.D., Richards, H., Mills, J.F., Platts, K.W., Bourne, M.C.S., 1997. Designing performance measures: a structured approach. International Journal of Operations and Production Management 17 (11), 1131–1152.
Nonaka, I., Takeuchi, H., 1995. The Knowledge Creating Company. Oxford University Press, New York.
Olhager, J., Rudberg, M., 2002. Linking manufacturing strategy decisions on process choice with manufacturing planning and control systems. International Journal of Production Research 40 (10), 2335–2351.
Olhager, J., Selldin, E., 2007. Manufacturing planning and control approaches: market alignment and performance. International Journal of Production Research 45 (6), 1469–1484.
Pandza, K., Polajnar, A., Buchmeister, B., Thorpe, R., 2003. Evolutionary perspectives on the capability accumulation process. International Journal of Operations and Production Management 23 (8), 822–849.
Platts, K.W., Mills, J.F., Bourne, M.C.S., Neely, A.D., Richards, A.H., Gregory, M.J., 1998. Testing manufacturing strategy formulation processes. International Journal of Production Economics 56–57 (1), 517–523.
Platts, K.W., Mills, J.F., Neely, A.D., Gregory, M.J., Richards, A.H., 1996. Evaluating manufacturing strategy formulation processes. International Journal of Production Economics 46–47 (1), 233–240.
Platts, K.W., 1994. Characteristics of methodologies for manufacturing strategy formulation. Computer Integrated Manufacturing Systems 7 (2), 93–99.
Platts, K.W., 1993. A process approach to researching manufacturing strategy. International Journal of Operations and Production Management 13 (8), 4–17.
Platts, K.W., Gregory, M.J., 1990. Manufacturing audit in the process of strategy formulation. International Journal of Operations and Production Management 10 (9), 5–26.
Pun, K.F., 2004. A conceptual synergy model of strategy formulation for manufacturing. International Journal of Operations and Production Management 24 (9), 903–928.
Pun, K.F., 2005. An empirical investigation of strategy determinants and choices in manufacturing enterprises. Journal of Manufacturing Technology Management 16 (3), 282–301.
Sage, A.P., 1977. Interpretive Structural Modelling: Methodology for Large-Scale Systems. McGraw-Hill, New York.
Sahay, B.S., 2005. Multi-factor productivity measurement model for service organisation. International Journal of Productivity and Performance Management 54 (1), 7–22.
Shaw, D., Edwards, J.S., 2006. Manufacturing knowledge management strategy. International Journal of Production Research 44 (10), 1907–1925.
Silveira, G.J.C., 2005. Market priorities, manufacturing configuration, and business performance: an empirical analysis of the order-winners framework. Journal of Operations Management 23 (6), 662–675.
Skinner, W., 1969. Manufacturing: missing link in corporate strategy. Harvard Business Review 45 (3), 136–145.
Slack, N., Lewis, M., Bates, H., 2004. The two worlds of operations management research and practice: can they meet, should they meet? International Journal of Operations and Production Management 24 (4), 372–387.
Slack, N., Lewis, M., 2002. Operations Strategy. Pearson Education, Harlow.
Slack, N., 2000. Flexibility, trade-offs and learning in manufacturing system design. International Journal of Manufacturing Technology and Management 1 (4/5), 331–348.
Sousa, G.W.L., Groesbeck, R.L., 2004. Enterprise engineering: managing dynamic complexity and change at the organizational level. In: Proceedings of the 2004 American Society for Engineering Management Conference, Alexandria, USA.
Tam, A.S.M., Chu, L.K., Sculli, D., 2001. Business process modelling in small to medium sized enterprises. Industrial Management and Data Systems 101 (4), 144–152.
Tan, K.H., Platts, K.W., 2005. Effective strategic action planning: a process and tool. Business Process Management Journal 11 (2), 137–157.
Tan, K.H., Platts, K.W., 2007. Linking operations objectives to actions: a plug and play approach. International Journal of Production Economics, in press, doi:10.1016/j.ijpe.2007.02.032.
Teece, D., Pisano, G., Shuen, A., 1997. Dynamic capabilities and strategic management. Strategic Management Journal 18 (7), 509–533.
Tranfield, G.J.C., Denyer, D., Burr, M., 2004. A framework for the strategic management of long term assets (SMoLTA). Management Decision 42 (2), 277–291.
Yeung, A.C.L., Lai, K.H., Yee, R.W.Y., 2007. Organizational learning, innovativeness, and organizational performance: a qualitative investigation. International Journal of Production Research 45 (11), 2459–2477.
Zilbovicius, M., 1997. Modelos para a produção e produção de modelos (Models for production and the production of models). Ph.D. Thesis, University of São Paulo, Brazil.