Automation in Construction 32 (2013) 14–23
Development of CDPM matrix for the measurement of collaborative design performance in construction
Z. Ren a,⁎, C.J. Anumba b, F. Yang c
a University of Glamorgan, UK / Hong Kong Polytechnic University, Hong Kong
b The Pennsylvania State University, USA
c Mott MacDonald, UK
Article history: Accepted 7 November 2012; Available online 13 March 2013
Keywords: Collaborative design; Criteria; Design performance; Indicator; Measurement
Abstract
The widely realized importance of collaborative design and work has led to the development of frameworks/tools to support collaboration in the construction industry. However, there is a lack of widely accepted indicators and criteria to assess the performance of collaborative designs. This paper aims to develop a matrix which could be used to measure the performance of a collaborative design. The criteria involved will, in turn, provide a guideline for the improvement of the final design output. The research involved a literature review and in-depth focus group workshops. The outcome of the study is a collaborative design performance measurement (CDPM) matrix that addresses 6 indicators and 42 detailed criteria. The matrix can be applied to support design teams in measuring and improving their performance, by reviewing and modifying collaborative design development, identifying the design teams' strengths and weaknesses, improving communication and suggesting suitable responsive actions.
1. Introduction

Construction design is a complex engineering activity that requires collaboration between multi-disciplinary design teams, where difficult compromises need to be made to achieve a balance between competing objectives such as safety, reliability, performance and cost. A typical building design involves a wide range of disparate disciplines – architecture, structural engineering, building services, quantity surveying – working together for a relatively short period on the design and construction of a building. More specialists (e.g. seismic, hydraulic and pipeline engineers) are involved in infrastructure design. Each designer makes decisions based on the design requirements, constraints and inputs from other disciplines. Often, designers come from different countries with different backgrounds and are familiar with different design codes. Given this fragmentation of knowledge, the importance of collaborative working within the building design environment is now being recognized as a way of improving design performance and final outcomes. Collaborative design is considered as a process in which design team members actively communicate and work together in order to jointly establish design goals, search through design problem spaces, determine design constraints and construct a design solution (Zha and Du, 2006). In this study, collaborative design includes situations where the design teams involved belong to the same or different organizations and are located together or at a distance.
Much research has been done to improve collaborative design performance in different industries such as construction [55,71,72] and manufacturing ([10,23]; Lahti et al., 2004). Much attention has been given to the establishment of collaborative design frameworks and environments which facilitate information sharing, task coordination, group decision making and conflict resolution, drawing on new design theories, technologies and tools ([55,62]; Yang and Bouchlaghem, 2006). On the other hand, researchers (e.g. [26,43,73]) point out that collaborative design can be improved by measuring its performance. However, existing Design Performance Measurement (DPM) frameworks (e.g. [14,69]) are not suitable for measuring collaborative design [43]. A major reason is that DPM frameworks are developed to measure the performance of an individual design team rather than a group of design teams who have diversified knowledge and expertise, represent different disciplines and organizations, and operate in a highly complex and dynamic environment. Other challenges, such as limited or inaccurate information during the design stage, also impede the measurement and improvement of collaborative design performance. Even though some CDPM matrices have been developed in the manufacturing, aviation and new product design (NPD) sectors, they are tailored to the specific environments and strategies of those industries (e.g. [40,49,73]) and are not suitable for building design. This research aims to develop a CDPM matrix in order to measure the performance of collaborative design in the construction industry. A CDPM matrix provides a compact representation of collaborative design indicators and detailed CDPM criteria under each indicator for measuring collaborative design performance [60]. This research thus focuses on
Fig. 1. Research methodology (research activities and deliverables: literature review → potential CDPM indicators; workshop → key CDPM indicators; further literature review and group discussion/brainstorming → potential and suitable criteria for each key CDPM indicator; second workshop → key criteria for each CDPM indicator).
two studies: the identification of the key CDPM indicators and the exploration of the specific criteria within each of the CDPM indicators. Fig. 1 illustrates the research process which includes four major stages. Literature review, group discussion and workshops are the major approaches adopted.
• Firstly, an extensive literature review was conducted to identify the potential CDPM indicators. Publications were drawn mainly from DPM in building design and CDPM in the manufacturing and NPD sectors.
• Secondly, a workshop was organized to identify the key CDPM indicators, since a workshop is a quick and effective means of obtaining rich information about participants' opinions and deeper insights, and new ideas can be developed based on one another's responses and group discussion (Krueger, 2000).
• Thirdly, a further literature review was undertaken to study the specific criteria for each of the identified key CDPM indicators. Brainstorming within the research team was undertaken to identify and classify the potential CDPM criteria.
• Finally, a second workshop was organized to prioritize and validate the detailed criteria for each CDPM indicator based on the results of the literature review and group discussion.

2. Identification of the key CDPM indicators
2.1. Investigation of the potential CDPM indicators
The research work included two stages: 1) exploring the possible collaborative design indicators for CDPM and 2) identifying the key indicators.
The potential CDPM indicators were explored through the literature review. Key words such as collaborative design, design performance measurement, design management, building design quality and performance measurement were used to search for the related publications. The sources include journals and books in construction design and management, as well as online sources related to product development, design and engineering management, and performance measurement. In general, there are two types of DPM indicators: product-based and process-based (Bruce & Bessant, 2002). The former concentrates on measuring design performance based on the final product – a completed design/product/building – with indicators such as desirability, buildability, integrity, novelty, usability, esthetics, function, reliability and longevity [14,45].
Table 1. Summary of the CDPM indicators (indicator: examples of detailed criteria; sources).
Collaboration: communication quality; information sharing; workflow; standards & codes; language barriers; supporting tools. Sources: [10–13,17,26,30,55,64,72]; Kvan (2000).
Conflict resolution: ability to make compromises; avoidance of sub-optimization; resolution of design conflicts; prediction of conflicts. Sources: [17,54,72]; Anumba et al. (2003).
Cross-disciplinary integration: establishing a common language; sharing parameters; interaction parameters; functional openness. Sources: [9,16,17,48,65,72].
Meeting the client's requirements: clarification of client's requirements; client acceptance; client satisfaction; opportunity for future projects; numbers of clients; value to the client; number of complaints from the client. Sources: Formoso et al. (2011); Yu et al. (2011); [5,7,26,30,38,39,50,58,61,65].
Development process: design process plan; design stages; design methods; constant review; progress meetings. Sources: [17,32,44,50,52,70].
Effectiveness: meeting design quality requirements; understanding design brief and rationale; number of design options generated; number of design reviews; use of effective tools for design analysis and communication. Sources: [8,32,34,48,50–52,54,59]; Campion and Medsker (1993).
Efficiency: design development time; application of computer-aided design; productivity; R&D efficiency. Sources: [36,54]; Hultink and Commandeur (2003); [32,46].
Innovation: adoption of new technologies; innovative solutions to design problems; sustainable competitive advantage. Sources: [2,5,6,18,20,24,38,48,57,67].
Management: incentive for collaboration; risk management; decision making; planning & control; conflict management; resource management. Sources: [1,6,21,65,70]; Cooper (2003); [11,41–43].
Organizational factors: organization structure; vision; profit goals; market share target; policy; internal/external relations. Sources: [1,5,6,10,24,48]; Leenders and Wierenga (2002); [11,13,34,41].
Product-level/quality indicators: meeting the quality requirements; functionality; performance; design adaptability; design flexibility; uniqueness; traceability; constructability; environmentally friendly; sustainable competitive advantage; social impacts. Sources: [2,14,15,19,24,28,30,33,47,51,68].
Learning: technical know-how improvement; knowledge capture; learning approaches; learning objectives; incentives for learning and sharing knowledge; exploring and acquiring skills; learning curve; self-learning. Sources: [3,4,11,22,53,55,66]; Busby (1999); [27]; Cohen et al. (1996); [29].
External/design market environment: leading to future projects; ability to track design market trends; awareness of technology changes; understanding of business culture; environmental and regulatory requirements. Sources: [6,18,33,34,36,50,52,63,67].
Table 2. First workshop participants by role (27 in total): seismic analysis, hydraulic engineer, architect, structural engineer, M&E engineer, building services, IT support, client and researcher, drawn from Italy, Norway, the UK, Denmark, Malaysia, Sri Lanka, the UAE, China and Hong Kong.
The latter focuses on measuring design performance derived from the design process, with indicators such as effectiveness, efficiency, learning, communication, collaboration and management ([73]; Maier et al., 2006). Both types of indicator have been developed in the construction industry (e.g. [14,19,37,69]). For example, [14] identified three DPM indicators to assess the quality of design, which are: functionality (the arrangement, quality and interrelationship of spaces and how the building is designed to be useful to all), quality (the engineering performance of the building, which includes structural stability and the integration, safety and robustness of the systems, finishes and fittings) and impact (the building's ability to create a sense of place and have a positive effect on the local community and environment). [69] explored the design (process) performance from both the design firm's and the project's points of view. They identified financial and cost-based indicators, design reviews and quality indicators, time-based indicators, client feedback, benchmarks with competitors and measuring out-sourced design. Based on these, they identified six design performance areas: client needs, integrating the project into
design aims, project design processes, external design, profitability and efficiency of projects, and learning and innovation. They concluded that current DPMs were unreliable and patchy and lacking in factual information. No CDPM study has been done in the construction industry. Given the limited literature about construction design, the literature review was expanded to other sectors (e.g. NPD, manufacturing, aircraft and IT) where CDPM tools have been developed from existing DPM research such as modeling of design development performance (e.g. [51]) and guidelines for performance measurement design (e.g. [25,49]). Most of the CDPM studies concentrate on exploring indicators for measuring product design and detailed criteria for measuring these indicators, such as efficiency, effectiveness, collaboration and the novelty of the technology [5,36,51,67,73]. Thirteen indicators which highlight the key issues for collaborative design were derived from the literature review in construction, NPD and manufacturing design (Table 1). For instance, design efficiency supports the design team's delivery of high quality products and services on time and at a low cost [48]. Innovation is critical in building
Table 3. Potential criteria identified for each CDPM indicator.
Client's involvement: Mission, objectives, vision, strategies, business need, ability of briefing, capturing client's requirements, interpreting or analyzing the requirements, helping clients to clarify their needs, level of client acceptance/satisfaction, opportunity for future projects, numbers of clients, value to the client, number of complaints from the client, client's leadership, clients driving innovation, client's value perception, tools and methods for briefing clients, contract signed with the client, client's requirements on value engineering, client's knowledge, public or private, client's monitoring process, client's assessment criteria, level of involvement, level of interference, business nature, regulatory requirements, environmental requirements, function analysis, client's payment, time allowed by the client for the design.
Collaboration: Ability to make compromises, shared problem-solving, clear team goals, quality assurance system, environment for open dialog, communication quality, communication approach, conflict management, collaborative decision-making, cross-functional collaboration, clear roles and responsibilities, stress management, establishing a common language, functional openness, good technical know-how, high morale to collaborate, informal network position, information processing, performance measurement and monitoring, self-presentation, trust, task interdependence, team satisfaction, team-justification, understanding of each other's protocols, working with enthusiasm, cross-border issues, low turnover of personnel, involvement of contractor or supplier, regular meetings, non-adversarial environment, the timeframe strictly observed, fair pain/gain share, a good record of collaborating on previous projects, team spirit, reasonable profit margin, fair risk allocation, relationships managed, early warning systems, streamlined administrative processes, promoting long-term relationships, focusing on better value, effective information sharing, encouraging innovation, win/win outcome, a holistic view of parties' positions, transparency, physical distance.
Efficiency: Ability to work under pressure, work planning, identifying deviations from plan, effective procedures to meet the schedule, decision-making efficiency, simplifying complex design questions, adoption of effective design-support tools, familiarity with the design, exploring and acquiring skills, learning curve, self-learning, self-knowledge, information recalling, information sharing speed, timeliness of feedback, means of communication, concurrent working, perceived time efficiency, phased design review process, efficient problem solving, process adaptability, process formality, process knowledge, project duration, R&D process well planned, efficient resource usage, time available to study, penalty for delay.
Effectiveness: Adoption of new technologies, client's leadership, understanding design rationale, delivering to the brief, computer-aided design & engineering, concurrent engineering, design quality guidelines met, early involvement of contractor, early supplier involvement, use of prototypes, establishing common database, external think tank, adopting value engineering, fast and detailed feedback, linking authority and responsibility, high quality of joint-supplier design, identifying improvement actions for future projects, improving causal process models, managing mistakes, number of design reviews, number of milestones, normative influence, overall program success, performing root-cause analysis, personally responsible/work ownership, risk adjustment, self-justification, self-preferences, alignment of design codes, testing concept's technical feasibility.
Product-based indicators: Functionality (e.g. capacity, density, privacy, use of site, spatial qualities, orientation, acoustics, artificial and natural light, finishes and fittings), quality (e.g. meeting quality standards, performing according to the client's requirements, material quality, structural stability and elegance, robustness, reliability and durability), buildability, completeness, flexibility and adaptability, performance and usability (e.g. thermal comfort, maintenance, integration, user control, value to the users, connectivity, accessibility), traceability and responsibility, distinctiveness and attractiveness (e.g. character, novelty, external form, vision, symbolic fit, desirability), impact on the public (e.g. enrichment to environment, sustainable and ecological influence on the local community and other stakeholders, urban and social integration, civic contribution), sustainability (e.g. minimizing waste of materials, pollution both in construction and in use, increasing energy efficiency, reducing whole-life costs to manage, clean and maintain), security, safety and health (e.g. internal environment, safe to construct and use, integration of health and safety, influence on users both spiritually and physically, being attractive and healthy for users and the public), regulation and legibility.
Innovation and learning: Contribution to design goals, competitive advantages, competitive reaction, enhancing client acceptance creatively, delivering client needs, leading to future opportunities, market chance, market newness, market familiarity, market potential, meeting quality guidelines, newness to clients, better value for money, incentive for innovation, design uniqueness, right innovative concept for implementation, technical feasibility, technology novelty, time-based competition, risk avoidance, reward for learning, cross-discipline learning, self-learning, cross-organization learning, learning skill, learning culture, company's priority on learning, learning strategies, learning approach, learning outcome, knowledge accumulation, dissemination of learning.
Table 4. Participants in the second workshop (role: number of participants, countries).
Foundation engineer: 1 (US).
Hydraulic engineer: 3 (Norway 1, UK 1, US 1).
Architect: 2 (UK).
Structural engineer: 5 (UK 2, Denmark 1, Malaysia 2).
M&E engineer: 3 (Sweden 2, Sri Lanka 1).
Building services: 2 (UK).
IT support: 2 (UK).
Client: 2 (UAE 1, Sri Lanka 1).
Contractor: 2 (Hong Kong).
Researcher: 4 (UK 3, China 1).
competitive advantages and can contribute significantly to a firm's sustainable growth and profitability [56].
2.2. Identification of the key CDPM indicators

To identify the key CDPM indicators in the construction industry, a workshop was organized on 14th and 15th of September 2010. Out of 62 invitees, 22 designers and 5 researchers took part in this event (Table 2). To ensure the quality of the information obtained from the participants, designers were required to have at least five years' working experience and to have participated in two collaborative design projects. The projects involved include residential and commercial buildings, factories, water treatment plants, bridges, pipelines and hydraulic projects. Most of the designers were recommended by one another through their earlier collaborative projects. Researchers were identified through their publications in areas such as the development of collaborative design platforms and collaborative working in design. The agenda, reading materials and questions were sent to the participants a week before the workshop. Eleven participants attended the workshop in person while the others were involved through live online links. Due to the difficulties in finding a common slot for all the participants, the workshop was conducted in three separate sittings. Each sitting included three sessions:
1) A presentation was made by the authors covering the aims of the research; collaborative design practice, models, frameworks and supporting tools; DPM in construction; and DPM/CDPM in other industries, especially in NPD.
2) A general discussion was undertaken with all the participants about the key characteristics, requirements, issues and problems of collaborative design based on their knowledge.
3) The potential indicators identified in the literature review were then discussed, focusing on four questions:
• What is the core meaning of each indicator?
• Are these indicators appropriate to assess CDPM in the construction industry?
• Can any of these indicators be integrated or removed, or are there any other indicators that should be added?
• Can these indicators be measured?
The 13 potential indicators were discussed from both industry and research perspectives. Participants reached a high level of agreement on most of the indicators. For example, indicators such as organizational factors and market environment were considered less important than the other indicators in measuring collaborative design performance in the construction industry, while indicators such as collaboration and cross-functional integration were merged due to their high correlation. On the other hand, opinions were very diverse for a few indicators (e.g. collaboration and design management), as participants argued that the definition and scope of these indicators were not clear. Eventually, six indicators were considered the most important for CDPM in construction: client's involvement, collaboration, efficiency, effectiveness, design outcome and innovation. Compared with the research in other industries, the assessment of collaborative design performance in construction has different focuses. For example, [9,48] and Hull et al. (2004) identified efficiency and effectiveness as the most important performance measurement indicators for collaborative design, and Yin et al. (2010) identified efficiency, effectiveness, collaboration, management skill and innovation as key CDPM indicators in NPD. However, this study suggests that the level to which client needs have been met and the design outcomes are also essential to CDPM in construction.

3. Investigation of detailed CDPM criteria

In order to identify the detailed criteria for each CDPM indicator, a further literature review and group discussion were conducted by the research team and a second workshop was organized with professionals. The former was used to identify the potential criteria for each CDPM indicator and to explore their appropriateness; the latter was adopted to screen and validate the most important criteria for each CDPM indicator.
Fig. 2. Criteria for measuring client's involvement in collaborative design.
Fig. 3. Criteria for measuring collaboration in collaborative design.
3.1. Investigating the potential CDPM criteria

The six indicators were used as keywords to search the literature for potential criteria. 286 criteria were initially identified. After group discussion, 170 were considered suitable for measuring the six CDPM indicators (Table 3).

3.2. Identifying the key CDPM criteria

As pointed out by [31], DPM criteria should be simple, specific, easy to implement and carry clear responsibilities. The 170 criteria are still far too many to apply as an efficient CDPM matrix, and the most important ones needed to be identified in order to make the matrix applicable. As a result, a second workshop was organized in November 2010. The same principles were applied when selecting the workshop participants. 26 designers and researchers attended this workshop (Table 4), 17 of whom had attended the first workshop. This workshop was also organized with in-person and virtual sessions.
The workshop started with a briefing on the six CDPM indicators and the potential criteria. Five key questions were then thoroughly discussed in order to validate and rank the criteria:
• What are the core meanings of each potential criterion?
• Is each identified criterion appropriate to measure the indicator?
• Does a criterion have a close correlation with others?
• Are these criteria adequate to describe each indicator and, if not, do any other criteria need to be added?
• What is the ranking of the criteria?
Through the discussion, the detailed criteria were condensed to 131. In order to rank the criteria, a weighting system was adopted: A = R/N = (∑_{i=1}^{N} c_i r_i)/N, where A represents the average ranking for each criterion, R corresponds to the sum of the weighted ranking scores received for that criterion from the participants, r_i is the individual ranking value (1–5), c_i represents the participant's confidence level in his/her ranking and N is the number of participants who ranked the criterion. Figs. 2 to 7 present the rankings of the top 25 criteria for each indicator.
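To make the weighting scheme concrete, the following is a minimal sketch (not taken from the paper) of how the confidence-weighted average ranking could be computed and the top criteria selected; the criterion names, ranking values and confidence levels shown are hypothetical.

# Python sketch of the ranking scheme A = (sum of c_i * r_i) / N described above.
def average_ranking(votes):
    """votes: list of (ranking 1-5, confidence 0-1) tuples, one per participant."""
    if not votes:
        return 0.0
    # Weight each participant's ranking by his/her confidence, then average over N.
    return sum(confidence * ranking for ranking, confidence in votes) / len(votes)

def top_criteria(votes_by_criterion, n=7):
    """Score every criterion of one indicator and return the n highest-ranked."""
    scored = {name: average_ranking(v) for name, v in votes_by_criterion.items()}
    return sorted(scored.items(), key=lambda item: item[1], reverse=True)[:n]

# Hypothetical usage for a few collaboration criteria.
votes_by_criterion = {
    "Clear/common team goals": [(5, 0.9), (4, 0.8), (5, 0.7)],
    "Effective information sharing": [(4, 0.9), (4, 0.6), (3, 0.8)],
    "Trust": [(2, 0.5), (3, 0.4)],
}
print(top_criteria(votes_by_criterion, n=2))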
Fig. 4. Criteria for measuring effectiveness in collaborative design.
Fig. 5. Criteria for measuring efficiency in collaborative design.
The top seven criteria were then selected for each indicator (a shorter version with the top five criteria has also been developed).
3.2.1. Client's involvement

Fig. 2 illustrates the ranking of the criteria for measuring the client's involvement in collaborative design. Client needs and objectives, capturing the client's requirements, helping the client to clarify his needs, level of client's involvement/interference, client's ability to provide a briefing, client's leadership, and certainty of client's needs/changes initiated by the client during design are the top seven criteria. This result emphasizes that the client's needs and objectives form the most important factor guiding collaboration between different design teams, followed by the design team's ability to capture such requirements and its capability to help the client clarify his needs. These criteria remind the design team that they should focus on capturing, understanding and tracing the client's needs during the design. Compared with other industries (e.g. manufacturing and NPD), the client's involvement plays a much more important role in building design. For example, in the NPD sector, the design of a product is initiated by market requirements (which in turn are based on an understanding of potential clients' requirements), so the client's involvement in the design process is not direct. In contrast, the client is involved in a construction project from the conceptual stage through the design and construction processes. His or her objectives and needs, leadership, level of interference, assessment criteria and possible changing requirements directly affect the design team's effort and outcomes.
3.2.2. Collaboration

Fig. 3 illustrates the ranking of the criteria for the collaboration indicator. Clear/common team goals, effective information sharing, shared problem solving, collaborative decision-making, ability to make compromises, cross-functional collaboration and conflict management are the top seven criteria. This result is broadly consistent with CDPM studies in other industries (e.g. [73]), showing that collaboration in design is led by these key factors largely regardless of the industry concerned (e.g. construction, NPD, manufacturing). On the other hand, compared with [35] and Adepoju (2011), who studied key performance indicators for collaborative working during the construction stage (the former focuses on the collaboration between the client, contractors and suppliers; the latter studies the collaboration between designers and contractors), it was found that factors such as risk sharing, team integration, trust-building and contract issues are less important in collaborative design than in collaborative construction, whilst factors such as having a common goal, effective information sharing and conflict management are common to both; collaborative decision making and shared problem-solving are more important in collaborative design than in general collaborative working. In other words, collaborative design is technology-oriented whilst collaborative working between contractors involves more contractual issues.
Fig. 6. Criteria for the design outcome indicator.
Fig. 7. Criteria for measuring innovation.
3.2.3. Effectiveness

Fig. 4 illustrates the ranking of the criteria for the effectiveness indicator. Delivering to the brief, design quality guidelines met, good technical know-how, fast and detailed feedback, managing mistakes, computer-aided design and clear roles and responsibilities are regarded as the most important criteria for design effectiveness. The overall result is consistent with earlier research in NPD (e.g. [73]). For example, delivering to the brief was ranked as the most critical element of design effectiveness measurement, which echoes the findings of [33,48]. This indicates that delivering to the brief is an important element of effectiveness not only in NPD but also in construction. Despite this, some participants did not agree with including this criterion here because, in their opinion, delivering to the brief is already covered by the client's involvement and design outcome indicators, and meeting design quality requirements/specifications can better represent this criterion for this indicator. Good technical know-how of the design team is ranked as the third most important criterion, which differs from previous study results. When asked for reasons, participants pointed out that good technical know-how in each design team was essential for collaborative design in construction. Closely following this factor, fast and detailed feedback and managing mistakes obtain higher rankings than the other criteria; this result is supported by [55,73].
3.2.4. Efficiency

As shown in Fig. 5, efficient decision-making, efficient problem-solving, appropriate design support tools, planning and control, efficient resource usage, ability to work under pressure, and familiarity with the design were ranked as the most important criteria for design efficiency. Among the top criteria, efficient problem-solving, efficient decision-making and design support tools obtain much higher scores than the others. The first two criteria are consistent with the findings of Yin et al. (2010). As pointed out by [59], it is difficult to make the right decision efficiently due to the competitive pressures, limited resources and accelerating costs in
Table 5. CDPM matrix (criteria for each indicator, listed from most important to less important).
Client's involvement: client needs and objectives; capturing client's requirements; helping client to clarify his needs; level of client's involvement/interference; client's ability in briefing; client's leadership; certainty of client's needs/changes initiated by client during design.
Collaboration: clear/common team goals; effective information sharing; shared problem solving; collaborative decision making; ability to make compromises; cross-functional collaboration; conflict management.
Effectiveness: delivering to the brief; meeting design quality guidelines; good technical know-how; fast and detailed feedback; managing mistakes; computer-aided design; clear roles and responsibilities.
Efficiency: efficient decision-making; efficient problem solving; appropriate design support tools; planning and control; efficient resource usage; ability to work under pressure; familiarity with the design.
Design outcomes: acceptance by the client; functionality; quality; performance and usability; buildability; traceability and responsibility; security, safety and health.
Innovation & learning: achieving building function/performance goals; delivering client needs; technical feasibility; better value; gaining competitive advantages; responding to competition; risk avoidance/mitigation; company's priority on learning; cross-discipline learning; cross-organization learning; learning from self-experience; learning outcomes; learning culture; reward of learning.
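As an illustration only (not part of the paper), the matrix in Table 5 could be held as a simple mapping from indicator to its ranked criteria, which makes it straightforward to tailor the criteria set per project; the Python sketch below shows a subset of the entries, with names taken from Table 5.

# Illustrative sketch: the CDPM matrix as a mapping from indicator to its criteria,
# ordered from most to less important. Only a subset of criteria is shown; design
# teams could extend or prune the lists for a specific project.
CDPM_MATRIX = {
    "Client's involvement": [
        "Client needs and objectives",
        "Capturing client's requirements",
        "Helping client to clarify his needs",
    ],
    "Collaboration": [
        "Clear/common team goals",
        "Effective information sharing",
        "Shared problem solving",
    ],
    "Efficiency": [
        "Efficient decision-making",
        "Efficient problem solving",
        "Appropriate design support tools",
    ],
}

def criteria_for(indicator, top_n=None):
    """Return the ranked criteria recorded for one indicator (optionally the top n)."""
    criteria = CDPM_MATRIX.get(indicator, [])
    return criteria if top_n is None else criteria[:top_n]

print(criteria_for("Collaboration", top_n=2))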
collaborative design. Moreover, decisions made by one party will affect or be affected by other parties' decisions. Therefore, efficient decision-making is crucial for the efficiency of collaborative design. As multi-disciplinary teams are involved in collaborative design, collaborative problem-solving is also critical but much more difficult to achieve due to the teams' different technical know-how, interests and constraints. This finding is supported by Hughes et al. (2010). Appropriate design support tools are regarded as the third most important criterion, which reveals that modern design tools play a significant role in improving design efficiency.

3.2.5. Key criteria for measuring design outcomes

Fig. 6 illustrates the ranking of the detailed criteria for measuring the design outcome indicator. Acceptance by the client, functionality, quality, performance and usability, buildability, traceability and responsibility, and security, safety and health are the top seven criteria. Unlike [14], who identified functionality and quality as the most important criteria to assess the quality of building design, participants in this study ranked acceptance by the client as the most important criterion. This reflects the 'reality' aspects of the industry (i.e. functionality or quality is not as important as acceptance by the client). Traceability and responsibility of the design is also regarded as one of the top criteria, as it is important to establish who would be responsible for a mistake should one occur. One example given by a participant was an arbitration case caused by poor traceability and responsibility between design partners: incorrect design parameters given by three different disciplines eventually led to the failure of a pipeline project, but which of them should bear the responsibility could not be identified. During the discussion, the question of whether design outcomes should be an indicator for CDPM was reiterated. Participants pointed out that the above criteria should be applied both to the completed design and to intermediate design outcomes during the design process.

3.2.6. Criteria for measuring innovation and learning in collaborative design

Fig. 7 presents the ranking of the criteria for measuring innovation in collaborative design. Achieving building function/performance goals, delivering client needs, technical feasibility, better value, gaining competitive advantages, responding to competition, and risk avoidance/mitigation are ranked as the top seven criteria for innovation. Unlike studies in NPD (e.g. Griffin & Page, 1993; [24,32,73]), which identified competitive advantage as the most relevant and important criterion, participants suggested that the essential objective of innovation in collaborative building design was to achieve the functional and performance requirements and deliver the client's needs. This is because studies in NPD mainly focus on an individual company or a product to sell, where innovation is regarded as a key to gaining market competitiveness. In collaborative building design, however, the client is known; innovation in this process is mainly intended to meet the project targets and the client's needs, and is thus short-term based. This is supported by the third most important criterion, technical feasibility, which is ranked much higher than the criteria measuring the potential benefits of innovation (9th–12th), and is also reflected in the high ranking of risk avoidance/mitigation.
On the other hand, gaining competitive advantage and responding to competition are ranked as the fourth and fifth most important criteria. This shows that participants still agree that innovation behavior depends on whether the outcome of the design can provide competitive advantages; innovation in design is essential for meeting the long-term goals of most firms. Although learning was not regarded as the most important indicator, participants highlighted the links between innovation and learning. As pointed out by [29], firms need to become as close to 'learning organizations' as possible in order to be effective and innovative in consulting engineering. Company's priority on learning, cross-discipline learning, cross-organization learning, learning from self-experience, learning outcomes, learning culture, and reward of learning are the top seven criteria for the learning indicator. This indicates that a company's priorities on
learning, learning from other parties and learning from the overall team are valued more highly than learning from its own experience in collaborative building design. As one design manager explained, the ability to learn new design knowledge and skills from partners, clients, suppliers and experience can assist firms to become more agile and responsive to changing client needs.

4. CDPM matrix

Based on the above studies, a CDPM matrix was developed which contains the six indicators and 49 detailed criteria (Table 5). Compared with DPM matrices developed in other sectors (e.g. [73]), there are several major differences in this CDPM matrix. Firstly, it highlights the importance of client-related aspects, which demonstrates the fundamental differences between the construction and NPD/manufacturing sectors; secondly, collaboration is strongly emphasized due to the complex and dynamic nature of construction design; and thirdly, learning is attached to innovation as an indicator, with an emphasis on cross-organization and cross-disciplinary learning. Moreover, most of the detailed criteria under each indicator are different.

The proposed CDPM matrix can be used to assess collaborative design performance either for the design process or for the overall design outcomes. Based on the CDPM matrix, collaborative design performance can be measured in terms of the client's involvement, collaboration, efficiency, effectiveness, product-based indicators, and innovation and learning capacity. The results will reveal the design teams' strengths and weaknesses, which supports the design teams in better monitoring, understanding and improving the design process, developing better collaborative plans, and making decisions more efficiently and effectively.

During the collaborative design process, there is a high level of uncertainty in obtaining the necessary and accurate information to assess the design performance. For the successful application of the CDPM matrix, both objective and subjective information need to be gathered and integrated in order to provide substantial and constant information concerning collaborative design.
• Objective information includes solid data about design tasks, process information and outcomes such as time, cost, outputs and statistical data. Such factual information can be collected from design operation records (e.g. design schedule, correspondence, minutes, project brief and plan, budget and results of each design task). This factual evidence can be utilized to measure whether a member meets a design target on time and within budget, whether the team provides fast and detailed feedback and whether the team supports rich information sharing.
• Subjective information covers information from the design teams' experience and subjective judgment, such as trust-building, communication quality and team interaction. Subjective information can be gathered through several means (e.g. a CDPM questionnaire). Multi-feedback approaches should be adopted in order to minimize biased judgment during this process. For example, a team member's performance should be evaluated collectively by all the team members through the CDPM questionnaire, based on the particular design environment and project context.
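Purely as an illustration of how the two kinds of evidence might be combined (the paper does not prescribe an aggregation rule), the sketch below blends an objective score derived from design records with multi-feedback questionnaire ratings into a criterion score, and averages criterion scores into an indicator score; the 0-5 scale and the equal weighting are assumptions.

# Illustrative sketch: combining objective and subjective evidence into CDPM scores.
from statistics import mean

def criterion_score(objective_score, questionnaire_scores, weight_objective=0.5):
    """objective_score: 0-5 value derived from design records (e.g. schedule data);
    questionnaire_scores: 0-5 ratings given by all team members (multi-feedback)."""
    subjective = mean(questionnaire_scores) if questionnaire_scores else 0.0
    return weight_objective * objective_score + (1 - weight_objective) * subjective

def indicator_score(criterion_scores):
    """Average the scores of the criteria measured under one CDPM indicator."""
    return mean(criterion_scores) if criterion_scores else 0.0

# Hypothetical example for the efficiency indicator.
scores = [
    criterion_score(4.0, [4, 5, 3]),     # efficient decision-making
    criterion_score(3.5, [3, 4, 4]),     # efficient problem solving
    criterion_score(4.5, [5, 4, 4, 5]),  # appropriate design support tools
]
print(round(indicator_score(scores), 2))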
The CDPM matrix could be implemented through the following process:
1) Team forming: identify who should be responsible for collecting and assessing the collaborative design performance from each design team (e.g. managers, engineers).
2) Data collection plan: develop a plan that clearly addresses what, how and when the objective and subjective information should be collected, and how it should be communicated between the different teams.
3) Data collection and analysis: collect the information according to the plan, and analyze it with the agreed methods.
4) Improvement actions: develop an improvement plan based on the analysis results, and take improvement actions immediately.
5) Continuous checking: repeat steps 3 and 4 for several iterations (e.g. two to three times) to check whether the response actions have made positive improvements to the design development.

Given the complexity of collaborative design, the proposed CDPM matrix and the implementation plan should be used as guidelines to lead the CDPM operation. The detailed collaborative design performance indicators, criteria and related information should be identified by the design teams according to the specific project features, such as the project nature, size, number of design teams, history of collaboration, and design strategies. The next stage of this research will focus on the automation of the CDPM matrix, which will greatly enhance the efficiency of the system. For example, a multi-agent system enabled CDPM matrix will be able to support the design teams in tailoring the indicators and criteria to each individual design project, collecting and synthesizing the objective and subjective information, and making suggestions for improvements. The key to the development of the intelligent-agent-supported CDPM matrix will be the agents' collaboration and learning mechanisms.

5. Conclusions

This study developed a CDPM matrix to measure collaborative design performance in construction. Six performance indicators (client's aspects, collaboration, effectiveness, efficiency, design outcomes, and innovation and learning), addressed by 49 detailed criteria, have been identified as the most critical factors for CDPM. The research indicates that client needs and objectives and capturing the client's needs are the most important criteria for the client's aspects; clear and common goals and effective information sharing for collaboration; delivering to the brief and meeting design quality guidelines for effectiveness; efficient decision-making and efficient problem solving for efficiency; acceptance by the client and functionality for design outcomes; and achieving building function/performance goals and delivering the client's needs for innovation. This result is broadly consistent with findings in other studies (e.g. [24,69,73]) and highlights the importance of the client-related aspects in collaborative design in construction.

Collaborative design in construction involves participants from different disciplines, organizations and countries with diversified fields of expertise, and the difficulties in collaborative design cannot be underestimated, especially for large and complex projects. The developed CDPM matrix will be an effective tool to measure collaborative design performance in terms of the client aspects, collaboration, effectiveness, efficiency, output and innovation. The results will demonstrate the status of design performance, the strengths and weaknesses of the design teams and, most importantly, the level of collaboration among the teams. The CDPM will lead to a better understanding of the client's requirements, minimized conflicts, reduced risks and improved design outcomes. The design team will also gain a better knowledge of the design performance and be able to improve the design process, provide an appropriate training plan, address weaknesses and make decisions more efficiently and effectively.

Several limitations were observed during the study. For example:
• Many terms are involved in this study, and participants often define these terms differently (e.g.
functionality and performance, innovation, collaboration and management). Accurate definitions should be provided with the CDPM.
• Although it was expected that practitioners and researchers would rank the indicators and criteria differently, the degree of the difference was under-estimated by the authors. For example, factors related to sustainability obtained much higher rankings from researchers than from practitioners. Given the large number of practitioners in the workshops, their opinions outweighed the researchers'. Therefore, sustainability has a low rank within indicators such as design outcomes
and innovation and learning. Similar differences were observed between designers of buildings and of infrastructure, as well as between designers from the UK and those from other countries. These all suggest that a larger pool of participants is necessary in future work.
• Opinions about what type of information should be gathered for different design stages and different projects also varied. This reveals that it is difficult to develop a universal CDPM for building design. Therefore, the proposed CDPM matrix should be used as a guideline to lead the assessment in different collaborative design projects based on the available factual and subjective information.
• The CDPM developed in this study still needs to be validated with real industry cases. For example, questions such as how many criteria (e.g. five, seven or more) should be adopted for each indicator can only be confirmed through real-world industrial implementation.
References
[1] J. Andersen, M. Nycyk, L. Jolly, D. Radcliffe, Design management in a construction company, in: David Radcliffe, Josh Humphries (Eds.), 2005 ASEE/AaeE 4th Global Colloquium on Engineering Education, Sydney, Australia, 2005, pp. 1–10.
[2] J. Alegre, R. Lapiedra, R. Chiva, A measurement scale for product innovation performance, European Journal of Innovation Management 9 (4) (2006) 333–346.
[3] J. Barlow, Innovation and learning in complex offshore construction projects, Research Policy, Special Issue 29 (7–8) (2000) 973–989.
[4] J. Bento, J. Duarte, M.V. Heitor, W.J. Mitchell, Collaborative Design and Learning: Competence Building for Innovation, Praeger Publishers, USA, ISBN 1-56720-545-3, 2004.
[5] C. Bart, A. Pujari, The performance impact of content and process in product innovation charters, Journal of Product Innovation Management 24 (1) (2007) 3–19.
[6] V.S. Bettina, Managing Innovation, Design and Creativity, John Wiley & Sons Ltd., UK, 2008.
[7] U. Bititci, Integrated Performance Measurement System: An Audit Approach, 2002 (Parts 1 and 2: control, February–March).
[8] Bomel Limited, Improving the effectiveness of the Construction Design and Management Regulations 1994: establishing views from construction stakeholders on the current effectiveness of CDM, RR538, Research Report, HSE Books, 2007.
[9] E.U. Bond, B.A. Walker, M.D. Hutt, P.H. Reingen, Reputational effectiveness in cross-functional working relationships, Journal of Product Innovation Management 21 (2004) 44–60.
[10] L. Bstieler, Trust formation in collaborative new product development, Product Innovation Management 23 (2006) 56–72.
[11] M.A. Busseri, J.M. Palmer, Improving teamwork: the effect of self-assessment on construction design teams, Design Studies 21 (2000) 223–238.
[12] N.Y.W. Cheng, Approaches to design collaboration research, Automation in Construction 12 (6) (2003) 715–723.
[13] M.L. Chiu, An organizational view of design communication in design collaboration, Design Studies 23 (2002) 187–210.
[14] CIC, How Well is Your Building Designed? Construction Industry Council Publication, 2003. URL: http://www.dqi.org.uk/DQI/Common/031001_Launch.pdf.
[15] M. Cook, The Design Quality Manual: Improving Building Performance, Wiley, Australia, UK, USA, ISBN 978-1-4051-3088-2, 2007.
[16] R.G. Cooper, E.J. Kleinschmidt, Benchmarking the firm's critical success factors in new product development, Journal of Product Innovation Management 12 (1995) 374–391.
[17] R. Cooper, G. Aouad, A. Lee, S. Wu, A. Fleming, M. Kagioglou, Process Management in Design and Construction, Wiley-Blackwell, Australia, UK, USA, 2004.
[18] E. Danneels, E.J. Kleinschmidt, Product innovativeness from the firm's perspective: its dimensions and their relation with project selection and performance, Journal of Product Innovation Management 18 (2001) 357–373.
[19] DQI, Design Quality Indicator, http://www.dqi.org.uk/website/default.aspa.
[20] A. Dubois, L.E. Gadde, The construction industry as a loosely coupled system: implications for productivity and innovation, Construction Management and Economics 20 (7) (2002) 621–631.
[21] S. Emmitt, Design Management for Architects, Blackwell Publishing Ltd., UK, USA, 2007.
[22] M. Evers, Mechanisms to support organisational learning: the integration of action learning tools into multidisciplinary design team practices, in: Proceedings of the 3rd European Conference on Organisational Knowledge, Learning and Capabilities, Athens, Greece, 2002.
[23] L.Q. Fan, A.S. Kumar, B.N. Jagdish, S.H. Bok, Development of a distributed collaborative design framework within peer-to-peer environment, Computer-Aided Design 40 (9) (2008) 891–904.
[24] D.R. Fell, E.N. Hansen, B.W. Becker, Measuring innovativeness for the adoption of industrial products, Journal of Industrial Marketing Management 32 (2003) 347–353.
[25] P. Folan, J. Browne, A review of performance measurement: towards performance management, Computers in Industry 56 (2005) 663–680.
[26] F.G.L. Forme, V.B. Genoulaz, J. Campagne, A framework to analyse collaborative performance, Computers in Industry 58 (2007) 687–697.
[27] D.M. Gann, A. Salter, Learning and innovation management in project-based firms, in: Proceedings of the 2nd International Conference on Technology Policy and Innovation, Lisbon, 1998.
[28] D.M. Gann, A.J. Salter, J.K. Whyte, Design quality indicator as a tool for thinking, Building Research and Information 31 (5) (2003) 318–333.
[29] D.A. Garvin, Building a learning organization, Harvard Business Review (1993) 78–92.
[30] P. Girard, V. Robin, Analysis of collaboration for project design management, Computers in Industry 57 (2006) 817–826.
[31] S. Globerson, Issues in developing a performance criteria system for an organisation, International Journal of Production Research 23 (1985) 639–646.
[32] A. Griffin, L. Page, PDMA success measurement project: recommended measures for product development success and failure, Journal of Product Innovation Management 13 (1996) 478–496.
[33] S. Hart, E.J. Hultink, N. Tzokas, H.R. Commandeur, Industrial companies' evaluation criteria in new product development gates, Journal of Product Innovation Management 20 (2003) 22–36.
[34] J. Hertenstein, M.B. Platt, D. Brown, Valuing design: enhancing corporate performance through design effectiveness, Design Management Journal 12 (3) (2001) 10–19.
[35] D. Hughes, T. Williams, Z. Ren, Differing perspectives on collaboration in construction, Construction Innovation: Information, Process, Management (2011) (accepted).
[36] F.M. Hull, A composite model of product development effectiveness: application to services, IEEE Transactions on Engineering Management 51 (2004) 162–172.
[37] C.T. Hyun, K.M. Cho, K.J. Koo, A.M. Hong, H.S. Moon, Effect of delivery methods on design performance in multifamily housing projects, Journal of Construction Engineering and Management 134 (2008) 468–483.
[38] C. Ivory, The cult of customer responsiveness: is design innovation the price of a client-focused construction industry? Construction Management and Economics 23 (8) (2005) 861–870.
[39] J.M. Kamara, C.J. Anumba, The 'voice of the client' within a concurrent engineering design context, in: C.J. Anumba, J.M. Kamara (Eds.), Concurrent Engineering in Construction Projects, Taylor and Francis, Abingdon, 2007, pp. 57–79.
[40] M. Kennerley, A. Neely, Measuring performance in a changing business environment, International Journal of Operations & Production Management 23 (2003) 213–229.
[41] L.J. Koskela, P. Huovila, J. Leinonen, Design management in building construction: from theory to practice, Journal of Construction Research 3 (1) (2002) 1–16.
[42] C.H. Loch, U.A.S. Tapper, Implementing a strategy-driven performance measurement system for an applied research group, Journal of Product Innovation Management 19 (2002) 185–198.
[43] J. MacBryde, K. Mendibil, Designing performance measurement systems for teams: theory and practice, Management Decision 41 (8) (2003) 722–733.
[44] M.M. Montoya-Weiss, R. Calantone, Determinants of new product performance: a review and meta-analysis, Journal of Product Innovation Management 11 (1994) 397–417.
[45] J. Moultrie, P.J. Clarkson, D.R. Probert, A tool to evaluate design performance in SMEs, International Journal of Productivity and Performance Measurement, Special edition on Performance in Design and Manufacture 55 (3 & 4) (2006).
[46] L. Nachum, Measurement of productivity of professional services: an illustration on Swedish management consulting firms, International Journal of Operations & Production Management 19 (9) (1999) 922–949.
[47] R.P. Nagarajan, S.J. Passey, P.L. Wong, M.C. Pritchard, G. Nagappan, Performance measures and metrics for collaborative design chain management, in: Proceedings of the 11th International Conference on Concurrent Enterprising, Munich, 2005.
[48] E. Naveh, The effect of integrated product development on efficiency and innovation, International Journal of Production Research 43 (13) (2005) 2789–2801.
[49] A. Neely, M. Gregory, K. Platts, Performance measurement system design: a literature review and research agenda, International Journal of Operations & Production Management 25 (2005) 1228–1263.
[50] R. Nellore, R. Balachandra, Factors influencing success in integrated product development projects, IEEE Transactions on Engineering Management 48 (2) (2001) 164–174.
[51] F.J. O'Donnell, A.H.B. Duffy, Modelling design development performance, International Journal of Operations & Production Management 22 (11) (2002) 1198–1221.
[52] K.S. Pawar, H. Driva, Performance measurement for product design and development in a manufacturing environment, International Journal of Production Economics 60–61 (1999) 61–68.
[53] Z. Ren, C.J. Anumba, Agent learning for multi-agent system, a case study for construction claims negotiation, Advanced Engineering Informatics 16 (2003) 265–275.
[54] Z. Ren, C.J. Anumba, T.M. Hassan, G. Augenbroe, A functional architecture for e-Engineering Hub, Automation in Construction 17 (8) (2008) 930–940.
[55] Z. Ren, F. Yang, N.M. Bouchlaghem, C.J. Anumba, Multi-disciplinary collaborative building design: a comparison between multi-agent system approach and multi-disciplinary optimisation approach, Automation in Construction 20 (5) (2011) 491–660.
[56] S. Salomo, J. Weise, H.G. Gemünden, NPD planning activities and innovation performance: the mediating role of process management and the moderating effect of product innovativeness, Product Innovation Management 24 (2007) 285–302.
[57] A. Salter, R. Torbett, Innovation and performance in engineering design, Journal of Construction Management and Economics 21 (2003) 573–580.
[58] S. Sandesten, M. Bergdahl, The Role and Mission of the Construction Client, Swedish Construction Clients Forum R&D and University Relations, Stockholm, ISBN 91-975824-0-9, 2006.
[59] J.B. Schmidt, M.M. Montoya-Weiss, A.P. Massey, New product development decision-making effectiveness: comparing individuals, face-to-face teams and virtual teams, Journal of Decision Sciences 32 (4) (2001) 575–600.
[60] S.A. Sharif, B. Kayis, DSM as a knowledge capture tool in CODE environment, Journal of Intelligent Manufacturing 18 (2007) 497–504.
[61] Q. Shen, H. Li, J. Chung, P. Hui, A framework for identification and representation of client requirements in the briefing process, Construction Management and Economics 22 (2004) 213–221.
[62] W. Shen, Q. Hao, W. Li, Computer supported collaborative design: retrospective and perspective, Computers in Industry 59 (2008) 855–862.
[63] J.W. Smither, Performance Appraisal: State of the Art in Practice, Jossey-Bass Inc., London, 1998.
[64] L. Soibelman, C. Caldas, Information logistics for construction design team collaboration, in: Proceedings of the Eighth International Conference on Computing in Civil and Building Engineering, 2000.
[65] E. Soltani, R.V.D. Meer, T.M. Williams, P. Lai, The compatibility of performance appraisal systems with TQM principles: evidence from current practice, International Journal of Operations & Production Management 26 (1) (2006) 92–112.
[66] M.M. Somerville, Z. Howard, Information in context: co-designing workplace structures and systems for organizational learning, Information Research 15 (4) (2010), paper 446. URL: http://InformationR.net/ir/15-4/paper446.html.
[67] M.V. Tatikonda, M.M. Montoya-Weiss, Integrating operations and marketing perspectives of product innovation: the influence of organizational process factors and capabilities on development performance, Journal of Management Science 47 (1) (2001) 151–172.
[68] The Scottish Government, Section 6: design quality in building, Procurement, Construction Procurement Manual, ISBN 0 7559 1260 8, 2005.
[69] R. Torbett, A.J. Salter, D.M. Gann, M. Hobday, Design performance measurement in the construction sector, IEEE Transactions on Engineering Management (2001). URL: http://www.sussex.ac.uk/Units/spru/publications/imprint/sewps/sewp66/sewp66.pdf.
[70] G. Tunstall, Managing the Building Design Process, Butterworth-Heinemann, UK, USA, 2006.
[71] X.L. Xue, Z. Ren, Q.P. Shen, A critical review of collaborative working in construction projects: business environment and human behaviors, ASCE Journal of Management in Engineering 26 (2010) 196–208.
[72] F. Yang, Pareto Genetic Algorithm based collaborative optimisation framework in building design, PhD Thesis, Loughborough University, UK, 2009.
[73] Y.Y. Yin, S.F. Qin, R. Holland, Development of a design performance measurement matrix for improving collaborative design during a design process, International Journal of Productivity and Performance Management 60 (2) (2011) 152–184.