An industrial study on the importance of software component documentation: A system integratorʼs perspective

Information Processing Letters 111 (2011) 583–590

An industrial study on the importance of software component documentation: A system integrator’s perspective Sajjad Mahmood ∗ , Azhar Khan Information and Computer Science Department, King Fahd University of Petroleum and Minerals, Dhahran 31261, Saudi Arabia

Article history: Received 28 October 2010; received in revised form 11 March 2011; accepted 15 March 2011; available online 22 March 2011. Communicated by J.L. Fiadeiro.

Keywords: Software engineering; Component based systems; Software components; Integration; Component documentation

Abstract

Component integration is widely recognized as a process which plays a central role in overall Component Based System (CBS) development. A system integrator focuses on assembling existing components, developed by different parties, to build a software system. The integration process usually involves adapting existing component interfaces and writing new functions to handle the mismatches between stakeholder needs and available component features. The lack of detailed component documentation has been a key area of concern in CBS development due to its profound impact on the integration phase of a CBS development life cycle. In this paper, we report the results of an industrial survey conducted among system integrators to understand the role of component documentation in the CBS integration phase. The survey investigates whether the presence of component documentation helps a system integrator, and its correlations with typical CBS integration success factors. The results reinforce current perceptions of the significance of component documentation in CBS integration. However, the lack of comprehensive component documentation presents a potential risk for a system integrator during integration effort estimation and testing processes.

1. Introduction

The extensive use of software has placed new demands on the software industry to enhance development productivity and reduce associated costs [1]. These expectations have led software engineers to re-establish the idea of reuse and to focus on moving the software industry away from developing each system from scratch [2]. CBS development emerged in the mid-90s with the introduction of software component technologies such as Microsoft COM and CORBA. More recently, the next generation of CBS development tools, such as Microsoft .NET and Enterprise Java Beans, has become available in the market. In CBS development, a component is a fundamental building block for a software application. In this paper, we adopt Szyperski et al.'s [3] component definition: A software component is a unit of

* Corresponding author. E-mail address: [email protected] (S. Mahmood).

0020-0190/$ – see front matter © 2011 Elsevier B.V. All rights reserved.
doi:10.1016/j.ipl.2011.03.012

composition with contractually specified interfaces and explicit context dependencies only. A software component can be deployed independently and is subject to composition by third parties. CBS development is integration centric, with a focus on assembling pre-existing software components, either developed in-house or purchased off-the-shelf, to build a software system. Recent research [4] suggests that a system integrator plays an important role in the success of a CBS by putting together pieces developed by different parties who are usually unaware of each other [5]. Furthermore, individual components are usually designed for general purposes, might not satisfy all customer requirements, and some of their features may be unnecessary in a given system. Thus, the role of a system integrator becomes even more important, because it is rarely the case that components match perfectly, and system integration involves more than simply finding components which together perform the desired tasks and connecting their interfaces [2].

S. Mahmood, A. Khan / Information Processing Letters 111 (2011) 583–590

Component documentation plays a vital role in the success of a CBS, as it is the main source of information used to balance the conflicting interests between what is needed and what is available [6]. Cechich and Piattini [5] highlight that standards on component documentation need to be reinforced, as the information available at component repositories is usually unstructured and presented in the form of marketing brochures and natural language descriptions. We observe that Available Component Documentation (ACD)1 usually consists of a list of features, reviewer comments, price and, in some cases, trial versions. In addition to information about component interfaces, the integration process also needs information about component usage history, version control, test data and relevant quality attributes [7]. However, detailed component documentation is unavailable in the majority of component repositories. This lack of detailed component documentation introduces new challenges for system integrators, as it increases ambiguity in the integration phase of a CBS. To the best of our knowledge, none of the existing work empirically investigates the impact of incomplete documentation on the integration phase of a CBS development life cycle. In this paper, we present an evaluation of the impact of available component documentation, from a system integrator's perspective, on the overall success of a CBS integration process. The motivation of our work is to better understand how available component documentation helps or hinders CBS practitioners in the integration process of a CBS. We analyze the relationship of available component documentation with five key integration success factors, namely, Integration Effort Estimation (IEE), Early Component Evaluation (ECE), New Features and Modification Analysis (NFMA), Integration Testing (IT) and Developer–Vendor Relationship (DVR).
We focus only on these five integration success factors because they correspond to key information required by system integrators during the integration process of a CBS. The contribution of our work is an empirical study showing the impact of currently available component documentation during the integration phase of a CBS. We also analyze the relationship between available component documentation and the integration success factors to better understand current industrial practices regarding the use of component documentation during the integration process of a CBS. The results indicate that system integrators in industry find available component documentation useful for early evaluation of candidate components and for learning new features of selected components. However, available component documentation is considered insufficient for integration effort estimation and for performing integration and system testing of a CBS. Furthermore, our work reinforces current perceptions about the significance of component documentation and the need for detailed component documentation standards for the use of commercial off-the-shelf components.
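To make the documentation gap described above concrete, the following sketch shows the kind of structured record that detailed component documentation might provide, covering the interface, usage-history, versioning, test and quality information the integration process needs. All field names are illustrative assumptions, not part of any existing standard.

```python
from dataclasses import dataclass

@dataclass
class ComponentDocumentation:
    # Hypothetical schema; field names are illustrative, not a real standard.
    name: str
    version: str
    features: list              # advertised feature list
    interface_signatures: list  # e.g. "pay(amount: float) -> str"
    usage_history: list         # projects the component has shipped in
    version_changes: dict       # version -> summary of changes
    test_reports: list          # references to test data and results
    quality_attributes: dict    # e.g. {"latency": "< 10 ms"}
    price: float = 0.0
    trial_available: bool = False
```

In practice, current repositories typically populate only the first few fields (features, price, trial availability); the fields below them are what the survey participants report as missing.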

1 We performed an indicative study of popular component repositories, namely, component source – http://www.componentsource.com, component one – http://www.componentone.com and eclipse market place – http://www.eclipseplugincentral.net.

Fig. 1. CBS development life cycle.

2. Related work

2.1. Component-based systems overview

Fig. 1 shows an overview of the CBS development life cycle. The CBS development approach can be divided into three main phases, namely, the selection, integration and maintenance phases. The first phase, component selection, starts with a process to identify suitable components with the potential to match stakeholder demands. After this process, candidate components are analyzed to compare and select suitable components based on evaluation criteria. The second phase, component integration, focuses on adapting and assembling the selected components through an architectural infrastructure. The third phase, component maintenance, handles the continuous evolution of a CBS during its life cycle.

2.2. Component integration

Component integration is the process of assembling components together. The selected components are integrated through a well-defined infrastructure, which provides the binding that forms a system from the disparate set of selected components [2,5]. An important issue when integrating components is dealing with the potential mismatches between them.
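A minimal sketch of such a mismatch and the glue code that resolves it (the component, method names and units are hypothetical, chosen only to illustrate adaptation):

```python
# Hypothetical third-party component whose interface does not match
# what the rest of the system expects.
class VendorPaymentComponent:
    def make_payment(self, amount_cents: int) -> str:
        return f"paid {amount_cents} cents"

# Glue-code adapter: the system-side interface expects a different
# method name and amounts in dollars rather than cents.
class PaymentAdapter:
    def __init__(self, component: VendorPaymentComponent):
        self._component = component

    def pay(self, amount_dollars: float) -> str:
        # Adapt the unit mismatch and delegate to the wrapped component.
        return self._component.make_payment(int(round(amount_dollars * 100)))
```

Writing and testing adapters of this kind is exactly the activity for which integrators need interface documentation (signatures, units, error behavior); without it, the mismatch only surfaces during integration testing.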

Dietrich et al. [8] have used active rules to design and generate wrappers that adapt components. The wrappers are automatically generated as Enterprise Java Bean components and act as proxy objects. These proxy objects intercept method calls and provide the functionality required by the overall component based system. Canal et al. [9] have presented a model based approach for component adaptation using a notation based on synchronous vectors and transition systems for guiding adaptation rules. Kim et al. [10] have proposed a generic process for CBS development where integration takes place at the release phase. Similarly, Chi [11] defines the signature view and behavior view of software components and uses a modeling method to express component behavior as pi calculus expressions. He also provides a tool to support the composition of components in a CBS. These techniques help reduce CBS development process risks by considering component specifications and their interface properties, and increase flexibility by providing quantitative analysis of candidate components. For a detailed discussion of the CBS development life cycle and integration techniques, please refer to our previous work [1].

2.3. Integration risk analysis and empirical evaluation

CBS development is a complex and risk-prone process which needs careful risk assessment on behalf of a system integrator to help achieve the potential benefits of reduced time to market, increased productivity and the development of a quality system [2]. Kotonya and Rashid [12] identify the lack of source code, unknown design information, and disparity in component evolution cycles as the key risks for the integration phase of a CBS. An integration fault can be the result of an incorrect understanding of a component, or it may lie in one of the externally acquired components [13]. Rashid and Kotonya [7] highlight the importance of a good understanding of a component for integration and deployment.
They argue that, for successful integration, a component should have adequate documentation, usage history, version details and test reports. Similarly, Taulavuori et al. [14] have presented a standard documentation pattern for software components. The documentation pattern recommends that component developers describe the component's general properties; a detailed description of the component's design and implementation; testing and quality information; and support information on the use and maintenance of the component. However, the proposed documentation pattern has not become a standard due to a lack of commitment from the various component providers. In this paper, we empirically investigate the impact of the lack of detailed component documentation on the integration phase of a CBS. Li et al. [15] presented an empirical study evaluating the variations in CBS development processes. Their results indicated that CBS development has two key activities: 'the build vs the buy decision' and 'component selection'. Li et al. [13] performed an industrial survey to analyze common CBS problems, the most common risk reduction activities performed in industry and how successful they were in avoiding these risks. They identify integration effort estimation and costly fault identification as two key challenges during CBS

Fig. 2. Research model.

development. Recently, Li et al. [4], Land et al. [16] and Ayala et al. [17] have observed a number of differences between academic theories and industrial practices in different areas of the CBS development life cycle, ranging from the adaptation of traditional development processes to a lack of use of formal component selection methods. To date, empirical research on CBS has focused on identifying risks associated with the component identification and selection processes. However, to the best of our knowledge, none of the existing work investigates the impact of available component documentation on the integration phase of the CBS development life cycle from the system integrator's perspective.

3. Research questions and hypotheses

The objective of our study is to assess whether available component documentation is sufficient for the integration process of a CBS development life cycle. We are also interested in investigating how available component documentation helps system integrators in component integration for a CBS. In order to perform such analysis, we also need to understand the relationship between key integration success factors [2,18,13,4] and available component documentation. After an indicative literature survey of CBS integration techniques, we identify integration effort estimation, early component evaluation, learning new and modified component features, integration testing and the developer–vendor relationship as integration success factors. The research model for the study is shown in Fig. 2. The hypotheses for assessing the effect of available component documentation on the key integration success factors are as follows:

HN1: The 'available component documentation' does not help in 'integration effort estimation'.
HA1: The 'available component documentation' helps in 'integration effort estimation'.
HN2: The 'available component documentation' does not help in 'early component evaluation'.

HA2: The 'available component documentation' helps in 'early component evaluation'.
HN3: The 'available component documentation' does not help in 'new feature and modification analysis'.
HA3: The 'available component documentation' helps in 'new feature and modification analysis'.
HN4: The 'available component documentation' does not help in 'integration testing'.
HA4: The 'available component documentation' helps in 'integration testing'.
HN5: The 'available component documentation' does not help in maintaining 'developer–vendor relationship'.
HA5: The 'available component documentation' helps in maintaining 'developer–vendor relationship'.

Table 1
Descriptive data statistics.

Variable   Min   Mean   Max
IEE        2     2.8    5
ECE        2     2.3    5
NFMA       2     2.4    5
IT         1     3.8    5
DVR        2     3.9    5

Table 2
Hypothesis testing using Pearson correlation coefficient.

     V1    V2     PC      p-value
H1   ACD   IEE    −0.35   0.06
H2   ACD   ECE     0.45   0.01
H3   ACD   NFMA    0.41   0.02
H4   ACD   IT     −0.20   0.27
H5   ACD   DVR     0.21   0.29

4. Research model

4.1. Data collection

Software developers with more than three years of experience of using software components (both in-house and off-the-shelf) were the target participants for our study. The survey was performed using a variant of the snowball sampling technique [19], where key practitioners in organizations serve as contact points for the study. The contact points were emailed the link for the web-based survey, which they could forward on to other potential respondents within their organizations. The contact points also reported the total number of respondents from each organization and functioned as a temporary checkpoint for the number of completed questionnaires. The participants belong to small to medium-sized companies from Australia, Pakistan, Saudi Arabia, the United Arab Emirates and the United Kingdom. These companies provide a wide range of services, such as software consultancy and off-shore software development. All the participants have either an undergraduate or a graduate degree in computer science or a related field. Furthermore, the participants' roles in their organizations ranged from software developer to software architect, with 5–7 years of experience in CBS. In total, 110 participants were contacted and 53 completed the survey. We collected data on the five key integration success factors identified in the research model shown in Fig. 2. The study questionnaire was used to assess the extent to which available component documentation helps the study participants during CBS development. In the survey, we defined the 'available component documentation' as (1) a list of features, (2) reviewer rating, (3) price, and (4) availability of trial versions. The survey consists of three parts and fourteen questions. The first part collected data about the background of the participant, organization and experience

with CBS development. In the second part, the participants were asked to judge the usefulness of 'available component documentation' using a five point scale that ranged from 'strongly agree' to 'strongly disagree' for each integration success factor. The third part provided participants an opportunity to share their experience of CBS development, and to discuss the strengths and challenges of component integration during a CBS development life cycle.

4.2. Data analysis

A number of statistical analysis techniques were used to analyze the data collected during the study and validate each of the hypotheses defined in Section 3. Table 1 shows the descriptive data statistics of the survey. We conducted data analysis for the hypotheses using the Pearson Correlation (PC) coefficient and the Spearman Rank Order Correlation (SC) coefficient. PC assumes normality in the variables and is sensitive to outliers. SC, on the other hand, is less sensitive to bias due to outliers and does not require a normal distribution of the data. In this paper, we used both PC and SC to ensure the reliability of the results. We also used the Partial Least Squares (PLS) technique to predict a set of dependent variables from a set of independent variables; other studies [20,21] have shown that PLS helps avoid the issue of small to medium data sets and provides more reliable results. Hence, we believe that using these different data analysis techniques provides an extra level of confidence in interpreting the results. The statistical calculations were performed using the Minitab 14 software.

5. Results analysis

5.1. Hypothesis testing phase-I

In this section, we apply the Pearson correlation coefficient [22] to the variables of the research model shown in Fig. 2. Table 2 shows the results of the Pearson correlation coefficient analysis. The Pearson correlation p-value between 'available component documentation' and 'integration effort estimation' was 0.06 at P < 0.05.
Thus, we accept null hypothesis HN1 and conclude that there is no statistically significant relationship between the availability of component documentation and integration effort estimation of a CBS. The Pearson correlation p-value between 'available component documentation' and 'early component evaluation' indicates that the correlation is statistically significant (p-value = 0.01). We reject null hypothesis HN2 and conclude that component documentation helps in the early evaluation of components.
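The two correlation measures used in phases I and II can be illustrated with a short, self-contained Python sketch. Spearman's coefficient is simply Pearson's coefficient computed on the ranks of the data, which is why it tolerates outliers and non-normal distributions; the inputs below are synthetic stand-ins, not the study's survey responses.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(v):
    """1-based ranks of v, averaging the ranks of tied values."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    out = [0.0] * len(v)
    i = 0
    while i < len(v):
        j = i
        while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
            j += 1                      # extend over a run of ties
        avg = (i + j) / 2 + 1           # average of positions i..j, 1-based
        for k in range(i, j + 1):
            out[order[k]] = avg
        i = j + 1
    return out

def spearman(x, y):
    """Spearman rank-order correlation: Pearson applied to the ranks."""
    return pearson(ranks(x), ranks(y))
```

Average-rank handling of ties matters here, because five-point survey responses produce many tied values.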

The p-value of 0.02 at P < 0.05 observed between 'available component documentation' and 'new feature and modification analysis' means that null hypothesis HN3 was rejected and alternate hypothesis HA3 was accepted. Null hypothesis HN4 was accepted based on the Pearson coefficient p-value of 0.27 between 'available component documentation' and 'integration testing'. The Pearson coefficient p-value between 'available component documentation' and 'developer–vendor relationship' was 0.29, thus providing justification to accept null hypothesis HN5 and reject alternate hypothesis HA5. In summary, we observed that the null hypotheses for H1, H4 and H5 are accepted and the alternate hypotheses for H2 and H3 are found to be significant.

5.2. Hypothesis testing phase-II

To assess the usefulness of component documentation and its impact on the CBS integration factors, we also conducted non-parametric statistical analysis using the Spearman correlation coefficient [22]. The Spearman correlation coefficient data are shown in Table 3. The Spearman correlation 1-tailed value between 'available component documentation' and 'integration effort estimation' is 0.05 at P < 0.05. Thus, we accept null hypothesis HN1 and reject alternate hypothesis HA1. The alternate hypothesis HA2 is accepted based on the Spearman correlation 1-tailed value of 0.01 at P < 0.05 between 'available component documentation' and 'early component evaluation', indicating that component documentation helps in the early evaluation of components. A 1-tailed value of 0.01 at P < 0.05 was observed between 'available component documentation' and 'new feature and modification analysis', which means that null hypothesis HN3 was rejected and alternate hypothesis HA3 was accepted. Thus, we conclude that component documentation has an association with learning new features and evolving components in a CBS. Similarly, null hypothesis HN4 was accepted based on the Spearman coefficient 1-tailed value of 0.28 between 'available component documentation' and 'integration testing'. The Spearman coefficient 1-tailed value between 'available component documentation' and 'developer–vendor relationship' was 0.14, thus providing justification to accept null hypothesis HN5 and reject alternate hypothesis HA5. In summary, the null hypotheses for H1, H4 and H5 are accepted (and their alternates rejected), whereas the alternate hypotheses for H2 and H3 are found significant and accepted (and their nulls rejected).

Table 3
Hypothesis testing using Spearman rank order correlation coefficient.

     V1    V2     Rs      1-Tailed   2-Tailed
H1   ACD   IEE    −0.33   0.05       0.069
H2   ACD   ECE     0.41   0.01       0.019
H3   ACD   NFMA    0.42   0.01       0.016
H4   ACD   IT     −0.11   0.28       0.553
H5   ACD   DVR     0.21   0.14       0.277

Table 4
Hypotheses testing using partial least squares regression.

     Variable 1   Variable 2   F      P      RSq
H1   ACD          IEE          4.16   0.06   0.123
H2   ACD          ECE          7.57   0.01   0.202
H3   ACD          NFMA         6.00   0.02   0.167
H4   ACD          IT           1.27   0.27   0.041
H5   ACD          DVR          1.18   0.27   0.042

Table 5
Comparison of results from the different techniques.

     PC        SC                  PLS
     p-value   1-T      2-T        F      P      RSq
H1   0.06      0.05     0.069      4.16   0.06   0.123
H2   0.01      0.01     0.019      7.57   0.01   0.202
H3   0.02      0.01     0.016      6.00   0.02   0.167
H4   0.27      0.28     0.553      1.27   0.27   0.041
H5   0.29      0.14     0.277      1.18   0.27   0.042
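As an illustration of the regression-style analysis behind the F and RSq values reported in Table 4, the sketch below computes the coefficient of determination and the corresponding F-ratio for the simplified one-predictor case (with a single predictor and one latent component, PLS reduces to simple linear regression). The data are synthetic; this does not reproduce the survey's actual computations.

```python
def r_squared(x, y):
    """Coefficient of determination for a simple (one-predictor) linear fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Residual and total sums of squares.
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot

def f_ratio(r2, n):
    """F statistic of a one-predictor regression with n observations
    (1 and n-2 degrees of freedom)."""
    return (r2 / (1 - r2)) * (n - 2)
```

A large F with a small P indicates that the predictor explains a non-trivial share of the variance, which is how the significant ECE and NFMA rows in Table 4 should be read.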

5.3. Hypothesis testing phase-III

In phase-III of hypothesis testing, we used the partial least squares (PLS) regression [22] technique to analyze the results observed in phase-I and phase-II. Table 4 shows the results of the structural tests of the hypotheses, with the values of the F-ratio, P value and R-square. Table 5 shows the cross validation of the results. 'Available component documentation' is used as a response variable and the rest of the factors are used as predictors in the PLS testing of H1–H5. ACD has a statistically significant positive relationship with ECE, based on the P value of 0.01 with F-ratio 7.57 and R-sq 0.202. A P value of 0.02 with F-ratio 6.00 and R-sq 0.167 was observed for NFMA, which indicates a direct, statistically significant, positive relationship with ACD. P values of 0.06, 0.27 and 0.27 were observed for IEE, IT and DVR respectively. This implies that IEE, IT and DVR are also relevant to ACD, but ACD does not provide detailed enough information to facilitate integration effort estimation, help in integration testing or maintain the desired developer–vendor relationship.

5.4. Qualitative analysis

In this section, we summarize the strengths and challenges shared by the participants of the survey. The experience of participants was collected through three open-ended discussion questions, namely, the motivation for using components, the usefulness of ACD and the challenges faced during the integration phase of a CBS. The participants indicated two main motivations for using software components: (i) the development team does not have the skills to implement certain required features and prefers to use a component which comes with support; and (ii) components that have been successfully used in previous projects help save time and development effort.
The feedback from the participants also indicated the usefulness of available component documentation for early evaluation and understanding the changes made in components from one version to another.

The system integrators participating in the study indicate that available component documentation does not provide enough technical details about components, and integration effort estimates are made based on individual experience and trials of the candidate components. The feedback from the subjects also indicated the need for comprehensive documentation, as it would help overcome glue-code and testing challenges later in the CBS development life cycle. The lack of detailed design diagrams introduces challenges in adapting a component that does not completely satisfy stakeholder needs. It also makes it difficult to analyze the architectural and design dependencies introduced by individual components. Furthermore, debugging a CBS becomes difficult due to the lack of access to component code and inadequate interface descriptions. We believe these concerns increase the need for standards and guidelines for documenting software components.

6. Discussion

From the industrial survey results, we make the following observations.

• Integration effort estimation: Correct integration effort estimation is crucial for delivering a CBS on time and within the allocated budget. The result analysis shows that there is no correlation between available component documentation and integration effort estimation of a CBS. Further, we believe that component documentation also needs to include data about component evolution and details of the changes.

• Early component evaluation: The success of a CBS [23,24] depends on the ability to select suitable components. Inappropriate component selection can lead to adverse effects, such as extra cost, in the integration and maintenance phases [23]. It is evident from our study that available component documentation helps in the early evaluation of candidate components. The information about component features, version history and price helps a system integrator analyze candidate components against the system requirements and architectural constraints of the CBS-to-be.

• Learning new features: As shown in our study, there is a correlation between available component documentation and learning about new and updated features in new versions of components. This is due to the fact that component vendors are usually good at advertising new features and modifications in the new versions of their respective components. This helps system integrators assess the changes needed at the architectural and integration code levels and take the necessary steps to overcome compatibility risks.

• Integration testing: Our study shows that there is no correlation between available component documentation and integration testing, which indicates that null hypothesis HN4 is accepted. A component usually goes through traditional software testing at the developer's site. However, the details of these individual component tests are rarely made available to the system integrator. Furthermore, the heterogeneous nature of components and deployment architectures introduces complexities in the integration phase of a CBS. Thus, there is a need to provide individual component testing details to assist the integration testing process of a CBS.

• Developer–vendor relationship: The study indicates that there is no correlation between available component documentation and maintaining developer–vendor relationships. This indicates that some of the key factors for system integrators, such as details of the technical support provided by vendors, customer reviews, component volatility and personal contact details, are usually lacking in the majority of current component documentation.

• Suggestions: Fundamental to a component is its interface, which defines the services provided by the component and acts as a basis for its use and implementation. It is one of the primary sources for understanding a component and often the only source available. The system integrator should pay special attention to acquiring detailed interface information, including method signatures, to facilitate component adaptation and integration during development of a CBS. Components should be selected from certified repositories. Furthermore, system integrators should pay special attention to establishing a good developer–vendor relationship. This might help in gaining access to individual component test cases, which can also be used as functional specifications to help a system integrator understand the component features better than natural language descriptions do.

• Usefulness of results: The results provide empirical evidence, based on industrial feedback, that reinforces the general perceptions about the importance of individual component documentation. They also highlight that the software industry has not yet adopted the proposed component documentation standards (for example, [14]) and that practitioners have to rely on their experience to evaluate and integrate components during a CBS development life cycle.
Furthermore, the results provide an important insight into component documentation based risk factors. In the future, we plan to use these results to develop a maturity model for integration process assessment of a CBS.

7. Threats to validity

7.1. Construct validity

We believe that our study has no serious construct validity threats because the research questions are based on the existing literature [12,7,1,13,4]. To ensure construct validity, the survey questionnaire was pre-tested internally with five colleagues to ensure that all questions were meaningful and that their respective answers would help us in the result analysis. All terminologies used in the questionnaire are explained at the start of the survey to provide clear definitions and avoid misinterpretation. Furthermore, time pressure is another threat to validity. We believe that the time allocated for the survey was sufficient, as the subjects answered all the questions and no one complained about a lack of time in our communications.

Thus, we believe that time is not a confounding factor in our study.

7.2. Internal validity

One possible internal validity threat for survey-based studies is a lack of experience and motivation to answer the questions. There is no serious threat in terms of internal validity, because all the subjects voluntarily participated in the study; had either an undergraduate or a graduate degree in computer science or a related field; and had 5–7 years of industrial experience of integrating components. Another possible threat to internal validity is possible ambiguity in the questions. To avoid misunderstandings, we were available via email during the study to clarify any ambiguities in the questions.

7.3. Conclusion validity

In this study, we used standard statistical techniques to either accept or reject the null hypotheses. We report the detailed output of the statistical techniques in Tables 2, 3, 4 and 5. We used both parametric and non-parametric statistical techniques to validate our results. The Partial Least Squares technique was used to analyze the results and avoid the issue of small to medium data set size. Furthermore, the survey was designed using standard scales that allow us to analyze the feedback provided by subjects with 5 to 7 years of experience in software development.

7.4. External validity

The inherent limitation of survey-based studies lies in their external validity, due to the difficulty of achieving a truly random sample of participants. We addressed the external validity threats in the study by using a variant of the snowball sampling technique [19], where key practitioners serve as contact points in the organizations involved. The contact points are sent the questionnaire and forward it on to other potential respondents. Another possible external threat is the location and size of the respondents' companies, which makes it difficult to generalize these results to all domains.
In our study, the participants belong to Asian, European and Australian small to medium-sized companies that work in different areas of the IT industry. Furthermore, we ensured that all the potential participants had relevant experience in the development of a CBS. We believe that the results of the study can be generalized to small to medium-sized companies where software developers are involved in component integration.

8. Conclusions and future work

This paper presents the results of an industrial survey. The results indicate that available component documentation does help in integrating selected components. The participants of the survey found that the ACD was useful in early component evaluation and in discovering new features. However, it is important to note that, on average, available component documentation does not provide enough information to overcome the two most common CBS integration challenges: incorrect integration effort estimation and integration testing. For future work, there is a need to conduct further studies to understand the impact of other integration factors, such as the importance of component design and quality attribute information, during the integration process of a CBS. We plan to develop an integration process maturity model for integration assessment of a CBS, with the aim of providing a set of guidelines to avoid common CBS integration risk factors.

Acknowledgements

The authors would like to thank King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia for its continuous support of research. We thank the anonymous reviewers for their insightful suggestions, which significantly contributed to improving the quality of the paper.

References

[1] S. Mahmood, R. Lai, Y.S. Kim, Survey of component based software development, IET Software 1 (2007) 57–66.
[2] I. Crnkovic, M. Larsson, Challenges of component based development, Journal of Systems and Software 61 (2002) 201–212.
[3] C. Szyperski, D. Gruntz, S. Murer, Component Software – Beyond Object-Oriented Programming, second edition, Addison–Wesley, 2002.
[4] J. Li, R. Conradi, O.P.N. Slyngstad, C. Bunse, M. Torchiano, M. Morisio, Development with off-the-shelf components: 10 facts, IEEE Software 26 (2009) 80–87.
[5] A. Cechich, M. Piattini, Early detection of COTS component functional suitability, Information and Software Technology 49 (2007) 108–121.
[6] C. Alves, COTS-based requirements engineering, in: Component Based Software Quality, in: LNCS, vol. 2693, 2003, pp. 21–39.
[7] A. Rashid, G. Kotonya, Risk management in component-based development: By separation of concerns, in: ECOOP Workshop on Advanced Separation of Concerns.
[8] S.W. Dietrich, R. Patil, A. Sundermier, S.D. Urban, Component adaptation for event-based application integration using active rules, Journal of Systems and Software 79 (2006) 1725–1734.
[9] C. Canal, P. Poizat, G. Salaun, Model-based adaptation of behavioral mismatching components, IEEE Transactions on Software Engineering 34 (2008) 546–563.
[10] S. Kim, S. Park, J. Yun, Y. Lee, Automated continuous integration of component-based software: An industrial experience, in: Proceedings of the 23rd IEEE/ACM International Conference on Automated Software Engineering, 2008, pp. 423–426.
[11] Z. Chi, Components composition compatibility checking based on behaviour description and roles division, in: IEEE International Conference on Granular Computing, 2009, pp. 757–760.
[12] G. Kotonya, A. Rashid, A strategy for managing risk in component-based software development, in: 27th Euromicro Conference: A Net Odyssey, 2001, pp. 12–21.
[13] J. Li, R. Conradi, O.P. Slyngstad, M. Torchiano, M. Morisio, C. Bunse, A state-of-the-practice survey of risk management in development with off-the-shelf software components, IEEE Transactions on Software Engineering 34 (2008) 271–286.
[14] A. Taulavuori, E. Niemela, P. Kallio, Component documentation – a key issue in software product lines, Information and Software Technology 46 (2004) 535–546.
[15] J. Li, F.O. Bjornson, R. Conradi, V.B. Kampenes, An empirical study of variations in COTS-based software development processes in the Norwegian IT industry, Empirical Software Engineering 11 (2006) 433–461.
[16] R. Land, D. Sundmark, F. Luders, I. Krasteva, A. Causevic, Reuse with software components – a survey of industrial state of practice, in: LNCS, vol. 5791, 2009, pp. 150–159.


[17] C. Ayala, O. Hauge, R. Conradi, X. Franch, J. Li, Selection of third party software in off-the-shelf software development – an interview study with industrial practitioners, Journal of Systems and Software 84 (2011) 620–637.
[18] P. Vitharana, Risks and challenges of component-based software development, Communications of the ACM 46 (2003) 67–72.
[19] S.L. Pfleeger, B.A. Kitchenham, Principles of survey research: Parts 1–6, in: ACM SIGSOFT Software Engineering Notes, 2001.
[20] E.T. Wang, S.-P. Shih, J.J. Jiang, G. Klein, The relative influence of management control and user-IS personnel interaction on project performance, Information and Software Technology 48 (2006) 214–220.
[21] F. Ahmed, L.F. Capretz, The software product line architecture: An empirical investigation of key process activities, Information and Software Technology 50 (2008) 1098–1113.
[22] W. Mendenhall, T. Sincich, Statistics for Engineering and the Sciences, Pearson Education, 2007.
[23] K.R. Leung, H.K. Leung, On the efficiency of domain-based COTS product selection method, Information and Software Technology 44 (2002) 703–715.
[24] C. Alves, A. Finkelstein, Investigating conflicts in COTS decision-making, International Journal of Software Engineering and Knowledge Engineering 13 (2003) 1–21.