The Journal of Systems and Software 82 (2009) 563–570
Engaging the net generation with evidence-based software engineering through a community-driven web database

David S. Janzen a,1, Jungwoo Ryoo b,*
a California Polytechnic State University, San Luis Obispo, CA, USA
b The Pennsylvania State University-Altoona, Altoona, PA, USA
Article info

Article history: Received 2 August 2008; Received in revised form 2 December 2008; Accepted 27 December 2008; Available online 16 January 2009.

Keywords: Evidence-based software engineering; Empirical software engineering; Software engineering education
Abstract

Software engineering faculty face the challenge of educating future researchers and industry practitioners regarding the generation of empirical software engineering studies and their use in evidence-based software engineering. In order to engage the Net generation with this topic, we propose the development and population of a community-driven Web database containing summaries of empirical software engineering studies. We also present our experience with integrating these activities into a graduate software engineering course. These efforts resulted in the creation of "SEEDS: Software Engineering Evidence Database System". Graduate students initially populated SEEDS with 216 summaries of empirical software engineering studies. The summaries were randomly sampled and reviewed by industry professionals, who found the student-written summaries to be at least as useful as professional-written summaries. In fact, 30% more of the respondents found the student-written summaries to be "very useful". Motivations, student- and instructor-developed prototypes, and assessments of the resulting artifacts are discussed.

© 2009 Elsevier Inc. All rights reserved.
1. Introduction

Software-oriented practitioners, educators, and researchers have complementary but distinct goals that might be summarized as follows:

Practitioners' Goal: Apply the most efficient (fastest or least costly) method/tool to produce, maintain, and evolve software that satisfies requirements with the fewest defects, the least development cost, the highest performance, and the best maintainability/reusability.

Educators' Goal: Apply the most effective method/tool to convert novice freshmen into industry-ready professionals who can achieve the Practitioners' Goal.

Researchers' Goal: Discover/innovate methods and tools for meeting both the Practitioners' and Educators' Goals, and demonstrate their efficacy. In other words, "Prove it!"

Evidence-based software engineering (EBSE) has much to contribute toward the goals of all three constituents. We present an experience report from a graduate software engineering course that incorporated EBSE topics and produced a community-driven Web database of study summaries that may benefit students, educators, and practitioners. We begin with a brief introduction to EBSE, followed by a description of the proposed EBSE database, which we have named SEEDS. The database is used as a central repository where community members share findings on specific software engineering approaches of their choice according to EBSE principles. Next we present the pedagogical approach taken in this course. We then describe the resulting artifacts and corresponding evaluation activities.

* Corresponding author. E-mail addresses: [email protected] (D.S. Janzen), [email protected] (J. Ryoo). 1 Principal corresponding author.
doi:10.1016/j.jss.2008.12.047. © 2009 Elsevier Inc. All rights reserved.
2. Introduction to ESE and EBSE

Software developers are known for adopting new technologies and practices based solely on their novelty, promise, or anecdotal evidence. Empirical software engineering (ESE), on the other hand, endeavors to produce a body of documented experiences that might inform software practice adoption decisions. The goal of evidence-based software engineering (EBSE) is "to provide the means by which current best evidence from research can be integrated with practical experience and human values in the decision making process regarding the development and maintenance of software" (Kitchenham et al., 2004). This "best evidence from research" results from experimentation. Experimentation in software engineering involves the empirical study of human activities (Basili and Zelkowitz, 2007) to aid decisions on what are the best practices, processes, methods, or tools for developing software, and when (in what context) they should be applied.
Experimental validation in software engineering ranges from informal assertions ("I used X and it worked great!") to formal theoretical analyses. Commonly recognized experimentation involves observational studies (e.g. case studies) and controlled experiments, both of which can be conducted in the field with professional software developers or in labs, typically with students (Zelkowitz et al., 2003). Although studies with students typically suffer from several threats to validity, Carver et al. (2003) note a number of advantages of student-based studies.
2.1. Growing interest in ESE

Interest in empirical software engineering appears to be increasing, beginning perhaps in the mid-1990s (Sjoberg et al., 2007). In 1994, Gibbs wrote that "after 25 years of disappointment with apparent innovations that turned out to be irreproducible or unscalable, many researchers concede that computer science needs an experimental branch to separate the general results from the accidental" (Wayt Gibbs, 1994). Zelkowitz and Wallace observed that almost half of the papers in IEEE Software, IEEE Transactions on Software Engineering, and the ICSE conferences (from 1985, 1990, and 1995) included only assertions or no evaluation at all (Zelkowitz and Wallace, 1997). More recently, however, many conferences and journals such as IEEE Transactions on Software Engineering specifically request empirical studies and experimental validation of new ideas. Springer publishes a dedicated journal entitled Empirical Software Engineering: An International Journal. Reporting guidelines have been proposed and compared (Kitchenham et al., 2006; Jedlitschka and Pfahl, 2005). Empirical software engineering projects have received significant government and corporate funding, and research centers have been founded, such as the NSF Center for Empirically-Based Software Engineering (www.cebase.org), which was active from 2000 to 2004.
2.2. ESE and EBSE challenges and opportunities

Despite the increased interest, the growth of empirical software engineering research is somewhat slow (Glass et al., 2004) and difficult. Many factors contribute to the challenges of empirical software engineering.

First, we suggest that various barriers limit opportunities to conduct such studies. While laboratory experiments are often conducted in academic environments, there are many inherent threats to validity: students are rarely as mature as professional software developers, application domains are often contrived, and software projects are rarely as large and complex as "real-world" projects. Unfortunately, companies and organizations are often reluctant to participate in field experiments. This seems to be particularly true in the United States. Many may be unwilling to try new, perhaps unproven approaches. Others may be concerned that they might reveal poor metrics or performance, may be unwilling to allow researchers in for fear of losing proprietary information, or may simply fear that the researchers will slow down the team.

Second, we believe that professional software practitioners struggle to acquire, analyze, and apply empirical software engineering results. Empirical studies are commonly reported in academic journals and conference proceedings; many practitioners rarely read these or even have access to them. Furthermore, empirical studies can be difficult to find among the mix of other papers. Even when they are found, most practitioners lack any formal education on how to analyze the studies.

We suggest that education on ESE and EBSE, along with easy access to empirical studies, may address some of the challenges of
EBSE. In the next sections we propose a system and a pedagogical approach that take advantage of these opportunities.
3. SEEDS: software engineering evidence database system

We propose a Web-based, community-driven database for collecting, surveying, analyzing, and disseminating empirical results for use in EBSE. This database would serve as a comprehensive, constantly evolving focal point for summarizing and analyzing ESE results. This section presents a vision for such a system. An early prototype implementing some of the features described here has been created. This prototype is tentatively named "SEEDS: Software Engineering Evidence Database System" and is available at http://www.evidencebasedse.com.

SEEDS would be designed and constructed as a kind of structured wiki, organized by topic. Each topic would include "how-to" references that summarize the purpose of and mechanism for applying the practice/tool/method, along with a listing of all related empirical software engineering studies. The study summaries would conform to a common format for reporting design, context, and result summaries. As an added feature, we propose that the system include user-driven comparison grids that compare a set of studies on a related topic using a set of dynamic attributes. Such grids are inspired by those presented in two test-driven development summary articles (Janzen and Saiedian, 2005; Jeffries and Melnik, 2007).

SEEDS would be populated in a decentralized fashion by the community of software engineers. We propose to "seed" SEEDS with study summaries created by graduate software engineering students and reviewed by software engineering faculty. The goal is that, by establishing a "critical mass" of content, the software engineering community will embrace and grow SEEDS. Researchers would be able to add studies with links to official publications. Community members (e.g. industry professionals, graduate students, researchers) would be encouraged to rate studies in terms of quality, validity, and importance of results.
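To make the rating idea concrete, the multi-dimensional study ratings and the "bubble up" ranking described here could be aggregated along the following lines. This is an illustrative sketch only: the class names, field names, and the 1–5 scale are our own assumptions, not part of any actual SEEDS implementation.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class StudyRating:
    """One community member's rating of a study, 1-5 per dimension (assumed scale)."""
    quality: int
    validity: int
    importance: int

@dataclass
class Study:
    title: str
    ratings: list = field(default_factory=list)

    def overall_score(self) -> float:
        """Average all dimensions across all ratings; 0.0 if unrated."""
        if not self.ratings:
            return 0.0
        return mean(mean([r.quality, r.validity, r.importance]) for r in self.ratings)

def rank_studies(studies):
    """The 'bubble up' behavior is then simply a sort by aggregate score."""
    return sorted(studies, key=lambda s: s.overall_score(), reverse=True)
```

In a real deployment the aggregation would likely also weight rater reputation or recency, but a plain mean suffices to show how the most highly rated studies surface first.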
The rating system would be designed so that the most reputable and pertinent studies "bubble up" and become the most widely read by industry practitioners.

Community members would also be able to write summaries of the empirical study reports. The summaries would accomplish the goals of not only condensing the information to the most salient points, but also providing a critical analysis. Multiple summaries would be allowed for each study report. These summaries could also be rated, with the most useful/accurate summaries likewise "bubbling up" to promote improved content quality. We imagine summary comments addressing issues such as threats to validity, or offering praise for particularly well executed studies. As a community-driven application, we envision the software engineering community monitoring and improving SEEDS in much the same fashion as Wikipedia (http://www.wikipedia.org). An email link will be provided for reporting inappropriate content.

The database would provide a single point of reference for researchers and practitioners. We believe SEEDS will be viewed as a "one-stop shop" that should enable industry professionals to quickly find useful information. Unlike traditional survey publications, the database promises to be constantly updated, rather than providing a snapshot of results at a single point in time. We intend to disseminate SEEDS' existence through publications, conferences, and human networks. Traffic to the database would then be tracked and reported, highlighting trends and interest by the community.

We propose SEEDS as an asynchronous, global portal for increasing EBSE education and activities through simple access to comprehensive, empirical data and community participation in
study analyses. We believe the community-driven nature of SEEDS will be embraced by the young software engineers who have grown up with the Internet, social networking, and Wikipedia. By improving visibility and access, this approach may help satisfy the need for more new and replicated studies (Basili et al., 1999), as well as satisfy an ethical duty of software professionals to assist colleagues and further develop the field.

3.1. Related EBSE databases

The NSF-funded Center for Empirically-Based Software Engineering (CeBASE) provided a forum for researchers and a subset of practitioners participating in empirical studies to share data and results. However, CeBASE does not provide comprehensive summary results for the general software engineering community, and it appears to now be inactive. The Empirical Research Repository (ERR) (Empirical research repository) is hosted by Durham University and is assessed in Section 5. The ERR appears to have goals similar to SEEDS. Differences include the fact that SEEDS is community-driven, whereas ERR appears to have only a select set of contributors. ERR has also established strict EBSE study selection criteria for inclusion in the repository.

4. Pedagogical approach

Given the increased interest in EBSE, software engineering faculty must find effective ways to integrate EBSE topics into their curricula. Software engineering students who intend to conduct research will find it necessary to design and perform experimental validations. Students who proceed primarily to professional practice will need to be able to find, interpret, and analyze EBSE reports in order to make informed adoption decisions.

Education on EBSE is proposed as a possible strategy to improve EBSE awareness, access, and analysis. An increased awareness should reduce entry barriers for conducting field experiments. Improved access should encourage practitioner participation and use of EBSE results. Enhanced analysis should also improve practitioner understanding and application of EBSE results.

4.1. Related courses

In 2003, Jorgensen et al. (2005) took the approach of developing an entire course dedicated to empirical software engineering at Hedmark University College in Rena, Norway. Their course involved teaching modules on empirical software engineering background, theory, argumentation, and prerequisite statistics. Students then completed a course project that involved selecting an empirical software engineering topic, conducting relevant background research, and writing a significant report "that marshals the available evidence to support a conclusion."

Steve Easterbrook notes nine university courses focused on empirical software engineering (Steve Easterbrook), including his own. All but one of these courses appear to be at the graduate level, and nearly all are completely devoted to the topic of empirical software engineering. We have proposed mechanisms for incorporating empirical software engineering information into professional training (Janzen et al., 2007). This approach is, however, slow and localized.
4.2. Learn by doing approach to EBSE

We desired to instill an understanding and appreciation for EBSE topics, while staying within the confines of the two existing graduate software engineering courses at California Polytechnic State University, San Luis Obispo (Cal Poly). Cal Poly is a primarily undergraduate institution located on the central coast of California, about halfway between San Jose and Los Angeles. The computer science department at Cal Poly offers bachelor of science and master of science degrees in computer science, and a bachelor of science degree in software engineering. Cal Poly's motto is "Learn by Doing." In order to apply the "Learn by Doing" approach to EBSE education, the lead author incorporated the following assignments into a fall 2007 graduate software engineering course:

1. Students will work alone or in pairs to write surveys of empirical studies on a particular software engineering topic. The surveys will be added to a common database. Students are expected to contribute a minimum of seventeen study reviews per person, along with one "how-to" summary of the topic being surveyed.
2. Students will participate in a team to develop, document, and present the requirements, architecture, and prototypes for the SEEDS system described earlier.

The selected course is the first of two software engineering courses in a computer science masters program. The first course typically focuses on requirements and architecture topics, while the second typically focuses on design, construction, and maintenance topics. The pedagogical goals of these assignments were as follows:

– Increase student awareness and competence in evaluating empirical software engineering studies.
– Integrate EBSE materials into existing graduate software engineering courses at Cal Poly.
– Engage Net generation students through a collaborative/community-driven Web model.
– Create a repository of useful EBSE information for widespread use.
– Extend the life of student work beyond the limits of a 10-week course.

SEEDS satisfies all these goals as illustrated in Fig. 1. More specifically, (1) SEEDS raises EBSE awareness and competence among students by exposing them to empirical software engineering literature. (2) It also facilitates the integration of EBSE into the existing curriculum by serving as a reusable term project framework (or simply as a demonstration piece) that is self-contained and easily incorporated. (3) In addition to playing the role of a learning tool, SEEDS creates a publicly accessible repository of EBSE information, which can benefit the EBSE community in general. (4) Finally, it especially appeals to the net generation of students through its many Web 2.0-like features.

Fig. 1. Correspondence between course objectives and SEEDS features.

An alternative hands-on approach would have been to have students plan and/or participate in an empirical software engineering study. Such an activity typically occurs in a year-long senior capstone project for the undergraduate software engineering major at Cal Poly. We elected not to duplicate that experience in the graduate course, but opted instead for assignments that require more analytical thinking on the part of the students.

5. Results

Students completed the collection and analysis of their empirical software engineering studies by week eight of the 10-week quarter. Prior to writing their summaries, students participated in seminar-style meetings in which empirical software engineering studies were collectively read, discussed, and analyzed. Because the student SEEDS projects were under development, we implemented a rapid prototype using the Drupal (http://drupal.org) content management system to house the student summaries. The prototype is missing many of the features of the student projects, but it provided a simple mechanism to create EBSE topic areas, and it allowed community members (in this case students) to register and contribute summaries of studies. This prototype is available at http://www.evidencebasedse.com.
Fig. 3. Study view of the SEEDS prototype.
5.1. Instructor artifacts

In this section, we describe in more detail the SEEDS prototype created by the instructor. As shown in Fig. 2, the side menu bar of the prototype lists the topic areas available in the SEEDS repository. The numbers next to each topic indicate the count of posted articles under that subject. Users can click on any topic of interest and are then led to the next page (Fig. 3), where individual studies related to a particular software engineering approach (for example, PSP) are listed with metadata such as author, source, length, date, and rating information. If interested in a particular study after browsing the titles, the user can click on a hyperlink (i.e. "Read More") to find a summary (provided by other users) of the given study (Fig. 4). The user also has an opportunity to vote on the relevance of the summary. Users are notified of any changes to the contents of the SEEDS database through a Really Simple Syndication (RSS) feed.
Fig. 4. Summary view of the SEEDS prototype.
5.2. Student team project artifacts

Students in the course were divided into three teams of four or five students each. Each team independently specified software requirements, proposed software architectures, and built prototypes for SEEDS. We display and describe some of the artifacts from the student projects, including use cases and prototype screen shots. The following set of use cases was identified and categorized by one of the student teams. The labels (US-x) indicate the order in which the use cases were written.
Fig. 2. Initial page of the SEEDS prototype.
Papers
– US-1: Add a new paper to the site
– US-2: Browse and view existing papers
– US-3: Edit a previously submitted paper
– US-4: Flag a paper for removal
– US-18: Rate a paper

Comparing Papers
– US-14: Compare Papers Side-by-Side
– US-15: Compare All Papers under Topic

Summaries
– US-5: Add new Summary
– US-6: Edit Summary
– US-7: View Summary
– US-8: Rate Summary

Users
– US-9: Register User
– US-10: User Login
– US-11: Logout User
– US-12: Edit User Information
– US-13: Request Password

Paper Topics
– US-16: Add New/Edit Topic
– US-17: Select Topic Parent
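The paper and summary entities behind these use cases, together with the rating-driven "bubble up" ordering of summaries (US-8, US-18), could be modeled roughly as follows. The class and field names here are our own illustrative guesses, not the student teams' actual designs; the fields mirror the paper data described in the text (author, publication date, type, venue, topics, source, optional abstract).

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Summary:
    """Evaluative summary of a paper, contributed by a reviewer (US-5 to US-8)."""
    author: str
    text: str
    ratings: list = field(default_factory=list)  # per-user scores (US-8)

@dataclass
class Paper:
    """Bibliographic record a contributor adds to the site (US-1 to US-4)."""
    title: str
    authors: list
    pub_date: str
    pub_type: str            # e.g. "journal", "conference proceedings"
    venue: str               # e.g. "IEEE Software", "OOPSLA"
    topics: list = field(default_factory=list)
    source: str = ""         # URL or digital library link
    abstract: Optional[str] = None
    ratings: list = field(default_factory=list)  # per-user scores (US-18)
    summaries: list = field(default_factory=list)

    def best_summaries(self):
        """Summaries 'bubble up' by average rating, highest first; unrated sink."""
        avg = lambda r: sum(r) / len(r) if r else 0.0
        return sorted(self.summaries, key=lambda s: avg(s.ratings), reverse=True)
```

The same sort applied to a topic's list of papers, keyed on `Paper.ratings`, would produce the analogous "best papers first" ordering described below.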
Fig. 6. Student prototype: browsing paper summaries.
Detailed use case descriptions were created, but are omitted here for brevity. A paper consists of basic data regarding a particular publication, including author, publication date, publication type (e.g. conference proceedings, journal), publication venue (e.g. IEEE Software, OOPSLA), applicable topics, source (e.g. URL or digital library), and optionally an abstract. A summary is condensed, evaluative text about a paper, supplied by a reviewer who is contributing to SEEDS. Note that both papers and summaries may be rated by SEEDS users (use cases US-18 and US-8). The idea is that the "best" summaries of a paper would "bubble up" to the top of the list of summaries, similar to how urbandictionary.com and amazon.com ratings work. Likewise, the "best" papers would "bubble up" to the top of the list of papers on a particular topic.

Figs. 5–7 show screen shots from one team's prototype. This prototype was built using Adobe's Flex (Adobe Systems Incorporated) development environment. Fig. 5 demonstrates how a user would browse the system for all papers on a particular topic (use case US-2), in this case test-driven development. Fig. 6 demonstrates how a user would browse the system for all summaries of a particular paper (use case US-7). Fig. 7 demonstrates how a user would add a summary for a particular paper (use case US-5). Fig. 8 demonstrates how a user would view a comparison of papers by six criteria on a particular topic (use case US-14). Individual papers and criteria could be deselected in order to filter the papers being compared. Fig. 8 comes from a different team's prototype created directly in HTML.

Fig. 7. Student prototype: adding summary.

6. Evaluation
Fig. 5. Student prototype: browsing papers by topic.
Fig. 8. Student prototype: comparison grid of papers on a topic.

The course activities were assessed in three ways. In order to assess the quality and usefulness of the student-written summaries, a survey was conducted with professional software engineers. A second survey was conducted with the students in order to assess their experience and perspective on both the summaries and the team project. Finally, observations from the course instructor will be reported.

6.1. Survey of software professionals

A survey of ten questions was sent to software professionals in four companies: Amgen, Google, Intuit, and LSI. The professional status of the respondents is reported in Fig. 9. Although the survey did not request degree information, based on hiring practices it is believed that all of the respondents have at least an undergraduate degree in a computing field. Of the ten respondents, one is known to hold a Ph.D., and one an MBA.

Fig. 9. Professional status.

Fig. 10. Professional survey results.
As Fig. 10 reports, half of the respondents indicated that they had used a digital library such as those provided by the ACM and IEEE. Again, half indicated that they currently had access to such a library. Seventy percent indicated that they had never read a report of an empirical software engineering study (Exposure to ESE). When asked to respond to the statement "I understand how evidence-based techniques are applied to software engineering," 70% responded less than favorably (Exposure to EBSE). Interestingly, 70% responded favorably (very likely or likely) to the question "How likely are you to find and read evidence-based studies prior to adopting a particular software engineering practice, process, method, or tool?" (EBSE Adoption). These results, although limited by their sample size, are consistent with our intuition that software professionals are interested in EBSE results but lack education in the area. In fact, many even lack digital library access with which to research empirical software engineering studies.

In order to assess the quality and usefulness of the student-written summaries, the software professionals were asked to read a few of the study summaries in two EBSE repositories. The first is hosted at Durham University (see Empirical research repository). The second is the repository prototype created for this study and populated with the student-written summaries. As Fig. 11 indicates, the software professionals found the student-written summaries to be at least as useful as the professional-written summaries. In fact, 30% more of the respondents found the student-written summaries to be very useful.
Fig. 11. Usefulness of EBSE summaries: professional survey results.

Fig. 13. Student survey results: value of writing summaries.
6.2. Survey of graduate students

A similar survey of ten questions was sent to the thirteen students in the graduate software engineering course. All the students responded. Eighty-four percent of the students indicated that they had used a digital library prior to enrolling in this course. However, 92% of the students indicated that they had never read a report of an empirical software engineering study prior to enrolling in this course. When asked to respond to the statement "I understand how evidence-based techniques are applied to software engineering," all students responded favorably. Similar to the professional respondents, 83% responded favorably (very likely or likely) to the question "How likely are you to find and read evidence-based studies prior to adopting a particular software engineering practice, process, method, or tool?"

Prior to the survey, the students had not been introduced to the Durham University EBSE repository. Fig. 12 indicates that, after being asked to read summaries from both the Durham repository and the repository of their own summaries, nearly 40% more students found their own summaries to be useful or very useful.

Figs. 13 and 14 report the student responses to the questions "In terms of preparing you for a career as a software engineer, how valuable was the experience of preparing the EBSE summaries in this course?" and "In terms of preparing you for a career as a software engineer, how valuable was the experience of developing requirements, architectures, and prototypes for the EBSE summary database in this course?" Student comments on the former question indicated that most students enjoyed learning how to critically analyze a study, and they enjoyed learning about a particular software engineering topic in depth. A couple of students, however, reported that they would have preferred studying a wider range of topics themselves, rather than hearing and reading the reports of their peers.
On the latter question, most students reported that they had completed numerous team projects in their undergraduate courses and, although they enjoyed learning some new technologies, most thought their time could have been more usefully spent on other tasks.
Fig. 12. Usefulness of EBSE summaries: student survey results.

Fig. 14. Student survey results: value of developing SEEDS.
6.3. Instructor observations

Students initially struggled to understand the loosely defined system proposal. However, once they were exposed to a variety of empirical studies, they were successful in finding and summarizing studies on their own topics. This personal experience demonstrated the value of the SEEDS system to the students, who also proposed interesting and viable alternative requirements, architectures, and prototypes.

7. Conclusions and future work

The concept of a community-driven Web database was proposed to engage Net generation students and software professionals with evidence-based software engineering. We deliberately chose the social networking approach of user-generated and reviewed content as a way to implement SEEDS since we thought that students would more easily relate to the course project and be more enthusiastic about it.

Graduates from the Cal Poly program overwhelmingly enter careers as applied software engineers, primarily in the technologically rich Silicon Valley and Southern California markets. In the same way that graduating students spread awareness and use of Unix from universities to industry, we believe that these efforts to increase EBSE awareness and skill among university students will result in improved practitioner involvement and appreciation of EBSE as these students enter professional practice.

The alternative requirements and designs for the EBSE database generated by student teams will serve as a starting point for a planned, widely disseminated system. The student summaries of studies will provide seed data that will offer immediate benefit once the system is deployed. We have demonstrated the viability of incorporating EBSE topics into a graduate software engineering course, and provided initial evidence of its usefulness to professional software practitioners.
We encourage software engineering educators to consider requiring that their students critically analyze EBSE studies and contribute their work to the prototype SEEDS repository located at http://www.evidencebasedse.com. SEEDS is expected to
evolve and improve, but every effort will be made to ensure that study summaries are retained in all versions.
David S. Janzen is an Assistant Professor of Computer Science at California Polytechnic State University (Cal Poly) in San Luis Obispo, and president of Simex, a software consulting and training company. Previously he worked on telecommunications fraud detection systems at Sprint, and taught at Bethel College in Kansas. His teaching and research interests include agile methodologies and practices (particularly test-driven development), empirical software engineering, software architecture, software metrics, and software engineering pedagogy. He holds a PhD in Computer Science from the University of Kansas.

Jungwoo Ryoo is an Assistant Professor of Information Sciences and Technology at the Pennsylvania State University, Altoona. His research interests include information assurance and security, software engineering, and computer networking. He conducts extensive research in software security, network/cyber security, security management (particularly in the government sector), software architecture, Architecture Description Languages, object-oriented software development, formal methods, and requirements engineering. He also has significant industry experience working with Sprint and IBM on designing and implementing secure, high-performance software for large-scale network management systems. He received his PhD in Computer Science from the University of Kansas in 2005.