A software technology evaluation program

by DAVID N CARD

Computer Sciences Corporation, 8728 Colesville Rd, Silver Spring, MD 20910, USA
Abstract: A wealth of potentially beneficial software engineering tools, practices, and techniques has emerged in the past few years. However, many of these innovations have never been empirically validated. The Software Engineering Laboratory (SEL), a cooperative project of the National Aeronautics and Space Administration, Computer Sciences Corporation, and the University of Maryland, conducts an extensive technology evaluation program. The SEL studies the production of FORTRAN software for spacecraft navigation systems. This paper summarizes the results of SEL investigations in the areas of design practices, coding techniques, test/verification methods, and computer utilization. An earlier version of this paper appears in The Annals of XVIII Brazilian National Informatics Congress, September 1985. Keywords: software engineering, software validation, programming languages.
Software provides an increasingly large part of the functionality, and consequently the cost, of computer systems. For some large applications, the cost of the software component exceeds 75% of the total system cost. The ability to deliver reliable software on time at minimum cost has become essential to success in the computer industry. Software development organizations, therefore, have strong incentives to improve the software development process, principally by adopting new technology. In the context of this paper, the term 'technology' refers to tools, practices, and techniques applied by software developers. Although many potentially beneficial software engineering tools, practices, and techniques have emerged in the past several years, many have never been empirically evaluated 1. The difficulty of accurately measuring the software development process, in general, and technology use, in particular, accounts for much of the lack of
objective information in this area. Furthermore, the effectiveness of any technology may vary from one environment and application to another. This paper describes an ongoing technology evaluation program 2 conducted by the Software Engineering Laboratory (SEL) that is intended to resolve these issues, at least in part. The program attempts to implement a process engineering function in software development like that described by Agresti 3 and Card et al 4.
Software engineering laboratory

The SEL is a research project 2 sponsored by NASA and supported by Computer Sciences Corporation (CSC) and the University of Maryland (UM). The SEL studies software developed to support spacecraft flight dynamics applications at Goddard Space Flight Center. Figure 1 shows the organization of the SEL, which was established in 1977. Although most funding for SEL projects comes from NASA, both CSC and UM contribute resources to projects of special interest.

Figure 1. Software Engineering Laboratory (SEL). (Organization chart: NASA, Computer Sciences Corporation, and the University of Maryland, with the NASA/CSC flight dynamics projects; arrows indicate data flow and funding/technical direction.)

The ongoing ADA project (see
section on technology evaluation results) provides an example of this close cooperation. The overall objective of the SEL is to understand the software development process in the flight dynamics environment and to identify the ways in which it can be altered to improve the quality and reduce the cost of the product. The SEL has monitored the development of more than 50 flight dynamics projects involving hundreds of programmers and analysts. In addition, the SEL conducts controlled experiments and other studies. The sources cited in the references provide detailed information about the results summarized here.
Flight dynamics software

The general class of spacecraft flight dynamics software studied by the SEL includes applications to support attitude determination/control, orbit adjustment, manoeuvre planning, and mission analysis for satellites in Earth orbit. The attitude ground support systems form a large and homogeneous group of software that has been studied extensively. Each system includes telemetry processor, data adjuster, and attitude computation subsystems, as well as other necessary supporting functions.

Flight dynamics applications are developed in FORTRAN on IBM mainframe computers. System sizes range from 30 to 150 thousand source lines of code (85% FORTRAN). Reliability and response time requirements are moderate; that is, software failures do not immediately endanger the spacecraft, because ground-based software does not exercise real-time control over the spacecraft. The fixed spacecraft launch date imposes a severe development time constraint: acceptance testing must be completed two months before launch so that launch preparations can proceed on schedule. Table 1 summarizes some important characteristics of flight dynamics software development.

Table 1. Flight dynamics software

Project characteristics           Average   High   Low
Duration (months)                    16       21    13
Effort (staff-years)                  8       24     2
Size (1000 LOC)
  developed                          57      142    22
  delivered                          62      159    33
Staff (full-time equivalent)
  average                             5       11     2
  peak                               10       24     4
  individuals                        14       29     7
Application experience (years)
  managers                            6        7     5
  technical staff                     4        5     3
Overall experience (years)
  managers                           10       14     8
  technical staff                     9       11     7
Technology improvement approach

The SEL program of technology evaluation includes three steps:
• measurement
• evaluation
• transfer
Measurement establishes the baseline against which the effects of technologies can be compared. Next, technological innovations are attempted and their effects evaluated. After careful study, successful technologies are transferred to developers via guidelines, standards, and training.
Measurement

All engineering disciplines derive from principles of measurement and relationships among measures. Software engineering must develop such a measurement basis to become a true engineering discipline. Software engineering experts such as Boehm 5 (for cost estimation) and DeMarco 6 (for project control) are paying increased attention to the role of measurement in software development. Accurate measurement enables the establishment of performance baselines for comparing and estimating future projects. Also, innovations can be evaluated against the baseline.

The SEL developed a comprehensive data collection methodology 7 as the basis for its measurement activity. Measures collected include staffing, computer utilization, error reports, and product size/complexity measures, as well as the level of technology applied to each project. The SEL employs both questionnaires and automated methods of data collection. The collected data are assembled in a computerized database accessible to all SEL participants.

Because the software development process is complex and involves many different human and physical elements, many measures are needed to characterize it adequately. Examination of a model of software development, such as that shown in Figure 2, helps to define a comprehensive but nonoverlapping set of measures. This model includes the following components:
• Problem - statement of the information needs for which a software solution is desired.
• Personnel - software development team, managers, and supporting personnel.
• Process - practices, tools, and techniques employed by the personnel to develop the product; it proceeds in a series of steps (the software life cycle).
• Environment - physical and informational resources and constraints within which the personnel and process operate.
• Product - software and documentation that solve the problem.

Figure 2. Software development model. (Process phases: requirements analysis, preliminary design, detailed design, implementation, system test, acceptance test.)

Measures are needed to characterize the principal attributes of these components before the relationships among the components can be determined. A complete set of measures constitutes a profile. Software development profiles form the baselines against which potential technology improvements are evaluated.
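To make the notion of a profile concrete, the following is a minimal sketch (in Python, for illustration only; the field names, the subset of measures, and the one-standard-deviation threshold are hypothetical, not the SEL's actual database schema):

    from dataclasses import dataclass
    from statistics import mean, stdev

    @dataclass
    class ProjectProfile:
        # Hypothetical subset of the measures the SEL collects
        name: str
        effort_staff_years: float
        size_kloc: float   # developed source lines, in thousands
        errors: int        # total error reports

        @property
        def error_rate(self) -> float:
            # Errors per 1000 developed source lines
            return self.errors / self.size_kloc

    def compare_to_baseline(project, baseline):
        # Flag a project whose error rate deviates from the baseline
        # profiles by more than one standard deviation (illustrative)
        rates = [p.error_rate for p in baseline]
        avg, sd = mean(rates), stdev(rates)
        if project.error_rate > avg + sd:
            return 'above baseline'
        if project.error_rate < avg - sd:
            return 'below baseline'
        return 'within baseline'

A new project's profile can then be judged against the accumulated baseline before and after a technology change.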
Evaluation

The process of identifying the costs and benefits associated with a technology is termed evaluation. Evaluation relies on the data collected by the measurement activity. Evaluators may follow one or more of three general approaches 8:
• Case study - an in-depth comparison of two projects (one using the new technology), or comparison of the new-technology project with an established baseline.
• Multiproject variation - measurement and analysis of varying levels of technology use in a large number of projects.
• Controlled experiment - selection and control of subjects, tasks, methods, and environments in accordance with a predefined statistical design.
Even after assembling a substantial software engineering database, other obstacles to accurate technology evaluation remain: technologies tend to be applied together, sample sizes tend to be small, and many nontechnology factors also affect the outcome of a software development project. Programmer performance (Figure 3) proved to be the most important factor with respect to both productivity and reliability. As noted by Boehm 5, these complications prohibit a simplistic statistical analysis and interpretation of the software development measures collected. Nevertheless, some trends in the data from SEL studies are clear.
Transfer

When the effectiveness of a technology has been demonstrated, the next step is to transfer it to software developers. The principal mechanisms used by the SEL to accomplish technology transfer include disseminating guidelines, developing tools, and conducting specialized training.

Figure 3. Programmer productivity variations. (Maximum, average, and minimum productivity, in source lines per staff hour, for small and large projects; large projects exceed 20 000 source lines of code.)

The guidelines produced by the SEL cover management procedures 9, testing/verification 10, and programming practices 11. Two important SEL-developed tools are the source code analyzer program 12, which has been distributed across the United States, and the configuration analysis tool 13, which is tailored to specific flight dynamics needs. The source code analyzer program counts operators, operands, decisions, statements, etc., in FORTRAN. Recent modifications by Harris Corporation and the Rome Air Development Center adapted it to analyze ADA code. Currently, SEL researchers are designing a training program for the ADA language 14 and appropriate design techniques.
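The analyzer itself is a FORTRAN tool; purely as an illustration of the kind of counting it performs, the following Python sketch tallies executable statements and decisions in fixed-form FORTRAN 77 text (the comment convention and decision keywords here are simplifications, not the actual tool's rules):

    import re

    def count_fortran_measures(source):
        # Count each non-comment, non-blank line as one statement and
        # each IF (...) test or DO loop as one decision (simplified)
        statements = 0
        decisions = 0
        for line in source.splitlines():
            if not line.strip() or line[:1] in ('C', 'c', '*'):
                continue  # FORTRAN 77 comment or blank line
            statements += 1
            decisions += len(re.findall(r'\bIF\s*\(|\bDO\b', line.upper()))
        return {'statements': statements, 'decisions': decisions}

    example = '''C     COMPUTE ATTITUDE RESIDUAL
          RESID = OBS - PRED
          IF (ABS(RESID) .GT. TOL) THEN
             NBAD = NBAD + 1
          END IF'''
    print(count_fortran_measures(example))  # {'statements': 4, 'decisions': 1}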
Technology evaluation results

The SEL has extensively studied four software technology areas:
• design heuristics
• programming practices
• testing/verification
• computer support/automation
Although a great many other researchers have made significant contributions in these areas, this paper discusses only SEL results because of space limitations. Initial SEL studies focused on the implementation phase because most errors in the software occurred in that phase, even though it requires only one-third of the total effort (Figure 4). It thus appeared to be a leverage point 15 for process improvement. However, subsequent investigation showed that requirements errors were the most costly to correct, whereas design errors were much more prevalent than first indicated. Ultimately, SEL efforts encompassed the entire life cycle.

Figure 4. Origins of cost and errors
Design heuristics

Modules are the basic units of design in the structured approach 16. Design heuristics are rules and practices that specify how to design a good module. Those studied by the SEL include control coupling, data coupling, module size, and strength/cohesion. Figure 5 summarizes the results obtained with respect to fault rate. For example, 50% of high-strength modules were fault free, while only 18% of low-strength modules were fault free.

Figure 5. Design practices. (Fault-rate results by module strength (high/medium/low), module size (small/medium/large), data coupling (parameter/mixed/common), and number of control couples (many/normal/one).)

Specific conclusions are as follows:
• Module size was not a useful design criterion 17.
• High strength/cohesion promoted quality 17.
• Parameter coupling was not better than common coupling 18.
• Smaller spans of control were generally better 18.
Not all design practices are equally effective for all applications, environments, languages, etc. Furthermore, changes in computer hardware and software technology may affect the value of a practice 18. For example, most negative conclusions about the value of common coupling were reached before include processors were widely available. Clearly, objective measurement is essential to making informed, current decisions about design strategies.
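To illustrate the coupling distinction (a sketch in Python with hypothetical names; the SEL systems themselves are FORTRAN): common coupling communicates through shared global state, analogous to a FORTRAN COMMON block, whereas parameter coupling makes a module's inputs and outputs explicit in its interface.

    # Common coupling: any module can affect any other through the
    # shared state (analogous to a FORTRAN COMMON block)
    ATTITUDE_STATE = {'roll': 0.0, 'pitch': 0.0, 'yaw': 0.0}

    def adjust_roll_common(delta):
        ATTITUDE_STATE['roll'] += delta  # hidden dependency

    # Parameter coupling: the data flow is explicit at the call site
    def adjust_roll_param(roll, delta):
        return roll + delta

    new_roll = adjust_roll_param(1.5, 0.2)

The SEL finding above is a caution against assuming that the parameter form is always superior; in this environment the measured fault rates did not favour it.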
Programming practices

One group of individual technologies, referred to as modern programming practices, tends to be applied together. These technologies provide a flexible methodology for the (detailed) design, implementation, and verification of software. As practiced in the flight dynamics environment, the principal components are as follows:
• informal program design language
• top-down development
• structured programming
• code reading
• structured FORTRAN preprocessor or compiler
These measures of individual technology use (1 = low use; 5 = high use) were combined to form a single index of overall structured programming use. Figure 6 shows the relationship of this index to error rate. The use of modern programming practices appears to be associated with a reduced error rate, with the principal contributor being the reading of structured code 19. No significant correlation with development productivity was found. This implies that the reliability and maintainability benefits of modern programming practices are obtained at no additional cost in terms of development effort.

Figure 6. Effect of modern programming practices. (Error rate, in errors per developed source line of code, plotted against an index combining six measures of modern programming practice use; correlation (r) = -0.64, P < 0.05.)

An experiment currently under way will compare the ADA and FORTRAN programming languages by implementing the same system twice, once in each language 20. Figure 7 shows some early results from the ADA training
project conducted as part of this experiment. The cost and error rates for the ADA development lie within the 95% confidence interval for past FORTRAN projects.

Figure 7. Software development in ADA. (Cost in hours per 1000 executable statements and error rate in errors per 1000 executable statements, ADA versus FORTRAN. The ADA figures include 5370 source lines of code and 1402 executable statements, and do not include 570 training hours; the FORTRAN figures are based on past flight dynamics projects.)
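The indices of Figures 6 and 9 are formed by combining individual technology ratings. As a sketch of how such an index and its correlation with error rate might be computed (the ratings and error rates below are invented; the SEL's actual construction is described in reference 19):

    from statistics import correlation  # Python 3.10+

    # Hypothetical per-project ratings (1 = low use, 5 = high use) for
    # six practice measures, with observed error rates (errors per
    # developed source line of code)
    projects = [
        ([5, 4, 5, 5, 4, 5], 0.003),
        ([3, 3, 4, 2, 3, 3], 0.006),
        ([1, 2, 2, 1, 2, 1], 0.009),
        ([4, 5, 3, 4, 4, 4], 0.004),
    ]

    # Combine each project's ratings into a single index by summation,
    # then correlate the index with error rate across projects
    indices = [sum(ratings) for ratings, _ in projects]
    rates = [rate for _, rate in projects]
    print('r = %.2f' % correlation(indices, rates))
    # A negative r indicates that greater practice use accompanies
    # fewer errors, as in Figures 6 and 9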
Testing/verification

SEL investigations have encompassed both techniques and organizations for testing and verification. A recent study 21 compared the effectiveness of three verification techniques (Figure 8):
• code reading
• functional testing
• structural testing
In this controlled experiment, using professional programmers, code readers significantly outperformed both functional and structural testers in terms of the percentage of errors detected and the effort per error detected.

Figure 8. Software verification techniques. (Percentage of faults detected and number of faults detected per hour of effort, for code reading, functional testing, and structural testing.)
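To make the distinction among the techniques concrete, the following illustrative sketch (in Python; not drawn from the study) contrasts functional testing, which derives test cases from the specification, with structural testing, which derives them from the code's branch structure. Code reading, by contrast, involves a human inspecting the source against the specification without executing it.

    def clamp_angle(angle):
        # Specification: return angle wrapped into [0, 360)
        while angle >= 360.0:
            angle -= 360.0
        while angle < 0.0:
            angle += 360.0
        return angle

    # Functional (black-box) tests: chosen from the specification,
    # including boundary values, without examining the code
    assert clamp_angle(360.0) == 0.0
    assert clamp_angle(-90.0) == 270.0

    # Structural (white-box) tests: chosen to exercise every branch,
    # i.e., each loop taken and not taken
    assert clamp_angle(725.0) == 5.0   # first loop executes twice
    assert clamp_angle(45.0) == 45.0   # neither loop executes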
Quality assurance includes all review and management procedures undertaken to ensure the delivery of an effective and reliable product. The specific technologies studied by the SEL are as follows:
• requirements reviews
• design reviews
• design walkthroughs
• code walkthroughs
• test formalism
• test followthrough
• methodology reinforcement
• document quality assurance
• development standards
• code configuration control
• code library (PANVALET)
• configuration analysis tool

For the analysis described here, the individual measures of technology use were combined to form a single index of overall quality assurance activity (percentage of technologies used). Figure 9 shows the relationship of this index to error rate. Quality assurance activity appears to be associated with a reduced error rate 19. No significant correlation with development productivity was found. This implies that the reliability benefits of quality assurance are obtained at no additional cost in terms of development effort. Moreover, a cost savings should be realized during maintenance.

Figure 9. Effect of quality assurance. (Error rate, in errors per developed source line of code, plotted against an index combining 12 measures of quality assurance activity; correlation (r) = -0.60, P < 0.05.)

Independent verification and validation (IV and V) is an approach to software quality assurance in which each product of the development team is assessed by an independent team. The major functions of an IV and V team are to ensure that the product of each phase of the software life cycle is consistent with the product of the previous phase (i.e., verification) and that the product of each phase accurately responds to the original system requirements (i.e., validation). Some of the claimed benefits of IV and V are as follows:
• earlier detection of errors
• more complete detection of errors
• improved visibility to management
The SEL compared performance measures from two projects to which IV and V was applied with two similar non-IV and V projects 22. Figure 10 shows that IV and V appeared to increase development cost without reducing the error rate. However, a subjective assessment indicated that visibility to management did increase.

Computer support and automation

Another major factor in software development is the computing environment. Special changes to this
environment usually cannot be made to conform to the needs of any single project. Nevertheless, it can have a strong effect on the development process 19. Initially, the flight dynamics computing environment provided the programmer only with facilities for editing, compiling, linking, and testing source code. Figure 11 shows that most of this resource is expended during implementation and test. A recent SEL study showed that extensive computer use is associated with low productivity 19. Heavy computer users may not spend enough time desk checking and planning their work before jumping into code and test. However, another recent study 23 indicated that computer support for design and planning activities, through software tools not previously provided, can increase the overall productivity and reliability of the software development process. Figure 12 shows the reaction of users to two workstation demonstration systems 24; the response was overwhelmingly favourable. Current SEL research efforts focus on expert systems for management and programming support environments. However, these projects are still in their early stages.
Figure 10. Independent verification and validation. (Development cost and error rate for past projects versus IV and V projects, including IV and V team overhead.)

Figure 11. Computer utilization profile. (Weekly computer use plotted against percentage of development schedule, from requirements analysis through acceptance testing.)

Figure 12. Design workstations. (Percentage of users answering yes to 'improves quality?', 'reduces time?', and 'reduces effort?' for the Excelerator and Case 2000 demonstration systems.)

Conclusion

SEL experience demonstrates that the software development process in a specific environment can be improved
software development by a systematic program of measurement, technology evaluation, and transfer. Other similar organizations can also apply the lessons learned by the SEL. First, the use of appropriate design, programming, and verification practices increases software reliability without noticeably increasing development cost. Second, a formal program of quality assurance also improves software reliability at little or no net cost. While many modern programming concepts are firmly established in software engineering practice, formal quality assurance procedures are only now coming into widespread use. Third, intensive computer use appears to be associated with low productivity. Programmers who spend a lot of time at the terminal tend to be less productive. Additional computer support alone does not necessarily increase productivity. It must be the right kind of support. In summary, these results suggest that a formal and conscientious method of software development yields a more reliable product. On the other hand, methods alone do not appear to improve development productivity; automation is needed too. It is very difficult to reduce the cost of developing a software product, although a more reliable product should require less subsequent maintenance. Methods and tools are synergistic; computer and tool support must be integrated into a formal process to be effective. SEL experience indicates that progress toward productivity and quality improvement is evolutionary rather than revolutionary. It proceeds in small steps. Despite technological advances, the major factor in both productivity and reliability continues to be personnel capability and performance (see Figure 4 ) 2 5 .
Acknowledgment The author would like to recognize the contributions of F E McGarry (Goddard Space Flight Center), G T Page (Computer Sciences Corporation), and V R Basili (University of Maryland) to the material presented here.
References

1 Sheil, B A 'The psychological study of programming' ACM Computing Surveys (March 1981)
2 Card, D N, McGarry, F E, Page, G T et al The Software Engineering Laboratory National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC) SEL-81-104 (February 1982)
3 Agresti, W W 'Applying industrial engineering to the software development process' Proc. IEEE Computer Society Twenty-Third Int. Conf. (September 1981)
4 Card, D N, Clark, T L and Berg, R A 'Improving software quality and productivity' Info. and Software Technol. Vol 29 No 5 (1987) pp 235-241
5 Boehm, B W Software Engineering Economics Prentice-Hall, Englewood Cliffs, NJ, USA (1981)
6 DeMarco, T Controlling Software Projects Yourdon Press, New York, USA (1982)
7 Church, V E, Card, D N, McGarry, F E et al Guide to Data Collection NASA/GSFC SEL-81-101 (August 1982)
8 Basili, V R, Selby, R W and Hutchens, D H 'Experimentation in software engineering' IEEE Trans. Software Eng. (July 1986)
9 Agresti, W W, McGarry, F E, Card, D N et al Manager's Handbook for Software Development NASA/GSFC SEL-81-205 (April 1984)
10 Card, D N, Edwards, E and Antle, C Software Testing and Verification NASA/GSFC SEL-85-005 (December 1985)
11 Wood, R Programmer's Handbook for Flight Dynamics Software Development NASA/GSFC SEL-86-001 (March 1986)
12 Decker, W J and Taylor, W A FORTRAN Static Source Code Analyzer Program User's Guide NASA/GSFC SEL-78-202 (April 1985)
13 Decker, W J and Taylor, W A Configuration Analysis Tool System Description and User's Guide NASA/GSFC SEL-80-104 (December 1982)
14 Basili, V R 'Analysis of software development in ADA' Proc. Ninth Annual Software Engineering Workshop NASA/GSFC SEL-84-004 (November 1984)
15 McGarry, F E 'Software engineering leverage factors' Proc. Eleventh Annual Software Engineering Workshop (December 1986)
16 Stevens, W P, Myers, G J and Constantine, L L 'Structured design' IBM Syst. J. No 2 (1974)
17 Card, D N, Page, G T and McGarry, F E 'Criteria for software modularization' Proc. IEEE Eighth Int. Conf. on Software Engineering (August 1985)
18 Card, D N, Church, V E and Agresti, W W 'An empirical study of software design practices' IEEE Trans. Software Eng. (February 1986)
19 Card, D N, McGarry, F E and Page, G T 'Evaluating software engineering technologies' IEEE Trans. Software Eng. (July 1987)
20 Agresti, W W 'Measuring ADA as a software development technology in the software engineering laboratory' Proc. Tenth Annual Software Engineering Workshop (December 1985)
21 Selby, R W, Basili, V R, Page, G T and McGarry, F E 'Evaluating software testing strategies' Proc. Ninth Annual Software Engineering Workshop (November 1984)
22 Page, G T, McGarry, F E and Card, D N 'A practical experience with independent verification and validation' Proc. Eighth Int. Computer Software and Applications Conf. (November 1984)
23 McGarry, F E, Valett, J D and Hall, D L 'Measuring the impact of computer resource quality on the software development process and product' Proc. Hawaiian Int. Conf. on System Sciences (January 1985)
24 Koerner, K, Mital, R, Card, D N and Maione, A 'An evaluation of programmer/analyst workstations' Proc. Ninth Annual Software Engineering Workshop NASA/GSFC SEL-84-004 (November 1984)
25 McGarry, F E 'Measuring software technology' Proc. Seventh Annual Software Engineering Workshop NASA/GSFC SEL-82-007 (December 1982)