
A Practical Approach to Specification Technology Selection*

Margaret J. Davis and David R. Addleman
Boeing Aerospace Company, Seattle, Washington

Over the past decade numerous methodologies were introduced to guide and ease the project manager's burden over the entire life cycle of system/software development. Additionally, various software tools were developed to support specific methodologies. In fact, the collection of methodologies and support tools grew so rapidly that a project manager was left with a confusing array of choices. Some method was needed for comparing project requirements against methodology capabilities in order to find the best fit. The Air Force awarded a contract to Boeing Aerospace Company to address the problem. The stated objective of the Specification Technology Guidelines contract was to organize existing information on requirements and design technologies into a Guidebook that could be used by USAF technical managers in selecting appropriate methodologies for future projects. We will describe an approach that integrates such concepts as project significance level, software categorization, and life-cycle requirements, and implements them by means of a table-driven procedure that could easily be computerized. Using this approach, the technical manager measures project significance and requirements against the functions available in various methodologies; selects candidate methodologies best fitting a project; and, finally, assigns a score to the candidates in order to make a final methodology selection.

1. INTRODUCTION

During the 1970s, many techniques and tools were developed to support system and software development processes. Most development concentrated on programming and testing activities, but a mini-proliferation of specification and analysis tools occurred for supporting

*This research was supported by Rome Air Development Center contract RADC F30602-84-C-0073. Address correspondence to Margaret J. Davis, Boeing Aerospace Company, P.O. Box 3999, M/S 82-53, Seattle, WA 98124-2499.

front-end life-cycle activities: system requirements analysis, software requirements analysis, and software design. Complex specification methodologies appeared (e.g., based on data flow [1], control flow [2], and finite state machine [3] modeling techniques) that incorporated specialized analysis tools (e.g., formal languages, graphical descriptions, static analyzers, etc.) and, although much was written about their capabilities, problem domains, and relative degrees of sophistication and success, no effective focus existed for helping a software development manager confidently select one for a future project. In this situation, the technical manager can be perplexed by the claims, counter-claims, comparisons, and evaluations of existing specification methods, most of which are uncorroborated. Ideally, a manager would like to find a methodology suitable for all software development projects. Unfortunately, no current methodology is universally applicable to the specification of all problem environments. The Specification Technology Guidelines effort was conceived to provide focus and organize information about methodologies and tools. The purpose was to provide Air Force system and software development managers with uncomplicated guidelines for matching existing methodologies and tools to specific projects. This paper presents the conceptual framework on which the guidelines were built, explaining the rationale and outlining the implementation. In it we demonstrate how our table-driven selection approach enables the user to: establish a significance level for a project; determine the capabilities needed for the requirements analysis or the design phase and the software category; measure the significance and capabilities against the functions available in various methodologies; select candidate methodologies that best fit project needs; and, finally, assign a score to the candidates and make a final methodology selection. The complete guidelines are available from Rome Air Development Center [4]. A description of the effort has also been published [5].

The Journal of Systems and Software 6, 285-294 (1986)
© 1986 Elsevier Science Publishing Co., Inc.    0164-1212/86/$3.50

2. SELECTION RATIONALE

We wanted to develop guidelines that would aid a project manager in selecting a final candidate methodology and supporting tools, in terms of a particular software category and life cycle phase. Our selection process matches the methodology's abstract modeling capability to the conceptual needs of the software category and life cycle phase. This match fulfills the explicit project objective. In addition, the guidelines fulfill an implicit project objective of selecting a methodology that is practical for the user's project. By matching the relative power of the methodology and support tools to the relative significance of the project, the guidelines add practical considerations to tailor the final selection.

Matching methodology capabilities to life cycle phase and software category is the conventional approach to methodology selection. Each life cycle phase requires different techniques, since each is as distinctly different as requirements analysis is from design. Software projects can be grouped into categories differentiated by the concepts and techniques appropriate for their description and development. For example, real-time control systems software is different from information systems software, and the development of each uses different techniques (i.e., the former emphasizes control logic; the latter emphasizes data modeling). Common sense suggests selecting specification methodologies in accordance with the size and complexity of the software project.

Figure 1. DOD-STD-SDS software development life cycle.

Phase 1, Software Requirements Analysis: Define and analyze functional, performance, interface, and qualification requirements for each CSCI.
Phase 2, Preliminary Design: Develop a top-level design of each CSCI which completely reflects the requirements specified in the SRS and IRS(s).
Phase 3, Detailed Design: Develop a modular, detailed design for each CSCI.
Phase 4, Coding and Unit Testing: Code and test each unit making up the detailed design.
Phase 5, CSC Integration and Testing: Integrate units of code entered in the developmental configuration and perform informal tests on aggregates of integrated units.
Phase 6, CSCI Testing: Conduct formal tests on each CSCI to show that the CSCI satisfies its specified requirements. Record and analyze test results.


In order to quantify this practical notion, we introduced two concepts: methodology power and significance level. Methodology power is a measure of the support a methodology provides and is derived by rating its individual capabilities and techniques. Significance level is a multifaceted measure of the relative importance of a software project.

We view the matching of methodology capabilities to life cycle phase and software category as using methodology-in-the-small selection criteria. We view the matching of methodology power to significance level as using methodology-in-the-large selection criteria to further refine the choice. The methodology-in-the-small concept strongly parallels the notion of programming-in-the-small [6], since it is concerned with the features of and concepts expressible by a methodology. The methodology-in-the-large concept is a weaker parallel to programming-in-the-large, being concerned with choosing a practical level of support for project control and correctness.

We have discussed life cycle phase, software category, methodology power, and significance level without providing definitions. The next four subsections define these terms and provide a more detailed discussion of each.

2.1. Life Cycle Phase

The life cycle model we used is the DOD-STD-SDS (DOD-STD-2167) standard, which consists of six phases: (1) Software Requirements Analysis, (2) Preliminary Design, (3) Detailed Design, (4) Coding and Unit Testing, (5) CSC Integration and Testing, and (6) CSCI-Level Testing. All analysis performed prior to phase 1 is termed Pre-Software Development. The six phases and their relationship to the basic project during the software development life cycle are shown in Figure 1.

The guidelines are intended as an aid to the project manager during the requirements and design phases of the software acquisition life cycle. For our purposes, we considered requirements analysis as including both system requirements analysis and software requirements analysis. We also treated software preliminary design and software detailed design as a single phase.

2.2. Software Category

Under a previous contract [7], we had surveyed Air Force missions and determined that most software development fell into 18 categories. We used those 18 categories in developing our guidelines, but could have used any other set of software categories. A subset of the software category definitions appears in Figure 2. The Ada Methodman committee, for example, is developing a set of software categories [8] which promises to be more applicable to industry.

2.3. Methodology Power

2.3.1. Definition. The support power of a methodology is the sum of its individual support ratings for a set of capabilities. In general, the factors that determine the relative power of one methodology compared to another are:

1. Formality of notation
2. Complexity of specification produced
3. Rigor of mathematical foundation
4. Degree of automated support

For each methodology we rated 30 individual capabilities on a scale of 0 to 3, where 0 is no support and 3 is most support. More precise definitions of these ratings for specific capabilities may be found in [5]. (A brief sketch of this rating scheme follows the capability lists below.)

Six capabilities related to requirements specification are:

1. State modeling
2. Data flow modeling
3. Control flow modeling
4. Object modeling
5. Timing performance specification
6. Accuracy performance specification

Fourteen capabilities related to design specification are:

1. Functional decomposition
2. Data decomposition
3. Control decomposition
4. Data abstraction
5. Process abstraction
6. Database definition
7. Concurrency/synchronization
8. Module interface definition
9. Formal verification
10. Configuration management
11. Completeness analysis
12. Consistency analysis
13. Ada compatibility
14. Notation for code behavior specification

Ten capabilities were more general, independent of life cycle phase and software category:

1. Prototyping
2. Test plan generation
3. Automated tools available
4. Traceability
5. Transition between phases
6. Validation
7. Usability
8. Maturity
9. Training/experience level
10. MIL-STD documentation
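To make the rating scheme concrete, the following sketch (ours, not part of the original guidelines; the capability subset and ratings are invented) records a methodology's 0-3 ratings and computes its support power as the sum of those ratings, as defined in Section 2.3.1:

    # Illustrative sketch: a methodology's 0-3 ratings for a few capabilities,
    # with support power computed as the sum of the individual ratings.
    # The methodology and the chosen capabilities are hypothetical.
    def support_power(ratings):
        assert all(0 <= r <= 3 for r in ratings.values()), "ratings use the 0-3 scale"
        return sum(ratings.values())

    methodology_x = {
        "state modeling": 2,
        "data flow modeling": 3,
        "control flow modeling": 1,
        "functional decomposition": 3,
        "module interface definition": 2,
        "traceability": 1,
    }
    print(support_power(methodology_x))   # -> 12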

Figure 2. Software categories (subset).

(7) Orbital Dynamics. Complex control-dominated processing. Resembles navigation and flight dynamics software, but has the additional complexity required by orbital navigation, such as a more complex reference system and the inclusion of gravitational effects of other heavenly bodies.

(8) Message processing. Complex data-dominated processing. Handles input and output messages, processing the text or information contained therein.

(9) Diagnostic S/W. Data-oriented processing. Used to detect and isolate hardware errors in the computer in which it resides or in other hardware that can communicate with that computer.

(10) Sensor and signal processing. Complex control-dominated processing. Similar to message processing, except that it requires greater processing, analyzing, and transforming of the input into a usable data processing format.

(11) Simulation. Complex, depending on the entity being simulated. Used to simulate an environment, mission, situation, other hardware, and inputs from these to enable a more realistic evaluation of a computer program or a piece of hardware.

(12) Database management. Data-oriented processing. Manages the storage and access of (typically large) groups of data. Such software can also often prepare reports in user-defined formats, based on the contents of the database.

2.3.2. Rating a capability. The steps for rating a capability follow (a brief sketch of steps 4 and 5 appears after this list):

1. Define the capability.
2. Define the significance level ratings for the capability.
3. Decide the group (requirements, design, or universal) to which the capability belongs. That is, is the capability independent of life cycle phase? If so, then it belongs to the universal group. If not, then in which phase is it useful?
4. If the group is requirements or design, update the appropriate Desirability Matrix (to be explained in Section 3) to show the desirability of the capability relative to the life-cycle phases and software categories.
5. Rate the methodologies for that capability.
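As a rough illustration of steps 4 and 5 (our sketch; the category, methodology, and capability names are hypothetical), the data touched by the rating procedure could be a per-phase Desirability Matrix mapping software categories to desired capabilities, plus a table of per-methodology ratings:

    # Illustrative sketch of steps 4 and 5. Step 4 marks the capability as
    # desirable for selected categories in the requirements-phase Desirability
    # Matrix; step 5 records each methodology's 0-3 rating for the capability.
    desirability_requirements = {"message processing": set(),
                                 "sensor and signal processing": set()}
    methodology_ratings = {"A": {}, "B": {}}

    def rate_capability(capability, desirable_for, ratings):
        for category in desirable_for:                      # step 4
            desirability_requirements[category].add(capability)
        for methodology, rating in ratings.items():         # step 5
            methodology_ratings[methodology][capability] = rating

    rate_capability("data flow modeling",
                    desirable_for=["message processing"],
                    ratings={"A": 3, "B": 1})
    print(desirability_requirements["message processing"])  # -> {'data flow modeling'}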

2.4. Significance Level

2.4.1. Definition. A common sense notion in software development is that a selected methodology should be appropriate to the size of the project. Our significance level concept refines this notion of project size. We developed this concept because the projected raw count of source code statements, in itself, is not a sufficient discriminator for choosing one methodology over another. For example, large projects tend to require more formal, verifiable, and complex methodologies, but these same requirements might also apply to small projects involving life-critical decisions.

The significance level concept examines the software development project from three viewpoints: project considerations, software considerations, and quality considerations. Under project considerations, cost, criticality, and schedule measure the importance the funding agency attributes to the project. Under software considerations, complexity, development formality, and software utility measure the conceptual effort required by the project, and thus measure the significance of the project to its developers. Under quality considerations, reliability, correctness, maintainability, and verifiability measure the importance end-users attribute to the project.


Quality considerations exist other than the four listed above. We incorporated only those considerations that were independent of software category, for two reasons. The first reason is simplification; since the guidelines use a manual system, additional quality considerations would have meant the addition of worksheets (as many as 18), one for each software category. The second reason is clarity; the relationship of the unused quality attributes to individual software categories is not straightforward.

Significance level is used to evaluate the practicality of applying specification technology to a project. It is impractical to employ a specification technology if the costs associated with its use outweigh the expected improvement in the delivered software. Budget- or schedule-limited development projects often cannot afford a specification technology entailing some combination of extensive training, licenses for software tools and database management systems, expensive graphic workstations, or use of a mainframe. As support tools become available on inexpensive personal computers, knowledge and use of specification techniques will spread, thereby lowering the cost of using the technology.

A project's significance level is a weighted average over significance level values for ten considerations. The relationship of each consideration to significance level is explained below.

Project Considerations (Funding Agency Viewpoint):

1. Cost. The significance of a project increases in direct proportion to relaxation of constraints on project development costs. A low budget indicates a lower significance than a relatively unconstrained budget. If a funding agency considers a software project to be important enough to warrant development regardless of cost, then the project is more significant than a project whose budget is limited.
2. Criticality. The significance of a project rises in direct proportion to the criticality of the assignment. A project in which flight crew safety is involved is more significant than one whose failure would be of nuisance impact to its users.
3. Schedule. The significance of a project increases in direct proportion to relaxation of constraints on scheduling. A project whose schedule can slip in order to produce better software is more significant than one whose delivery date may limit functionality, ease of use, and quality.

Software Considerations (Developer's Viewpoint):

4. Complexity. The significance of a project rises in direct proportion to the complexity of the solution. A difficult problem whose solution is hard to devise and validate is more significant than a problem whose solution is straightforward and easy to check.
5. Development Formality. The significance of a project rises in direct proportion to the desired level of contractor controls. The stronger the control (and the more formal the review of interim results), the more significant the project.
6. Software Utility. The significance of a project rises in direct proportion to the utility of the application. Software developed as a one-shot feasibility demonstration is less significant than software developed to provide real-time support to a C3I task.

Quality Considerations (End-Users' Viewpoint):

7. Reliability. The significance of a project rises in direct proportion to the level of reliability needed. Software that need only respond correctly to nominal conditions is less significant than software which must have its faults* removed as soon as they occur.
8. Correctness. The significance of a project rises in direct proportion to the level of correctness needed. From a specification point of view, level of correctness is a measure of how completely the software or its design satisfies project requirements and constraints. Software that is considered acceptable when its operation provides the functionality needed, even though it does not meet other constraints (such as a user-friendly interface), is less significant than software whose design must be formally validated against the requirements specification.
9. Maintainability. The significance of a project rises in direct proportion to the level of maintainability needed. Software that is not expected to be maintained is of less significance than software expressly developed so that the extent of changes is optimally localized.
10. Verifiability. The significance of a project rises in direct proportion to the level of verifiability needed. Software whose documentation is to be casually maintained as comments in its source code is of less significance than software whose documentation is complete (includes requirements through source code documents) and always up-to-date.

*We define a fault as an undesirable response to anomalous conditions.


2.4.2. Characteristics of each level. We defined four significance levels, numbered 0 through 3. Projects of significance level 0 have the least relative importance; projects of level 3 have the most. Actual projects are a combination of several levels, so we devised a method for computing an overall significance level for a project (see Section 2.4.3).

A project of significance level 0 has the characteristics shown in Table 1. Test generators, conversion table programs, and trade study simulations are examples of projects that would have significance level 0. A project of significance level 1 has the characteristics shown in Table 2. Editors, compilers, and mission and environmental simulators are examples of projects that would have significance level 1. A project of significance level 2 has the characteristics shown in Table 3. Airborne surveillance, early warning, and avionics mission planning are examples of projects that would have significance level 2. A project of significance level 3 has the characteristics shown in Table 4. Nuclear control and life-critical software are examples of projects that would have significance level 3.

Table 1. Significance Level 0

Cost constraints: Low budget, emphasis on minimum cost
Criticality: No criticality assignment
Schedule constraints: Tight schedule
Complexity: Straightforward solution; easy to check out
Development formality: Few defined requirements; informal development; used locally
Software utility: One-shot; prototype; test software; demonstration software
Reliability: Respond correctly to nominal conditions
Correctness: Functionality met; constraints ignored
Maintainability: No maintenance expected
Verifiability: Documentation in source code

Table 2. Significance Level 1

Cost constraints: Normal cost constraints
Criticality: Nuisance impact
Schedule constraints: Some schedule constraints
Complexity: Moderate complexity
Development formality: Normal to strong contractor controls
Software utility: Ground-based software; data reduction; mission preparation software
Reliability: Faults corrected periodically; temporary workarounds provided
Correctness: Functionality and constraints met
Maintainability: Predict impact of changes
Verifiability: Source code documentation updated

Table 3. Significance Level 2

Cost constraints: Some cost flexibility
Criticality: Mission impact
Schedule constraints: Normal schedule constraints
Complexity: Greater complexity
Development formality: Strong contractual controls; formal reviews
Software utility: Real-time avionics, C3, and C3I software
Reliability: Faults removed as soon as possible
Correctness: Implementation validated against design specification
Maintainability: Impact of changes somewhat localized
Verifiability: Full complement of documentation; design documentation updated too

Table 4. Significance Level 3

Cost constraints: Cost not the predominant factor; relatively unconstrained
Criticality: Nuclear, flight crew safety
Schedule constraints: Additional fault detection requirements will not impact schedule
Complexity: Difficult problem; complex solution; hard to validate
Development formality: Generally contracted rigid controls over development
Software utility: Highly critical applications; possible catastrophic results
Reliability: No faults
Correctness: Design validated against requirements specification
Maintainability: Extent of changes optimally localized
Verifiability: Requirements through source code documentation always up-to-date

2.4.3. Calculation. The worksheet for computing the overall significance level of a project appears as Figure 3. The process is as follows (a short computational sketch follows Figure 3):

1. Rate the significance level for each consideration.
2. Assign a weighting factor to each consideration; normally, give each consideration a weighting factor of 1 to make them equally important.
3. Compute the weighted average to arrive at the overall significance level.

Figure 3. Significance level calculation worksheet. For each of the ten considerations (cost, criticality, schedule, complexity, development formality, software utility, reliability, correctness, maintainability, and verifiability) the manager records a significance level SL (0, 1, 2, or 3) and a weight (1 = normal) and forms the product WEIGHT x SL; the overall significance level is the sum of the products divided by the sum of the weights.
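The worksheet arithmetic is a plain weighted average; the sketch below (ours, with made-up ratings and unit weights) shows the computation:

    # Illustrative sketch of the Figure 3 worksheet: overall significance level
    # as the weighted average of the per-consideration ratings (0-3).
    # The ratings below are invented for the example; every weight is 1 (normal).
    ratings = {"cost": 2, "criticality": 3, "schedule": 1, "complexity": 2,
               "development formality": 2, "software utility": 2,
               "reliability": 3, "correctness": 2, "maintainability": 1,
               "verifiability": 2}
    weights = {name: 1 for name in ratings}

    products = {name: weights[name] * ratings[name] for name in ratings}
    overall_sl = sum(products.values()) / sum(weights.values())
    print(overall_sl)   # -> 2.0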

3. THE SELECTION PROCESS

The selection process merges the methodology-in-the-small and methodology-in-the-large concerns into four simple steps:

1. Compute the project's overall significance level.
2. Find the software category best fitting the project.
3. Select candidate methodologies using life cycle and software category requirements.
4. Choose a final methodology using the overall significance level value.

Steps 1 and 2 have been described above. Descriptions for steps 3 and 4 follow in the next two subsections. The selection process assumes that definitions and expertise have been organized into four tables:

1. Significance Level Table shows the ten considerations, describes the four significance level ratings for each, and lists representative software for each significance level.
2. Software Category Table lists each category by name, along with its characteristics and a general description of the software that falls into the category.
3. Match Table, a two-tiered table. The top tier displays software categories versus the methodology capabilities appropriate to a particular life cycle phase; this top tier is known as the Desirability Matrix. The bottom tier displays methodologies (identified by key letters) versus the same capabilities; this bottom tier is known as the Supported Matrix.
4. Ratings Table provides a methodology's score for each software category, differing in accordance with whether selection is being made on the basis of the requirements or the design life-cycle phase.

Figure 4 depicts the four steps and their corresponding tables; a small end-to-end sketch of the procedure follows.
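The following self-contained sketch (ours; all table contents, ratings, and names are invented, and the real guidelines hold this information in the four tables above) suggests how the four-step procedure could be computerized:

    # Illustrative end-to-end sketch of the table-driven selection procedure.
    # Step 1: overall significance level from per-consideration ratings (weight 1 each).
    project_ratings = {"cost": 2, "criticality": 2, "schedule": 1, "complexity": 2,
                       "development formality": 2, "software utility": 2,
                       "reliability": 2, "correctness": 2, "maintainability": 1,
                       "verifiability": 2}
    overall_sl = round(sum(project_ratings.values()) / len(project_ratings))   # -> 2

    # Step 2: software category chosen by the manager (a Software Category Table
    # lookup in practice).
    category = "message processing"

    # Step 3: Desirability and Supported matrices for one life-cycle phase.
    desirability = {"message processing": {"data flow modeling", "module interface definition"}}
    supported = {"A": {"data flow modeling"},
                 "B": {"data flow modeling", "module interface definition", "prototyping"}}
    candidates = [m for m, caps in supported.items() if desirability[category] <= caps]

    # Step 4: uniform-treatment Methodology Score (MS = SUM - SL * COUNT, Section 3.2.1);
    # the best fit is the candidate whose score is closest to zero.
    capability_ratings = {"A": {"data flow modeling": 1},
                          "B": {"data flow modeling": 3, "module interface definition": 2}}
    def methodology_score(m):
        desired = desirability[category]
        return sum(capability_ratings[m].get(c, 0) for c in desired) - overall_sl * len(desired)

    best = min(candidates, key=lambda m: abs(methodology_score(m)))
    print(best, methodology_score(best))   # -> B 1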

3.1. Step 3 - Select Candidate Methodologies

This step chooses candidate methodologies by matching methodologies against desired capabilities for a particular life cycle phase and software category. The match can be done conveniently by setting up a table whose top portion is a Desirability Matrix and whose bottom portion is a Supported Matrix. Figure 5 is an example Match Table.

Figure 4. Selection process overview. (The original figure charts each of the four selection steps against the reference table it consults and the result it produces, ending with candidate methodology identifiers and their scores.)


Figure 5. Example match table.

The rows of the Desirability Matrix are software categories and the columns are the specification methodology capabilities appropriate for a particular life cycle phase. (A separate matrix is constructed for each life cycle phase.) The marks (x's) identify which capabilities are especially useful for a specific software category. The rows of the Supported Matrix are methodologies and the columns are the same capabilities used for the Desirability Matrix. If a methodology supports or provides a particular capability, then the intersection of that methodology's row and the capability's column contains a mark.

The match step is a matter of visually inspecting the Supported Matrix for rows whose entries are identical to the entries in the row for the project's software category in the Desirability Matrix. That is, the manager finds rows in the Supported Matrix (methodologies) whose marks fall in the same capability columns as the marks in the software category row of the Desirability Matrix. For example, in the Match Table shown in Figure 5, software category 10 has marks in each capability column, as does methodology F. Thus, methodology F would be chosen as a candidate methodology for software category 10.
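A minimal sketch of this inspection (ours; the category and methodology labels echo the Figure 5 example, but the capability sets are invented) treats each Desirability Matrix row and each Supported Matrix row as a set of marked capabilities and keeps the methodologies whose marks cover the category's marks:

    # Illustrative sketch of the match step for one software category.
    # A methodology is a candidate when its marks cover every capability marked
    # as desirable for the category. All capability sets are invented.
    desirability_row = {"state modeling", "data flow modeling", "timing performance"}
    supported = {"E": {"data flow modeling"},
                 "F": {"state modeling", "data flow modeling", "timing performance",
                       "prototyping"}}
    candidates = [m for m, caps in sorted(supported.items()) if desirability_row <= caps]
    print(candidates)   # -> ['F']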

3.2. Step 4 - Choose a Single Methodology from Candidates

This step winnows the candidate methodologies selected in step 3 by comparing them to an ideal methodology. The ideal methodology has been tailored to the life cycle phase(s), software category, and significance level of the software to be developed. It provides exactly the capabilities desired, at a level of support suggested by the significance level of the project. The level of support to be provided by the ideal methodology can be treated either uniformly or nonuniformly across the capabilities. A uniform treatment assigns to each desired capability the same support level as the overall significance level of the project. A nonuniform treatment assigns a separate support level value to each desired capability (constraining those values to be less than or equal to the project's overall significance level is a good strategy).


The set of capabilities desired in the ideal methodology is comprised of two sets: (1) the set of capabilities considered independent of life cycle phase and software category, and (2) the set of capabilities dependent on software category and life cycle phase. The second set is derived from the Desirability Matrix appropriate to the life cycle phase(s) in which the methodology is to be used.

The comparison of methodologies to the ideal methodology can be reduced to a comparison of Methodology Scores for the candidate methodologies. The computation of a Methodology Score (MS) proceeds differently depending upon uniform or nonuniform treatment of support level. Once the MS values for the candidates have been computed, the best-fit methodology will have the score closest to zero. An MS value greater than zero means the methodology provides more support power than nominally desired; an MS value less than zero means the methodology provides less support power than nominally desired.

3.2.1. Uniform support level. If each desirable capability is assumed to be given the same rating as the project's overall significance level, then the general formula for computing a Methodology Score (MS) is

MS = SUM - SL * COUNT,

where SUM is the sum of the methodology's ratings for the desired capabilities (other methodology capabilities are ignored), SL is the significance level, and COUNT is the cardinality of the set of desired capabilities. The first term of the formula [SUM] represents the support power the methodology provides. The second term [SL * COUNT] represents the support power an ideal methodology would provide at the given significance level with a uniform rating (the specific SL) per capability.

Notice that the correction factor for significance level 0 is 0. This corresponds to the realistic assumption that development of a project with the following characteristics does not need specification technology: (1) low, tight budget; (2) no criticality assignment; (3) tight schedule; (4) straightforward solution that is easy to check out; (5) few formal requirements; (6) expected use in a local environment as test or demonstration software; (7) no requirement to recover from anomalous conditions; (8) acceptance predicated solely on correct functionality; (9) not expected to be maintained; and (10) documentation confined to source code. This assumption may not be valid for other life cycle phases.

The guidelines we provided use the uniform treatment of support level. This allowed us to precompute the MS scores for each software category and for two life cycle cases (paths), reducing the final selection step to a table lookup. Path 1 assumes the methodology will be used in both the requirements and design specification phases; Path 2 assumes the methodology will be used only in the design phase.

3.2.2. Nonuniform support level. If the capabilities desired in the ideal methodology are treated nonuniformly, the Methodology Score is computed as

is the methodology’s rating for a particwherermethdology ular capability and ridcalis the rating given that capability in the ideal methodology. Note that the MS sum only includes the capabilities desired in the ideal. Thus, methodologies are not penalized for providing capabilities other than those nominally desired, which is reasonable since undesired capabilities can be ignored. The nonuniform support level approach is suitable for use by experienced managers since it effectively allows them to selectively weight capabilities.

4. CONCLUSIONS

We have introduced two concepts, methodology power and significance level, characterizing them as methodology-in-the-large considerations. We have described a scoring mechanism based on the notion of an ideal methodology, which allows us to merge conventional methodology-in-the-small considerations of software categorization and life-cycle requirements with the more practical methodology-in-the-large considerations.

The guidelines (1) accomplish the project objective of providing Air Force system and software development managers with an uncomplicated method for matching existing methodologies to project needs; (2) map the subjective evaluation of methodologies onto an objective framework, thus reducing dependency on specification technology expertise, a scarce resource; (3) describe a table-driven approach that is easily computerized; (4) supply a uniform support-level treatment that enables a project manager of limited experience to select a specification methodology; and (5) supply a nonuniform support-level treatment that allows the experienced project manager to apply knowledge gained from past projects to the selection process.


REFERENCES

1. E. Yourdon and L. L. Constantine, Structured Design, Yourdon Press, New York, 1975.
2. D. T. Ross and K. E. Schoman, Jr., Structured Analysis for Requirements Definition, IEEE Trans. Software Eng. SE-3, 6-15 (January 1977).
3. M. W. Alford, A Requirements Engineering Methodology for Real Time Processing Requirements, IEEE Trans. Software Eng. SE-3, 60-69 (1977).
4. D. R. Addleman, M. J. Davis, and P. E. Presson, Specification Technology Guidelines, RADC TR-85-135, Rome Air Development Center, Rome, NY, 1985.
5. D. R. Addleman, M. J. Davis, and P. E. Presson, Specification Technology Guidelines Final Report, RADC TR-85-0075, Rome Air Development Center, Rome, NY, 1985.
6. F. DeRemer and H. Kron, Programming-in-the-Large Versus Programming-in-the-Small, IEEE Trans. Software Eng. SE-2, 80-86 (June 1976).
7. P. E. Presson and V. Thomas, Software Test Handbook Final Technical Report, RADC TR-84-53, Rome Air Development Center, Rome, NY, 1983.
8. C. W. McDonald, W. Riddle, and C. Youngblut, Methodman II, Institute for Defense Analyses, Alexandria, VA, November 1984.