Software design for CAD

Keith D Baker

The concepts of structured programming are discussed in relation to the overall software design process. A methodology is described which allows a rigorously disciplined approach to be made to software design. Experience has shown that this leads to a more manageable product which can subsequently be modified and extended. Further, the self-documenting style improves the understanding of the program function. The introduction of an algorithm description language eases the transition from program specification to implemented code. The well formed structure imposed on the design reduces complexity. These factors are especially important when the implementation is at assembly level. Although this is applicable to all areas of software design, an elementary example of a graphics application has been used to illustrate the methodology.

It has become a fact of life that any software product remaining in use is subject to a process of continuous change. There are several reasons for these changes. First, the initial implementation will contain errors which may be attributed directly to the coding process. Removing these errors is a relatively painless task, eased by the program development aids available if the implementation is in a high-level language. Unfortunately, the majority of errors brought to light during the implementation and testing stage of the product are of a logical nature. The operation of the system does not meet the original specification, assuming such exists, or unpredictable events happen from time to time, causing a discontinuity in operation or a complete system failure. In the process of removing logical errors, further errors are uncovered. Removing these second-order errors often causes further errors to be thrown up, and so on. If the functional specification remains constant, then it can be expected that this process will be convergent.
However, the fact that an error does not occur under a given set of operational circumstances in no way guarantees its absence from the system; it is merely waiting for the appropriate combination of circumstances in order to happen! The previous assumption of a constant functional specification for a product is not acceptable. In practice, all but the very smallest of software developments represent a substantial investment on the part of the manufacturer, and so there are considerable economic pressures for a product to be modified or adapted from one application to another. The functional specification may have to be extended to meet the requirements of a new user. If this involves major functional changes, a severe degradation in system performance can be expected. Further, it is likely that extensions and modifications to a system will be

School of Engineering and Applied Sciences, University of Sussex, Brighton, Sussex, UK
volume 9 number 4 october 1977
carried out by programming teams who were not responsible for the initial design. It has been estimated 1 that 75% of the current programming community are employed on a program-maintenance activity, and the percentage is increasing. It is, therefore, instructive for all concerned with the development of software systems to consider the findings of a current research program directed at the analysis of the programming process. Lehman et al 2-4 have carried out detailed studies of a number of software products extending over a period of several years. In this work, a number of global variables have been identified that allow the programming process and the superimposed project-management process to be modelled as the interaction between two complex dynamic systems. The programming process itself is seen as one of evolution constrained by its environment. By considering some of the observations made in these studies, it is possible to gain greater appreciation and understanding of the principles underlying current trends in software-design methodology. The work reported has concentrated on three software products whose sizes span some two orders of magnitude, from a large multipurpose operating system and a transaction-processing system to a small executive system. The first two systems were implemented at assembly level, while the last made use of a high-level language. In each case, the manufacturer and environment were different, and yet the similarity of the observations implies a common set of underlying processes. Each system has been quantified by introducing measures of size, time scale and complexity. Size is measured by the number of modules of code, and, although no precise definition of module exists, many systems are designed by subdivision into blocks of code of convenient size. The time scale is measured in terms of the release sequence number.
The choice of release sequence number provides a useful link with the emergence of a new functional specification of the system. Between one release and the next, work is carried out on the system to extend its capability. During this time, some of the modules are modified or extended. A change in one module may require changes in others, and hence a modification to the functional specification may require several modules to be handled. Lehman uses this to quantify the complexity of the system. That is, complexity is related to the connectedness of the system, and connectivity may be measured by the interaction between modules. It is therefore intuitive to expect that a modification carried out on a highly connected system is likely to require changes in many modules, whereas, in contrast, we would expect correspondingly fewer changes to a less connected system. Only those observations relevant to this paper are reported here. For further details, the reader is directed to the original work. As would be expected, the size of
each product was seen to increase with release sequence number. An average natural increment was observed to occur between releases. Increases above the average between two releases caused a subsequent decrease at the next release. For the largest product studied, it was found that, in every case where the growth rate greatly exceeded the natural average, a substantial decrease in reliability and performance resulted. Similarly, the number of modules handled between releases was seen to increase. If the complexity of the system was related to the fraction of modules handled, the increase in complexity was seen to be quadratic. Further, the increase in complexity was seen to be the reason for the apparent limit to growth of the product despite the availability of resources to implement the required changes. Further extension of the product capability required complete restructuring of the design. To summarize, since the specification of a software product is generally incomplete, inconsistent and ambiguous, the eventual product is subject to changes. Further, because of the capital investment it represents, there are strong economic factors requiring the continual modification and extension of the product. In general, the changes carried out lead to an increase in complexity which eventually produces an unmanageable system.
SOFTWARE DESIGN

It is usual for software production to pass through the stages of specification, design, implementation and test in a strict sequence. To produce a design free of logical errors, it is necessary to remove all ambiguities and inconsistencies from the specification. However, in order that the specification should convey sufficient information for the product to satisfy the predetermined requirements, it must be couched in terms that are natural to the application it describes. Natural language provides the flexibility required for the description of a set of requirements. Unfortunately, in this sense, natural language is full of ambiguities, allowing many inconsistencies to be included. Because of this, the whole design process cannot be sequential, but must rather be iterative. The starting point is the natural-language specification. On completion, this specification is translated to a coded representation suitable for execution on a computer. The language of the implementation may be either a high-level language or an assembly language. The latter is currently predominant in the industry. In comparison with the richness of expression available in the natural language, the implementation language contains severe constraints, allowing only precise statements at a relatively low level. This is one of the major difficulties in the design process. On the one hand is the description of the required functions in a high-level natural language, and on the other is the low-level implementation of the algorithms. The two descriptions of the same set of functions are orders of magnitude apart. For most people, the intellectual task of translating the specification language to the implementation language in a logical, error-free manner is extremely difficult. This was first recognized by Dijkstra 5, and later analysed by Wirth 6.
To reduce the difficulty, it is necessary to divide the task into a number of intellectually manageable subtasks to bridge the gap between the two extremes of descriptions. The requirements of the software design process may now be stated. In the absence of an analytic theory to support
Figure 1. Conceptual view of the stepwise refinement process
Figure 2. Modification of the design showing decomposition to be replaced
computer-aided design
the process, a disciplined methodology is necessary. Because the process is iterative, and also because of economic factors, there is a very strong requirement for the design to be modifiable. That is, throughout the design phase and its operational life, the structure of the product must allow for easy modification. Any modification first necessitates an understanding of the design. It must also be remembered that the modification will, in general, be carried out by people not involved with the original design. The danger of subjective interpretation of poorly understood designs is very real. In consequence, the quality of the design becomes very dependent on the history of the modifications carried out. The resulting increase in complexity detracts from the understanding of the evolving system.
Top-down stepwise refinement

The methodology that has become known as structured programming stems from the work of Dijkstra 5 and Wirth 6. With this method, the design of software becomes a rigorously ordered and staged transition from the high-level specification to the low-level implementation. The initial specification is divided into a number of functionally related subsystems. Each subsystem has its associated specification which itself is a subset of the total system specification. The reasons for a specific subdivision being adopted should be completely documented as part of the design process. Each subsystem may be considered to contain functionally related tasks such that the degree of interaction between these tasks is greater than that between tasks of different subsystems. Proceeding further, each task is divided into a number of steps which specify the processing to be carried out. It is important to specify the processing involved at each step in a language appropriate to the level of detail being considered. No attempt to use a specific computer language should be made until the refinement of a particular step is sufficiently detailed to allow an almost one-to-one correspondence between the description of a substep and a statement in the language. The top-down refinement process structures the design according to a well defined hierarchy. Conceptually, the process may be represented as shown in Figure 1, where three levels of refinement are considered. At the heart of the method is the need to divide the design into easily comprehensible units. This not only helps to ensure that the program does what it was supposed to do at the outset, but it also demonstrates the fact to other programmers. A program that can be understood stands a far greater chance of being modified correctly. Stepwise refinement, in fact, forces the design into a rigorous modifiable structure.
Where a modification or extension of the specification is necessary, the appropriate nodal point in the decomposition is identified, as shown in Figure 2, and the subsequent decomposition steps are replaced by those pertinent to the new requirement. In a production environment, well documented programs are essential. From this point of view, the stepwise refinement process is substantially self-documenting. The important point is that the documentation is constructed as part of the design process. It therefore forms a complete integrated description of the product rather than a sketchy summary constructed after implementation of the algorithm. In principle, a new programmer joining an established team should be in a position to carry out an extension to an existing program.
Allowed control structures

At some stage in the decomposition process, it is necessary to introduce a programming language. It has been advocated that this should be left to the time when the transition can be made on a one-to-one correspondence basis. This eases the transition process, but delays further the introduction of constraining factors on the design. Further, if the implementation is to be at assembly level, the transition will be later than if the implementation is to be in a high-level language. Ultimately, the steps in any algorithm are constrained by the instruction set of the processor on which it is to be executed. Therefore, eventually, the computation must be expressed in terms of assignments, data movements, conditional and unconditional branches etc. Programming at assembly level allows the programmer freedom to use these primitives with little constraint. However, the primitives available in a high-level language can be either general-purpose or tailored to a specific application. In either case, restrictions on the data types, assignments, flow of control etc., relative to those available at assembly level, are placed on the programmer. Of particular interest are the control structures. Some assist the program design process, and others increase the complexity, leading to an unmanageable design. To decide which control structures should be used and which should be avoided, consider what is necessary to maintain the structure enforced by the stepwise refinement process. Taking a flowgraph approach, the decomposition of a node may be represented as in Figure 3. Each node, or box in the flowgraph, is decomposed using the primitives available at that level in the decomposition. Since the decomposition replaces the statement represented by the box, and since the latter has a unique entry and exit, the decomposition steps must have a unique entry and exit.
For this to be so, the control structures employed in the decomposition must be restricted to those having a single entry and a single exit point. In this way, the objective of modifiability of the program is maintained, since the plug-in nature of the decomposition steps is retained. Those control structures allowing multiple entry and multiple exit points destroy the desired structure.
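The plug-in property can be sketched in C (the routine and its names are hypothetical, introduced only for illustration): each decomposition step becomes a function with exactly one entry and one exit, so its body can later be refined or replaced without disturbing the enclosing sequence.

```c
#include <assert.h>

/* A hypothetical decomposition step: one entry, one exit.  The caller
   is unaffected if this body is later refined or replaced, because
   control always enters at the top and leaves by the single return. */
static int scale_reading(int raw)
{
    int scaled;                 /* single exit via the one return below */
    if (raw < 0)
        scaled = 0;             /* clamp negative readings */
    else
        scaled = raw * 2;       /* nominal scaling */
    return scaled;              /* unique exit point */
}

/* The enclosing sequence treats the step as a black box. */
static int process(int raw)
{
    return scale_reading(raw) + 1;
}
```

Because `scale_reading` has a unique entry and exit, substituting a more refined body for it leaves `process` untouched, which is precisely the modifiability argument made above.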
Figure 3. Decomposition of a node, (a) single-entry, single-exit, (b) single-entry, multiple-exit
Figure 4. Basic control structures generating nonstructured programs
In principle, any single-entry, single-exit control structure may be allowed in the decomposition process. However, it has become customary to consider the basic set of three structures, namely,

SEQUENCE
IF-THEN-ELSE
WHILE-DO or REPEAT-UNTIL

as being fundamental in the construction of structured programs. In fact, it was shown by Bohm and Jacopini 7 over a decade ago that these control structures are sufficient to express any flowgraph algorithm. High-level languages containing these structures are therefore those most suited to the task of structured programming. When the implementation language is at assembly level, then, of course, the control structures are designed by the programmer in the course of designing the algorithm. Because of the flexibility available at this level, nonstandard control flows can be implemented. Some restraint on these practices is necessary if we are to maintain the previously stated objectives. This can be accomplished by the introduction of artificial allowed control structures during the decomposition before the final transition to assembly code. In effect, an algorithm description language (ADL) is used which contains the basic set of control structures. The algorithm description language need not be a real computer language; it merely serves the purpose of introducing restraining factors into the design. On the other hand, real computer languages can profitably be used to describe an algorithm that must eventually be translated into an assembly language. Suppose, for instance, the algorithm is expressed in a language for which a compiler and machine are available. This machine is distinct from the target machine on which the eventual assembly-code algorithm is to be implemented. The decomposition can therefore proceed to the high-level language stage and be tested on the host machine. Subsequently, the decomposition may be continued to make the translation to the assembly code for the target machine.

As well as knowing the set of control structures suitable for the development of structured programs, it is also useful to know the types of structure that destroy the objectives. It has been pointed out 8 that the basic set of control structures needed to generate nonstructured programs is that given in Figure 4. McCabe has defined a measure of complexity of a flowgraph related to the number of nodes and edges. It has been shown that the complexity increases with the connectedness of the flowgraph and with the unstructuredness of the design. Comparable conclusions have been reached by Martin 9 when considering the various types of control structures taking two, three, four etc. nodes at a time. He found that no new control structures of higher order than those shown in Figure 4 can be constructed without a sharp increase in flowgraph complexity. It was concluded that a natural break in the complexity of these structures exists. Even so, the simple 2- and 3-component structures can be used to express the more complex control flow arrangements without the introduction of further control variables. Using McCabe's definition of complexity, which relates to the intuitive notion of complexity, it can be shown that, by consistently using the single-entry, single-exit control structures, a program can be reduced to one of unit complexity, whereas an unstructured program cannot. The complexity is related to the number of paths through the program. This is important during the testing phase, when all possible paths resulting from the allowed input data set must be examined.
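McCabe's published measure for a flowgraph with E edges, N nodes and P connected components is V(G) = E - N + 2P. The small C sketch below merely evaluates that formula (the function itself is an illustration, not code from the paper); a straight-line sequence of five nodes and four edges has unit complexity, while a simple IF-THEN-ELSE (four nodes, four edges) has complexity 2.

```c
#include <assert.h>

/* McCabe's cyclomatic complexity: V(G) = E - N + 2P for a flowgraph
   with E edges, N nodes and P connected components. */
static int cyclomatic(int edges, int nodes, int components)
{
    return edges - nodes + 2 * components;
}
```

For a single connected program (P = 1), each single-entry, single-exit refinement collapses back to one node, which is why consistently structured programs reduce to unit complexity.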
ILLUSTRATIVE APPLICATION
Consider a situation where graphics hardware provides for interaction with a human operator via a VDU terminal, a storage display and a set of analogue inputs obtained from potentiometers. A program is to be designed to provide the facility for an operator to build interactively a scene composed of one or more rectangular solid objects. The size and position of each object are to be defined via the analogue inputs. Two modes of display are to be available, allowing the scene to be viewed in either plan or perspective projections on the storage display. The operator must be able to select repeatedly either mode, and within each mode a further set of optional functions may be selected. The operator is able to direct the processing via commands entered at the keyboard. Although this is only a brief description of the function of the program, this is taken as the starting point for the design process. At the highest level in the decomposition, the following steps may be identified:
• initialize the system,
• construct the required scene, object by object, in plan mode or perspective mode or both until the scene is complete,
• terminate the program.
To proceed to the next level of decomposition, each of these steps is taken in turn and further subdivided. Only the decomposition of the second step is discussed here.
Construct the scene
• Determine mode of operation requested by the operator.
• Determine processing requested by the operator.
• Carry out requested processing.
• Continue servicing operator requests until mode termination received.

Each of the above steps is now separately decomposed.

Determine mode of operation
• Get command from keyboard.
• Analyse command.
• Identify required mode.
• Initialize mode.

Determine processing requested
• Get command from keyboard.
• Analyse command.
• Identify requested processing.
Carry out requested processing
• Specify new object, or
• display all objects, or
• display all objects and modify specified one.

Continue processing unless mode terminates
• Iterate until a mode terminates.

It has not been mentioned whether the final program is to be implemented at assembly level or in a high-level language. The decompositions of the first two steps are sufficiently well defined to allow a transition to an ADL or high-level language. Further decomposition of the steps comprising the third step is necessary to define the processing to be carried out.

Specify new object
• Obtain object identification.
• Construct internal representation of object.

Display all objects
• Select an object.
• Transform to screen coordinates.
• Remove hidden lines.
• Display on screen.
• Repeat for remaining objects.

Display all objects and modify one
• Specify object to be modified.
• Select an object.
• If specified object then modify.
• Transform to screen coordinates.
• Remove hidden lines.
• Display on screen.
• Repeat for remaining objects.

Each of the above substeps must be further divided. It can be seen that a point is reached where some decision concerning the structure of the data must be made. First, a data structure to represent a single object is necessary, and, second, the data pertaining to each object in the scene must be represented. Notice that, once a data structure has been decided, all subsequent processing of the data is constrained by the representation chosen. Conversely, if the data of the problem exhibit a definite structure or relationship between data items, then the algorithm processing these data should be similarly structured. This is the point stressed by Jackson 10, and it allows easier program modification upon a modification of the data. For the sake of brevity, the discussion of alternatives that is necessary at each level in the decomposition has been excluded. A fully documented design should contain such a discussion, giving the reasons why the particular decomposition was selected. If the steps listed in each refinement are carried through into the final listing of the program as comments, then these provide a necessary cross-reference aid between listing and the design documentation. Such a device is of great assistance when modification of the program is necessary. The ADL constructs, which are restricted to the three structures SEQUENCE, IF-THEN
- ELSE
WHILE - DO are now introduced. The algorithm is iterative, continually processing commands injected by the operator. This applies to the overall processing and the processing within a selected mode. The processing within the mode will continue until the operator enters a mode-terminate command. At the level of detail shown in Figure 5, it is a relatively easy task to translate the algorithm to a pseudo-
Construct the scene
  Determine the mode of operation
    Get command from keyboard
    Analyse command
    Identify required mode
    Initialize mode
  Determine processing requested
    Get command from keyboard
    Analyse command
    Identify requested processing
  Carry out requested processing
    Specify new object, or
      Obtain object identification
      Construct internal representation of object
    Display all objects, or
      Select an object
      Transform to screen coordinates
      Remove hidden lines
      Display on screen
      Repeat for remaining objects
    Display all objects and modify one
      Specify object to be modified
      Select an object
      If specified object then modify
      Transform to screen coordinates
      Remove hidden lines
      Display on screen
      Repeat for remaining objects
  Continue until mode terminated

Figure 5. Partial decomposition of the construct-the-scene step
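The data-structure decision flagged above can be sketched in C as one hypothetical possibility (none of these names come from the paper): a rectangular solid represented by a position and three edge lengths, and the scene by a fixed-size table with a count, echoing the MAX.NO.OBJECTS and CURRENT.OBJECT.NO variables used later in the decomposition.

```c
#include <assert.h>

#define MAX_NO_OBJECTS 16

/* One rectangular solid: position of one corner plus the three edge
   lengths, all set from the analogue inputs. */
struct solid {
    double x, y, z;      /* position */
    double dx, dy, dz;   /* size     */
};

/* The whole scene: a table of objects and a count. */
struct scene {
    struct solid objects[MAX_NO_OBJECTS];
    int count;
};

/* Add an object; returns 0 on success, -1 if the table is full. */
static int add_object(struct scene *s, struct solid obj)
{
    if (s->count >= MAX_NO_OBJECTS)
        return -1;
    s->objects[s->count++] = obj;
    return 0;
}

/* Sketch: build a two-object scene and report its size. */
static int demo_scene_size(void)
{
    struct scene s = { .count = 0 };
    struct solid a = { 0, 0, 0, 1, 1, 1 };
    struct solid b = { 2, 0, 0, 1, 2, 1 };
    add_object(&s, a);
    add_object(&s, b);
    return s.count;
}
```

Once such a representation is fixed, every later step (TRANSFORM, HIDDEN.LINE and so on) is constrained by it, which is exactly the point made in the text.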
%   DETERMINE MODE OF OPERATION
GET.COMMAND
ANALYSE.COMMAND
IF PERSPECTIVE.MODE THEN
BEGIN
    INITIALIZE.MODE
%   DETERMINE PROCESSING REQUESTED
    GET.COMMAND
    ANALYSE.COMMAND
%   CARRY OUT REQUESTED PROCESSING
    IF SPEC.NEW.OBJECT THEN
    BEGIN
        OBTAIN.OBJECT.ID
        CONSTRUCT.INT.REP
    END
    ELSE IF DISPLAY.ALL.OBJECTS THEN
    BEGIN
        WHILE ANOTHER.OBJECT DO
        BEGIN
            SELECT.OBJECT
            TRANSFORM
            HIDDEN.LINE
            DISPLAY
        END
    END
    ELSE IF DISP.ALL.MODIFY.ONE THEN
    BEGIN
        WHILE ANOTHER.OBJECT DO
        BEGIN
            SELECT.OBJECT
            IF SPEC.ONE THEN MODIFY
            TRANSFORM
            HIDDEN.LINE
            DISPLAY
        END
    END
    ELSE IF MODE.TERM THEN TERMINATE.MODE
END
ELSE IF PLAN.MODE THEN
BEGIN
    INITIALIZE.MODE
%   DETERMINE PROCESSING REQUESTED
    GET.COMMAND

Figure 6. Decomposition in ADL form
ADL, as shown in Figure 6. The selection of the processing carried out within a mode has been accomplished using the binary selection IF-THEN-ELSE repeatedly. In some cases, it would be more appropriate to use the CASE construct. This is especially so if the algorithm is to be coded in a high-level language containing the CASE facility. The inclusion of comments (signified by %) provides the previously stated identification points in the program listing. In the pseudolanguage, the program is still in a readable form. The predicates appearing are not necessarily simple Boolean variables, although they must produce such a value. They may, for instance, require the execution of a procedure to produce their value (TRUE, FALSE). However, it is now a reasonably straightforward task to translate the algorithm in its present form to a high-level language available on the target machine. If the language has constraints concerning the possible data structures, these are encountered at this stage. Any subsequent decomposition is bound by these constraints. Also, any subsequent iteration in the design, and several may be necessary, will involve backtracking for one or more levels and redoing the decomposition. Suppose, however, that the implementation is to be made at assembly level. The need to superimpose a structure on the code is even more important in this case. The unconstrained design encourages the use of multiple-entry, multiple-exit control structures whose range can encompass widespread regions of the program. In many cases, the inclusion of these effects is not intended by the programmer, leading to unknown, and hence untested, paths through the algorithm. The problems encountered when modifying such programs are painfully apparent to all who have been faced with such a task. This applies even more when the program has been designed by another person and, in the worst case, consists of a monotonous sequence of seemingly never-ending assembly statements.
Superimposing the ADL structure allows sections of code to be understood, i.e. their function is explained in the ADL statements. Further, sections of code may be modified or replaced completely without unforeseen consequences to other parts of the program, because the essential single-entry, single-exit nature of the structures is maintained.
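For comparison, the carry-out-requested-processing section of Figure 6 might read as follows in a high-level language. The command codes and display routines below are hypothetical stubs, so only the single-entry, single-exit control skeleton is shown; the function returns the number of objects displayed as a stand-in for the real side effects.

```c
#include <assert.h>

/* Hypothetical command codes and stubs; real routines would read the
   keyboard and drive the storage display. */
enum command { SPEC_NEW_OBJECT, DISPLAY_ALL_OBJECTS,
               DISP_ALL_MODIFY_ONE, MODE_TERM };

static void select_object(void) { /* fetch the next object's data */ }
static void transform(void)     { /* to screen coordinates        */ }
static void hidden_line(void)   { /* remove hidden lines          */ }
static void display(void)       { /* draw on the storage display  */ }

static int carry_out(enum command cmd, int n_objects)
{
    int displayed = 0;
    if (cmd == SPEC_NEW_OBJECT) {
        /* obtain object identification, construct internal representation */
    } else if (cmd == DISPLAY_ALL_OBJECTS) {
        int i = 0;
        while (i < n_objects) {          /* WHILE ANOTHER.OBJECT DO */
            select_object();
            transform();
            hidden_line();
            display();
            displayed++;
            i++;
        }
    } else if (cmd == DISP_ALL_MODIFY_ONE) {
        int i = 0;
        while (i < n_objects) {
            select_object();
            /* IF SPEC.ONE THEN MODIFY */
            transform();
            hidden_line();
            display();
            displayed++;
            i++;
        }
    } else {
        /* MODE.TERM: terminate the mode */
    }
    return displayed;
}
```

In a language with a CASE (or switch) construct, the IF-ELSE-IF chain would collapse to a single multi-way selection, as the text notes.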
To proceed from the algorithm description given in Figure 6 to an assembly-level description, several more levels of decomposition are required. Each ADL statement must be decomposed into finer and finer steps until an almost one-to-one correspondence with the proposed assembly-language statements may be made. In doing this, the finer allowed structure is maintained by the use of the three appropriate control structures. It is not proposed here to continue the stepwise refinement to an assembly language. However, the process for a small part of the ADL form is shown. Suppose we consider the statement

WHILE ANOTHER.OBJECT DO

The predicate ANOTHER.OBJECT must necessarily take the values TRUE or FALSE when evaluated. The decomposition may proceed as follows:

ANOTHER.OBJECT
BEGIN
• If the CURRENT.OBJECT.NO greater than MAX.NO.OBJECTS
• THEN predicate is false
• ELSE predicate is true
END

If the CURRENT.OBJECT.NO greater than MAX.NO.OBJECTS
• Locate MAX.NO.OBJECTS
• Locate CURRENT.OBJECT.NO
• Subtract CURRENT.OBJECT.NO from MAX.NO.OBJECTS
• Test the result

THEN predicate is false
• If result negative set predicate false
• Go to END of test

ELSE predicate is true
• If result zero or positive set predicate true

At this stage, it is possible to translate the ADL statement into the appropriate assembly code. The structure is
%       IF CURRENT.OBJECT.NO greater than MAX.NO.OBJECTS
        LXI H, MAX.NO.OBJECTS      ; locate MAX.NO.OBJECTS
        MOV A, M
        LXI H, CURRENT.OBJECT.NO   ; locate CURRENT.OBJECT.NO
        SUB M                      ; Max.No.Objects - Current.Object.No.
        JP TRUE                    ; positive result
%       THEN predicate is false
        LXI H, ANOTHER.OBJECT      ; set value of ANOTHER.OBJECT
        MVI M, 00H                 ; to zero
        JMP END
%       ELSE predicate is true
TRUE:   LXI H, ANOTHER.OBJECT      ; set value of ANOTHER.OBJECT
        MVI M, 01H                 ; to one
END:

Figure 7. Decomposition of the predicate ANOTHER.OBJECT
maintained by including the ADL as comments. The form of the code is as shown in Figure 7. The variable ANOTHER.OBJECT takes the values 1 or 0, and this will be tested by the portion of assembly code to implement the WHILE-DO construct. Although this is only a simple indication of the form of the complete code listing, it is still possible to read the listing and understand the various steps in the algorithm. Clearly, the decomposition of statements such as TRANSFORM and HIDDEN.LINE will involve a large number of assembly-language statements. It is maintained, however, that, by adopting the suggested methodology, a manageable product is produced.
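In a high-level language, the same predicate collapses to a single signed comparison, which is what the subtract-and-test sequence of Figure 7 computes (the C function below is a sketch for comparison, not part of the original design; names follow the ADL).

```c
#include <assert.h>

/* ANOTHER.OBJECT: 1 (true) while the current object number has not yet
   passed the highest object number, 0 (false) otherwise.  The sign test
   on the difference mirrors the SUB / JP sequence of Figure 7. */
static int another_object(int current_object_no, int max_no_objects)
{
    return (max_no_objects - current_object_no) >= 0;
}
```

The assembly version needs five instructions and a flag test to compute what one comparison expresses here, which is why the superimposed ADL commentary matters most at assembly level.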
CONCLUSIONS

The cost of and demand for software are currently increasing. In the absence of automated software design techniques, the economics of the situation demands that software should be modifiable and extendable. Further, there is a growing need for increased software reliability. The influence of computers in everyday life has reached the level where confidence in the correctness and reliability of the automated function is becoming psychologically important. The production of reliable error-free software is currently a difficult task. The subsequent modification or extension of the software can be even more difficult. The analysis of software projects has shown that most errors can be attributed to the logical design process. The cost of removing errors from a software product is very dependent on the stage of development reached in the life cycle before the error is detected. Errors detected in the design stage are far less costly to correct than those detected when the product has been produced and released. Logical errors in the design process can be attributed to the complexity of translating a program specification into the final coded form. By adopting a strict disciplined methodology, it is possible to reduce significantly the number of errors introduced. This is accomplished by adopting a top-down stepwise refinement of the overall specification. At each stage in the process, the design is divided into a number of intellectually manageable tasks. The conceptual transition between specification and code is thus divided into a number of smaller conceptual transitions between each level in the decomposition. A hierarchical structure is forced on the design. The modification and extension of an existing program are greatly simplified. The relevant part can be easily identified and extracted without fear of unforeseen repercussions in distant parts of the code.
In effect, the program function can be understood, and, further, it becomes meaningful to other people. The development of any
scientific discipline requires that new work should be based on the results of others. An essential part of the proper development of software engineering is the production of
comprehensible software. Finally, a great improvement in documentation is obtained, since it is raised from an afterthought to part of the creative design phase. The method is equally applicable to cases of high-level language and assembly-level language developments. In the latter, the superimposed structure greatly enhances the development. Moves towards the production of specification languages are being made in software engineering research. Programs are currently available for structuring flowgraphs, and these have been applied to general FORTRAN programs. However, we are still a long way from automated software production. In the meantime, any development to support the production of logically correct and reliable software will help to bring software engineering to maturity.

REFERENCES

1 Mills, H D 'Software development' IEEE Trans. Software Eng. Vol SE-2 No 4 (1976)
2 Lehman, M M and Parr, F N 'Program evolution dynamics and its role in software engineering and project management' European Computing Conference on Software System Engineering (1976)
3 Lehman, M M and Parr, F N 'Program evolution and its impact on software engineering' 2nd International Conference on Software Engineering, San Francisco (1976)
4 Belady, L A and Lehman, M M 'The evolution dynamics of large programs' IBM Research Report, Yorktown Heights, New York (1975)
5 Dijkstra, E W 'Notes on structured programming' EWD 249, Technical University of Eindhoven, Netherlands (1969)
6 Wirth, N 'Program development by stepwise refinement' Commun. ACM Vol 14 (1971) pp 221-227
7 Bohm, C and Jacopini, G 'Flow diagrams, Turing machines and languages with only two formation rules' Commun. ACM Vol 9 No 5 (1966)
8 McCabe, T 'A complexity measure' IEEE Trans. Software Eng. Vol SE-2 No 4 (1976)
9 Martin, J J 'The natural set of basic control structures' ACM SIGPLAN Notices (1973)
10 Jackson, M A Principles of Program Design Academic Press (1975)