Copyright © IFAC Safecomp '83, Cambridge, UK, 1983

BUILDING RELIABLE SOFTWARE FOR SPACELAB

P. J. Robinson and R. K. Atkins

European Space Agency, Noordwijk, The Netherlands

Abstract. This paper describes the methods that have been used to build quality and reliability into Spacelab software. This large software project was run by a consortium consisting of 5 companies in 4 countries, working with staff of 9 nationalities, to produce software which is now being used to control the European designed and built space laboratory Spacelab, which forms part of NASA's Space Transportation System. The software has now been delivered to NASA in the USA; it has passed Design Certification Review and it is being used at Kennedy Space Centre to prepare for the launch of the first Spacelab mission in September 1983 in the Space Shuttle.

Keywords. Computer software; reliability; quality assurance; software verification; space-vehicle software.

INTRODUCTION

This paper describes the Software Quality Assurance Programme in the environment of this project.

Spacecraft software has to be reliable, whatever the extra cost involved. When the spacecraft is manned, the need is obvious. But the success of the mission as well as extensive and complex spacecraft testing also depend on the ability of the software to continue performing without serious failure for many months. Software for Spacelab, a European space laboratory, was therefore built using the full Software Engineering approach (requirements, design, code, test and verify) with the support of a Software Quality Assurance programme to monitor and audit for adherence to standards, tracking of non-conformance, verification and acceptance.

THE SPACELAB SYSTEM

Spacelab is one of the principal payload carriers for NASA's Space Transportation System (STS). The European Space Agency (ESA) is responsible for the design, development, manufacture and delivery of 2 flight units and supporting equipment, including flight and ground software, developed through a contract with a European industrial consortium led by MBB/ERNO of Bremen, West Germany, as prime contractor.

Fig. 1. Spacelab


The first flight of Spacelab is scheduled for September 1983. The first basic configuration, a manned laboratory offering space research facilities to experimenters (see Fig. 1), will spend seven days in low earth orbit in the cargo bay of the Space Shuttle. The second configuration, that of an unmanned pallet train, on which is mounted a sophisticated Instrument Pointing System (IPS), is scheduled to be launched in 1984 for a similar mission. The pallets can be used independently, or in conjunction with the manned module. Inherent in the Spacelab design is reusability, with a lifetime of 50 missions of typically 7-9 days duration spread over 10 years.

Spacelab flight hardware comprises structure, environmental control, electrical power and distribution, Command and Data Management (CDMS) and experiment support facilities. Spacelab software is provided for the operation of Spacelab on-orbit using the CDMS on-board computers (dedicated to subsystems and experiments), and for ground check-out via the Electrical Ground Support Equipment (EGSE). Spacelab software consists of 8 subsystems as defined in Table 1. The first five software subsystems were integrated progressively at the prime contractor prior to delivery to NASA, who then incorporate the remainder into mission specific configurations.

The total effort involved in producing this set of software was more than 400 man-years, primarily because of the interfaces between software and hardware, the interfaces between software and software, the interfaces between contractors, and the lack of off-the-shelf tools from the start of the project.

SOFTWARE QUALITY ASSURANCE ON SPACELAB

A major activity, and contributor to the success of the programme, was the Software QA activity. The objectives of this activity were to ensure that:

1. the product conformed to detailed software program standards,
2. all Spacelab software was verified in accordance with the software requirements specifications,
3. all software accepted by ESA was internally consistent, complete, released, tested, verified and properly identified.

To accomplish this task, the Software QA programme was conducted in accordance with the ESA approved Software QA plan, which covered the following tasks across the multinational consortium:

a) detailed sub-planning by each subcontractor,
b) establishment of software standards and procedures,
c) requirements and documentation reviews to ensure consistency and testability,
d) Unit Development Folder (UDF) auditing to review coding standards and development test results,
e) software baseline control monitoring to preclude unauthorised changes,
f) configuration control,
g) software problem reporting and processing,
h) participation in Software Control Boards,
i) verification test/plan/procedure review,
j) verification test witnessing,
k) verification test results review,
l) acceptance data package preparation and review.

Software Title                        Contractor        Language          Computer
EGSE operating system                 BTM, Belgium      Assembler, HAL/S  Mitra 125
GOAL interpreter                      Kampsax, Denmark  HAL/S             Mitra 125
Subsystem computer operating system   Matra, France     Assembler         Mitra 125 MS
Data reduction                        Kampsax, Denmark  Fortran           Mitra 125, IBM 370
Support software                      ERNO, Germany     PL/I              IBM 370
Experiment computer operating system  IBM, USA          Assembler         Mitra 125 MS
IPS software                          Dornier, Germany  HAL/S             Mitra 125 MS
Experiment software                   Many              Assembler         Mitra 125 MS

Table 1

REQUIREMENTS DEFINITION

One of the first major tasks of the project was to define, in detail, the Software Requirements Specifications, derived from the overall Spacelab System Requirements kept under the joint control of ESA and NASA. Through successive refinement, lower level software requirements were formulated for on-board software, EGSE software, support software and the RTSF. These requirements were reviewed by members of the software consortium, ESA and NASA, and were finally baselined early in 1978. Each requirement was uniquely identified for traceability, and after duplicates and unverifiable requirements were removed, a Verification Control Document (VCD) was created for each requirements specification to identify the verification method (review, analysis, module test or integration test) and the corresponding test procedure. The VCD was then used to track, record and control the individual verification of each requirement.
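The VCD was a controlled document rather than a program, but its record structure is easy to restate. The following minimal sketch (in Python, with field names invented for illustration; the paper does not define any) shows one VCD entry and the closure check the document supported, using data modelled on the example discussed with Fig. 2 below:

```python
from dataclasses import dataclass
from enum import Enum

class Method(Enum):
    REVIEW = "review"
    ANALYSIS = "analysis"
    MODULE_TEST = "module test"
    INTEGRATION_TEST = "integration test"

@dataclass
class VcdEntry:
    requirement_id: str   # unique identifier, e.g. "3.2.2.11.2 [A]"
    method: Method        # agreed verification method for this requirement
    procedure: str = ""   # test procedure or analysis document that verifies it
    closed_on: str = ""   # date the Test Review Board closed the verification

def open_requirements(vcd):
    """Return the requirements whose verification is not yet closed."""
    return [e.requirement_id for e in vcd if not e.closed_on]

# Hypothetical entries: one requirement closed at system level, one still open.
vcd = [
    VcdEntry("3.2.2.11.2 [A]", Method.INTEGRATION_TEST, "Test Case 2.2", "17.9.80"),
    VcdEntry("3.2.2.11.2 [E]", Method.ANALYSIS, "MIN-ER-70-4-81"),
]
print(open_requirements(vcd))  # -> ['3.2.2.11.2 [E]']
```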

This may seem to be a formidable activity, but over a period of 7 years, developing a software programme of the size and complexity of Spacelab, such tools are necessary to produce reliable software. An important factor is that the Software QA staff in each company report to the QA head rather than to the Software Manager. With the backing of the customer at each stage, they are then more easily able to ensure adherence to the QA plan that has been approved.


Figure 2 shows a paragraph of a requirements document with additional identifiers [A] to [F], and the corresponding entries in the VCD. As an example, requirement 3.2.2.11.2 [A] was tested at subsystem level by contractor B, and was tested at system level in Test Case 2.2, approved on 17.9.80 at Test Review Board number 22.

Initial MMU Load

[A] L&V shall support the initial load of the SL MMU by using the MDM S/W mode for uplink and buffers of PCMMU for downlink. [B] The data transfer shall be handled in 8K words which are spread over consecutive MDM messages. [C] During uplink of data, L&V shall be able to receive the previous data block from PCMMU and perform a bit by bit comparison. [D] If an error is detected by this comparison, the total data block which contains this error shall be uplinked again. [E] A total MMU flight/ground L&V comprising 800K words shall be performed within 1.6 hours. [F] Spacelab monitoring (MSU data only) and keyboard control shall be maintained during initial MMU load.

Fig. 2. Requirement Document and corresponding Verification Control Document Entries
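Clauses [B] to [D] of this requirement amount to a block-by-block verify-and-retry transfer protocol. As an illustrative sketch only (the uplink and readback primitives below, and the retry limit, are hypothetical assumptions, not Spacelab interfaces):

```python
BLOCK_WORDS = 8 * 1024  # [B]: data transferred in consecutive 8K-word blocks

def split_into_blocks(words, block_words=BLOCK_WORDS):
    """Split the load into consecutive blocks of block_words words."""
    return [words[i:i + block_words] for i in range(0, len(words), block_words)]

def uplink_with_verify(blocks, uplink, readback, max_retries=3):
    """Uplink each block, read the previous block back via PCMMU and
    compare bit by bit; on any mismatch, uplink the whole block again
    ([C] and [D]). The retry limit is an assumption, not in the text."""
    for block in blocks:
        for _attempt in range(max_retries):
            uplink(block)            # send via the MDM S/W mode
            if readback() == block:  # [C]: bit-by-bit comparison
                break                # block verified, continue with the next
        else:
            raise RuntimeError("block still in error after retries")
```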

P.J. Robinson and R.K. Atkins

156

Requirement 3.2.2.11.2 [E] was verified by Analysis in document MIN-ER-70-4-81, which calculated the time for an 800K word transfer from a test on a 64K word transfer.
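The analysis document itself is not reproduced in the paper, but the extrapolation it performed is simple linear scaling. A sketch, assuming a hypothetical measured time for the 64K word test (the real figure is not given in the paper):

```python
# Sketch of the timing extrapolation used to verify requirement [E] by
# analysis: scale a measured 64K-word transfer up to the full 800K-word
# MMU load and compare against the 1.6 hour budget.
MEASURED_64K_SECONDS = 420.0  # hypothetical measured time, not from the paper
WORDS_TESTED = 64 * 1024
WORDS_TOTAL = 800 * 1024
BUDGET_SECONDS = 1.6 * 3600

# Linear scaling: 800K / 64K = 12.5 times the measured transfer time.
predicted = MEASURED_64K_SECONDS * (WORDS_TOTAL / WORDS_TESTED)
print(f"predicted full load: {predicted:.0f} s "
      f"({'within' if predicted <= BUDGET_SECONDS else 'exceeds'} "
      f"the {BUDGET_SECONDS:.0f} s budget)")
```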

DESIGN REVIEWS

The next stage in the Software Life Cycle is Design. Two design reviews were held for each of the first five software subsystems: a Preliminary Design Review, to look at the top level design and its relationship to the other subsystems, and a Critical Design Review, to look at the detailed internal module designs. Both refer to the approved Requirement Specifications as their baseline. Here the QA role was to ensure that all problems in the design were documented, and were subsequently reflected in changes to the design documentation with full traceability.

VERIFICATION AND VALIDATION

The main principle used in developing Spacelab software is that it is necessary to build software in the correct disciplined way with full traceability from requirements through design to verification. Thus each software subsystem had a requirements document, which together identify 2,000 separate requirements, and each design listed the relevant requirements implemented there. Subsequent changes to requirements were then reviewed to see if corresponding changes to the VCD were necessary. Test procedures were then produced to verify the requirements. Each test procedure was reviewed by Software QA, Software Engineering and the customer at a Test Review Board (TRB) to ensure completeness and accuracy before being used. The test procedure was then debugged for both software errors and test procedure errors, prior to a formal run by the Verification Engineer in the presence of the software QA of both contractor and customer.

Fig. 3. Software Problem Report Flow (legend: SPR - S/W Problem Report; SPA - S/W Problem Analysis; RFW - Request for Waiver)

The test results were added to the Test Development Folder for final approval by the TRB and closure of verification of the corresponding requirements. The TRB also approved verification by review or by analysis (desk check methods of verification). The set of test procedures was also used to measure performance of the software while the software was improved to increase throughput, and provided a set of regression tests for successive releases. Test versions of real time applications were written to provide simulated loads on the operating system software.

A major aspect of the final stages of verification was the use of a Real Time Simulation Facility (RTSF) comprising the ground computer, the on-board computer, their peripherals and a static simulation of the Spacelab hardware. Not only was this necessary and valuable in verifying the operation of software running in 2 different computers, but as it used ground versions of space hardware it was also used to find faults in the hardware, thus saving time on expensive flight hardware.
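The paper does not describe how the regression set was driven, but the idea maps onto a simple runner that re-executes every approved procedure against each new release and records timings, since the same runs doubled as performance measurements. A minimal sketch with hypothetical hooks (run_procedure is an assumed execution interface):

```python
import time

def run_regression(release, procedures, run_procedure):
    """Re-run every approved test procedure against a new release,
    recording pass/fail and wall-clock time, so the same run serves
    as both regression test and throughput measurement."""
    results = {}
    for name, proc in procedures.items():
        start = time.monotonic()
        passed = run_procedure(release, proc)  # hypothetical execution hook
        results[name] = (passed, time.monotonic() - start)
    return results
```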


SPR PROCESSING

An elementary, but nonetheless effective and necessary tool in the search for reliable software is the Software Problem Report (SPR) and its ancillaries. The Spacelab project used the following method, illustrated in Fig. 3:

a) Every problem that was detected in testing or using the software was recorded on an SPR. These were then copied and circulated to members of the SPR subcommittee of the Software Control Board, representing the various functions of the prime contractor, subcontractors and customer. Each SPR was then either accepted and allocated to an engineer for analysis or rejected as a duplicate.

b) A Software Problem Analysis (SPA) was then produced that analysed the problem in software terms and proposed a solution, or, in some cases, explained the user error, requirement change needed, hardware problem, etc. In all cases, relevant documentation changes were identified.

c) For each software change, a Software Maintenance Disposition (SMD) was produced which described the change, the tests done and the documentation involved.

SPR's that required changes to the requirements were referred to the main Software Control Board (SWCB) for disposition. SPR's that came from the hardware engineers during use of the software for hardware integration were linked by number to a Discrepancy Notice (DN) which stayed open until the SPR was closed. In addition, there was a representative of hardware engineering in the SPR subcommittee.

The process was subsequently completed by a Software Release Order (SRO) tying together a set of changes by reference to the corresponding SPR numbers, Document Change Notices (DCN) and magnetic tapes containing the software. This documentation, although extensive, can mostly be done by reference to or duplication of existing material. This method ensures consistency of documentation with software, it ensures a wide range of review of each change to help in identifying side-effects or inadequate testing, and it informs a wide range of users of the exact status of each problem and of each release.

This procedure was used for over 2,200 SPR's during the cycle from integration of preliminary releases up to final completion of verification. Figure 4 shows the monthly rate at which software problems were discovered over the 2½ years of software integration, verification and acceptance.
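Restated in modern terms, the SPR flow of Fig. 3 behaves like a small state machine; the sketch below paraphrases the states and transitions rather than reproducing the figure exactly:

```python
from enum import Enum, auto

class SprState(Enum):
    RAISED = auto()    # problem recorded on an SPR
    ANALYSED = auto()  # SPA produced (fix, user error, H/W problem, ...)
    DISPOSED = auto()  # SMD written, or referred to the SWCB / linked to a DN
    RELEASED = auto()  # change carried by a Software Release Order (SRO)
    CLOSED = auto()
    REJECTED = auto()  # duplicate of an existing SPR

# Legal transitions, paraphrased from the flow of Fig. 3.
TRANSITIONS = {
    SprState.RAISED:   {SprState.ANALYSED, SprState.REJECTED},
    SprState.ANALYSED: {SprState.DISPOSED, SprState.CLOSED},  # e.g. user error
    SprState.DISPOSED: {SprState.RELEASED},
    SprState.RELEASED: {SprState.CLOSED},
}

def advance(current, new):
    """Move an SPR to a new state, rejecting transitions the flow forbids."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal SPR transition {current.name} -> {new.name}")
    return new
```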

SOFTWARE QA TOOLS

No automated tools were planned at the start of this project, and by the end only an SPR tracking list and a VCD were maintained on the IBM 370 computer. One contractor used a Compare program on all new releases of software modules to report the changes from the previous version. Most contractors used some form of library structure for development of modules, with a separate library for completed modules, but the source was not always up to date with the executable code. A software standards document was applicable to the whole project, with local variations according to programming language and computer. The best idea at the module level was the Unit Development Folder (UDF), which contained a copy of all relevant documentation for the module, i.e. requirement, hardware interface, design, code, test procedure, test data, test results, QA report and SPR's. There was extensive manual tracking via SPR's, VCD, DCN and SRO to ensure that each delivery was fully documented and all open problems were identified. This was particularly necessary for the intermediate releases and acceptance from subcontractor to prime contractor, and later for hardware integration and then delivery to ESA/NASA.
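The Compare program is not described beyond its purpose; its effect corresponds to a release-to-release source diff, as in this sketch using Python's standard difflib (the module name in the example is invented):

```python
import difflib

def report_changes(old_source, new_source, module):
    """Report the source lines changed in a module between two releases,
    in the spirit of the release-to-release Compare program."""
    diff = difflib.unified_diff(
        old_source.splitlines(keepends=True),
        new_source.splitlines(keepends=True),
        fromfile=f"{module} (previous release)",
        tofile=f"{module} (new release)",
    )
    return "".join(diff)

# Hypothetical module source before and after a change.
print(report_changes("CALL INIT\n", "CALL INIT\nCALL SELFTEST\n", "SCOS"))
```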


Fig. 4. Software Problem Discovery Rate (monthly rate)

CONCLUSION

Reliable software can be built on an industrial scale by following the full Software Engineering approach - requirements, design, code, test and verify - with reviews at each stage and traceability from requirements through to the finished product. Particularly important is the full verification of the software under independent QA supervision to cover every detail of the software. Fig. 4 shows the resulting fall in software problems achieved by following this method.
