Conference Reports
the business aspect of this problem, namely, what are the cost benefits of such an integration, and how wide a market is there for such integrated systems? These are the questions that the panelists addressed in this panel, focusing on the issues that they thought were the most critical.
8. Microprocessor Applications
Microprocessors find applications everywhere, from cars to houses to factories. There were talks in this conference area on microprocessor design and technology, notably by G.E. Moore on the evolution of microprocessors, by C. Mead on ultra-parallel designs, by T. Matsumura on future trends, and by J. Nicoud on standardization issues. There were Panel Discussions held on aspects of Robotics and how it will change factories. Discussions were also held on the VLSI chip and how it should be designed into systems.

9. Social and Economic Implications
Are we better off with the computer and the information revolution, or not? D. Dennett discussed how computers should fit people's needs. N. Longworth discussed how to educate people for the use of computers. A. Westin presented the dangers that computers pose to our privacy. And, in a fourth invited lecture of this program area, G. Gupta talked about the impact of computers on developing countries. There were Panel Discussions on computer-induced unemployment and on computer education. There were also discussions on what intelligent machines may do to us and how vulnerable society is to dangers from the full-scale introduction of computers.

10. Computers in Everyday Life
Computers are slowly but visibly influencing many aspects of our everyday life. In their Invited Lecture, S. Hiltz and M. Turoff talked about this effect. A. Bisseret discussed how we can live with computers. Panel Discussions were held on personal computers and citizen participation in the informatics revolution. There was also a panel on computer literacy for the average person. Finally, after witnessing the art of programming, the audience was able to see computers generate art.
The proceedings of IFIP'83, the 9th World Computer Congress, have been edited by R.E.A. Mason and published by North-Holland under the title Information Processing 83. 1983, xvi + 964 pages. ISBN 0-444-86729-5. Price: US $98.00 (USA/Canada), Dfl. 260.00 (rest of the world).
Protocol Specification, Testing and Verification

The need for standards in computer communication systems and the growing complexity of the functions covered by these standards have created a burgeoning demand for carefully conceived techniques to specify, design, verify, implement and test computer communication protocols. The Workshop held at the IBM Zurich Research Laboratory in Rüschlikon (Switzerland) from May 31 to June 2, 1983 was
the third in a series of workshops organized under the auspices of IFIP Working Group 6.1 devoted to this theme. As the meeting was a workshop, the requirement for attendance was submission and presentation of a paper. The participants came from a wide variety of institutions, including universities, government laboratories and computer manufacturers in ten different countries. This workshop has
made an important contribution to the exchange of ideas between research workers and those involved with the practicalities of developing communication systems, according to the Workshop organizers Harry Rudin and Colin West. The organization of such a meeting consumes considerable resources; the IBM Zurich Research Laboratory provided the support which made this conference possible. Owing to the interest that this workshop has created, another workshop will be held in North America in the spring of 1984.
Session 1. Protocol Theory and Analyses

Interval Logic
R.L. Schwartz, P.M. Melliar-Smith and F.H. Vogt (SRI International, Menlo Park, Ca., USA) presented a new interval-based temporal logic. The logic, the lecturer said, stems from experience in using temporal logic for specifying protocol standards. The use of intervals to establish a context for temporal assertions provides a high-level structure for protocol specification. The lecturer gave an informal introduction to the logic and illustrated it with examples of asynchronous queues and the Alternating Bit protocol.

Selection/Calculus Model
S. Aggarwal, R.P. Kurshan and K. Sabnani (Bell Laboratories, N.J.) described the main features of the selection/resolution model. The classical Alternating Bit protocol, described by the lecturer in realistic detail, was then used to illustrate specification, analysis and "validation" techniques. The speaker included a description of the protocol: it was initially described in terms of a dozen simple, concurrent, interacting processes. The lecturer showed how coordinated behavior can be computed, and methods to analyze the protocol were discussed. The speaker reported that, by way of comparison, this specification and analysis of the Alternating Bit protocol can be related to others which have appeared in the literature.

A Specification and Analysis Language
S. Aggarwal, R.P. Kurshan and D. Sharma (Bell Laboratories, N.J.) told the conference that the
selection/resolution model for concurrent processes can be used in theory for the design, specification, analysis and implementation of complex concurrent systems, assuming the availability of supporting software. Such software must faithfully represent the model and yet have sufficiently efficient storage structures. To be of practical use, the lecturer said, it must also have a user interface with mechanisms for error control, naming, system development of hierarchical structures and the ability to compare different versions of components. The speaker described the ongoing effort to design such a software system. It will be a general coordination analyzer and specifier (thus the name COSPAN), not limited to protocols.
Modelling Elapsed Time
S. Aggarwal and R.P. Kurshan (Bell Laboratories, N.J.) told the conference that in the analysis of communication protocols it is often useful to incorporate timing information that specifies the elapsed time associated with sequences of operations. The lecturer gave the example that, in order to determine the proper setting of a timer, one needs information on the expected elapsed time between message transmission and acknowledgement. The speaker went on to describe how timing information may be modelled using the formal selection/resolution model for concurrent processes. The classical Alternating Bit protocol was used by the speaker to illustrate the concepts.

Communicating Machines by Stepwise Refinement
M.G. Gouda (University of Texas, Austin) considered the problem of constructing two finite-state machines that communicate by exchanging messages via two one-directional, unbounded FIFO channels. The two machines, the lecturer said, should be constructed such that their communication is guaranteed to progress indefinitely. A methodology to solve this by a succession of refinement steps was discussed: at each step more nodes and edges are added to the two machines constructed so far, and this continues until the required two machines are realized. The speaker illustrated the usefulness of this methodology by using it to construct two communicating machines which model the call establishment/clear protocol in X.25.
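Several talks in this session use the Alternating Bit protocol and machines joined by one-directional FIFO channels as their running example. As a purely editorial illustration (not any speaker's formalism), the following minimal Python sketch runs the Alternating Bit protocol as two state machines exchanging frames over FIFO queues, with a lossy forward channel and retransmission; the channel parameters are illustrative assumptions.

```python
from collections import deque
import random

def alternating_bit(data, loss_rate=0.3, seed=1):
    """Deliver `data` over a lossy forward channel (illustrative sketch)."""
    rng = random.Random(seed)
    fwd, rev = deque(), deque()        # one-directional FIFO channels
    delivered = []
    send_bit, i = 0, 0                 # sender state: current bit and item index
    expect_bit = 0                     # receiver state: next expected bit
    while i < len(data):
        # Sender: transmit the current item; a lost frame is simply not queued,
        # and the next pass of the loop acts as retransmission on timeout.
        if rng.random() > loss_rate:
            fwd.append((send_bit, data[i]))
        # Receiver: deliver a frame carrying the expected bit, acknowledge its bit.
        while fwd:
            bit, item = fwd.popleft()
            if bit == expect_bit:
                delivered.append(item)
                expect_bit ^= 1
            rev.append(bit)            # acknowledgement carries the received bit
        # Sender: an acknowledgement of the current bit lets it advance.
        while rev:
            if rev.popleft() == send_bit:
                send_bit ^= 1
                i += 1
    return delivered

print(alternating_bit(list("abcde")))  # ['a', 'b', 'c', 'd', 'e']
```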
Session 2. Specification and Formal Models
The Power of Formal Models
R. Gustavsson and B. Pehrson (Uppsala Institute of Technology, Sweden) compared two formal techniques for the modelling of concurrent systems: Communicating State Machines and the Calculus of Communicating Systems (CCS). A variant of the Alternating Bit protocol was used as an illustrating example. Implementation specifications are designed from an informal protocol specification. The behavior of the composed entities is given in each formalism and transformed within each theory. The lecturer went on to say that, apart from liveness, the implementation specifications are observationally equivalent to the service specification. The speaker showed, in the CCS-based example, how interval temporal logic can be used to achieve proofs of total correctness. Both techniques support incremental design, which is desirable in an interactive design system, according to the speaker.

Abstraction by Structural Reduction
B. Pehrson (Uppsala Institute of Technology, Sweden) presented a technique to reduce the functional descriptions of a set of connected components into a less complex functional description of the composed system. The speaker demonstrated this technique by verifying the data link service provided by the Alternating Bit protocol: the protocol specification is reduced into the specification of a queue. The basic idea is to abstract away all events which do not affect the behavior of the composed system according to an equivalence criterion. This technique provides a powerful tool for mechanizing formal synthesis and verification in a hierarchical manner, Pehrson said. It has so far been used together with abstract machine descriptions with a finite number of states. However, it is a general method, the speaker told the conference, which could be used with other specification methods. The method is implemented in the design system Caddie and has been used to verify some simple communication protocols.

Structured Finite State Automata
S. Budkowski and E. Najm (Agence de l'Informatique, Paris-la-Défense, France) presented and
formalized a new modelling technique (Structured Finite State Automata, SFSA) which permits finite state automata to be structured so that operations such as direct coupling and projections may be easily described and accomplished. The lecturer then briefly illustrated and commented on how the technique may be applied to describe and validate distributed communication systems. The speaker also gave a simple example of the technique applied to validate the cooperation of adjacent Session/Transport entities in a local system.

Constructive and Executable Specification
L. Logrippo (University of Ottawa, Canada) discussed some of the problems connected with the formal specification of protocol services and proposed some positive solutions. The lecturer introduced the concepts of "constructive" and "executable" specifications and presented a model for the constructive specification of protocol services that is based on the combined use of finite-state transducers and abstract data types. The speaker also presented a technique for executable service specifications that uses a combination of abstract data types and finite-state-automata concepts. This technique, the speaker went on to say, enables the definition of the transport service in a manner that is precise, terse and abstract. The concept seems to hold great promise for the definition of higher level protocols and services that may involve complex data manipulation functions. Similar techniques, the speaker said in conclusion, have been shown elsewhere to be eminently suitable for the purpose of formal verification.

A Behavioral Description Language
G. Karjoth (University of Stuttgart, Fed. Rep. of Germany) told the conference that the behavioral description language applies a process algebra to the specification of protocols in distributed systems. Individual system components are described by their interactions, which are observable in the outside world and represent multi-way synchronized communication over explicit interaction points. The semantics of the language are defined by temporal logic axioms, using Wolper's relativization procedure. The speaker said that they provide a mathematical framework for the analysis of protocols and for developing logical systems for proving their properties.
In conclusion, the speaker said that it is hoped that the first link has been made between abstract requirement specifications given in pure temporal logic and more readable "normal form" specifications given in algebraic expressions.
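The claim that an implementation specification is observationally equivalent to its service specification can be made concrete with a small, editorial sketch. The following Python fragment is not the speakers' CCS formalism; it only compares the visible traces of two labelled transition systems up to a bounded length, hiding internal "tau" steps (a trace-based approximation of observational equivalence, and it assumes there are no cycles of hidden actions).

```python
# Illustration only: compare observable behaviour of two labelled transition
# systems by collecting visible traces up to a bounded length, hiding 'tau'.

def traces(lts, state, depth, hidden=frozenset({"tau"})):
    """Set of visible action sequences of length <= depth from `state`."""
    result = {()}
    if depth == 0:
        return result
    for action, nxt in lts.get(state, []):
        if action in hidden:
            # A hidden step does not consume visible depth (assumes no tau cycles).
            result |= traces(lts, nxt, depth, hidden)
        else:
            for t in traces(lts, nxt, depth - 1, hidden):
                result.add((action,) + t)
    return result

# "Service": a one-place buffer observed through put/get.
service = {"empty": [("put", "full")], "full": [("get", "empty")]}

# "Implementation": the same interface with an internal housekeeping step.
impl = {"e": [("put", "m")], "m": [("tau", "f")], "f": [("get", "e")]}

print(traces(service, "empty", 3) == traces(impl, "e", 3))  # True
```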
Session 3. Theory and Applications of Petri Nets

Tools and Studies for Formal Techniques
M. Anttila, H. Eriksson, J. Ikonen, R. Kujansuu, L. Ojala and H. Tuominen (Helsinki University of Technology, Finland) discussed some experiences of using the place/transition-net analyzer. The lecturer described to the conference the work of developing tools for a Petri Net laboratory. In the temporal logic domain, an approach of using temporal logic to describe Petri Nets was shown. In the examples that the speaker showed, the formulas describing CE-systems are quite long and too complicated to handle manually. One solution, in the speaker's opinion, would be to mechanize some decision procedures. There are many open questions in this area: the speaker said it might be more beneficial to use branching time structures to model nondeterminism in CE-systems. Another interesting topic, the lecturer said in conclusion, would be to describe high-level nets using quantified temporal logic.

Timed Petri Nets
Timed Petri Nets are ordinary Petri Nets with additional elements for modelling time. B. Walter (University of Stuttgart, Fed. Rep. of Germany) introduced several types of timed Petri Nets for modelling network protocols that make extensive use of timers, as well as the time behavior of the physical system. Three types of net were considered in the presentation: Condition/Event Nets, Place/Transition Nets and Predicate/Transition Nets. It was shown how to analyze timed Petri Nets and how to check the validity of the modelled timers. In particular, the speaker showed how to model message delayers and timers.

Time Petri Nets
M. Menasche and B. Berthomieu (National Center for Scientific Research, France) spoke about
modelling and proving correct concurrent systems in which time appears as a parameter (such as communication protocols). Merlin's Time Petri Nets were used for modelling these systems, and a recently developed enumerative method was employed for analyzing their behavior. This method was applied to the specification and verification of a data transfer protocol and a bus allocation protocol. Alternative and complementary methods for analyzing Time Petri Nets are being investigated, the speaker told the conference. These include reduction rules preserving some properties, and structural methods making use of the structure of the net for deducing properties of its behavior. Also being investigated is extending the field of application of the method towards performance analysis, the lecturer concluded.

The ISO Transport Service
J. Billington (Telecom Australia Research Laboratories, Australia) presented a formal specification of the ISO Transport Service Definition. This specification applies to a single instance of a connection. Six phases of the connection are specified by simple, separate Numerical Petri Nets (NPNs) which may be easily combined to obtain the total specification. The invocation of a service primitive, the speaker went on to say, has been associated with the firing of a transition using a label. The execution of the NPN then describes the allowable sequences of Transport Service primitives and the relationship between these at both ends of the connection. The speaker concluded with the claim that NPNs are a powerful graphic technique for the specification of ISO services. The merits of NPNs as a formal description technique are currently being debated within CCITT and ISO, reported Billington.
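For readers less familiar with the Petri Net machinery these talks build on, the following is a minimal editorial sketch of an untimed place/transition net: a marking counts tokens on places, a transition fires when all its input places are marked, and reachable markings can be enumerated for analysis. The net itself (a two-step send/receive) is an invented toy, not any speaker's model, and the time extensions discussed in the session are omitted.

```python
# Generic place/transition net sketch: transition -> (input places, output places).
NET = {
    "send":    ({"ready"},      {"in_transit"}),
    "receive": ({"in_transit"}, {"delivered"}),
}

def enabled(marking, t):
    ins, _ = NET[t]
    return all(marking.get(p, 0) > 0 for p in ins)

def fire(marking, t):
    ins, outs = NET[t]
    m = dict(marking)
    for p in ins:
        m[p] -= 1                      # consume one token per input place
    for p in outs:
        m[p] = m.get(p, 0) + 1         # produce one token per output place
    return m

def reachable(m0, limit=100):
    """Breadth-first enumeration of reachable markings (bounded by `limit`)."""
    seen, frontier = [], [m0]
    while frontier and len(seen) < limit:
        m = frontier.pop(0)
        if m in seen:
            continue
        seen.append(m)
        frontier += [fire(m, t) for t in NET if enabled(m, t)]
    return seen

print(reachable({"ready": 1}))   # three markings: ready, in transit, delivered
```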
Session 4. Validation and Verification

VADILOC
O. Rafiq and J.P. Ansart (Agence de l'Informatique, France) presented a methodology for the description and implementation of OSI-oriented communication protocols. One important step, the
speaker said, is the translation of the informal description into a description using an extended finite state automaton with predicates. This automaton, describing the behavior of an entity for one connection, is first checked for correctness (consistency, state reachability, etc.) before it is used for a description based on a programming language and for protocol validation. After some experience with VADILOC/BS on protocols using simple messages, an extension has been made, introducing new functions. This extension, called VADILOC/ES (extended system), is more suitable for high level protocols manipulating variables. Both systems, the speaker concluded, are written in PASCAL and run on the CII-HB Multics system at INRIA (Rocquencourt, France).

Link Initialization Procedure
A.E. Baratz and A. Segall (IBM Thomas J. Watson Research Center, New York) told the conference that it is known that HDLC and other bit-oriented DLC (Data Link Control) procedures ensure data transmission reliability on noisy links provided that all transmission errors are detected and the link processes are synchronized at initialization. The most commonly used DLC procedures are the bit-oriented ones such as HDLC, SDLC, ADCCP or the Alternating Bit protocol. In this presentation, the speaker showed that the HDLC initialization procedure does not ensure synchronization and thus allows inadvertent loss of data. The speaker proposed a new link initialization procedure and proved that it does ensure synchronization.

Automated Protocol Verification
H. Eckert and R. Prinoth (Gesellschaft für Mathematik und Datenverarbeitung, Fed. Rep. of Germany) first presented a short introduction to a specification tool for communication protocols, in particular for those protocols having a potentially unbounded set of reachable states. The mathematical foundation of the specification method is such that it is possible to compare different specifications of the same protocol by means of homomorphisms. The speaker also presented a verification method. This combines the developed specification
tool and the structuring principles of the ISO reference model. The main feature of the method is that it makes it possible to prove that a protocol provides a service and uses an underlying service correctly, according to the lecturer. A complete system for the automated verification of protocols has been implemented. The lecturer provided the conference with examples that illustrate both the specification and the verification method.

Experience with Automated Verification Techniques
C.A. Sunshine (University of Southern California, Marina del Rey) reported to the conference that at his institute four automated verification systems were applied to a common set of communication protocols to assess their capabilities. The systems, and their key features, were Affirm, Gypsy, Concurrent State Delta and the Formal Development Methodology. Each system, the lecturer told the conference, showed different strengths in specifying protocols and verifying their correct behavior. The presenter's experience showed that important features of real protocols can be handled by current automated systems, but a great deal of effort and ingenuity is required, and further development efforts are needed before real protocols can be fully and routinely verified.

Verification via Executable Logic Specifications
D.P. Sidhu (SDC, Pennsylvania, USA) discussed the use of logic programming techniques in the specification and verification of communication protocols. The protocol specifications discussed are formal and directly executable. The advantages of executable specifications, Sidhu asserted, are that (1) the specification is itself a prototype of the specified system, (2) incremental development of specifications is possible, and (3) the behavior exhibited by the specification when executed can be used to check conformity of the specification with the requirements. The speaker discussed Horn clause logic, which has a procedural interpretation, and the predicate logic programming language PROLOG, used to specify and verify the functional correctness of protocols. PROLOG possesses a powerful pattern-matching feature which is based on unification, Sidhu concluded.
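The idea that an executable specification doubles as a prototype can be illustrated with a small, hedged stand-in (Python here, rather than the PROLOG the speaker used, and with an invented toy connection protocol): the specification is a transition relation that can simply be run against an event scenario and checked against the expected outcome.

```python
# Toy executable specification: (state, event) -> next state.
SPEC = {
    ("closed",   "conn_req"): "wait_ack",
    ("wait_ack", "conn_ack"): "open",
    ("open",     "data"):     "open",
    ("open",     "disc_req"): "closed",
}

def run(spec, events, state="closed"):
    """Execute the specification; an unspecified event is a violation."""
    for e in events:
        if (state, e) not in spec:
            raise ValueError(f"no rule for event {e!r} in state {state!r}")
        state = spec[(state, e)]
    return state

# Running the specification checks a scenario against the requirements.
print(run(SPEC, ["conn_req", "conn_ack", "data", "disc_req"]))  # closed
```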
Session 5. Protocol Performance

Industrial Ethernet Local Networks
G. Florin (CERCI, France), S. Natkin, A. Woog and J. Attal (CNAM, France) told the conference that local networks have to be built using low cost interfaces. Integrated circuits for the CSMA-CD protocol are not available but would be very useful for such applications. The major problem which arises with CSMA-CD is that there is no deterministic bound on the response time. Whether CSMA-CD techniques are adequate for control process applications can be validated only by probabilistic techniques. The speaker presented general methods to validate the response time characteristics of CSMA-CD industrial networks. These methods were applied to a highly constrained application (the control of a power plant). The main topics in this lecture were: (a) characteristics of control process applications, (b) the probabilistic assumptions to be validated and the statistical tests to check such assumptions, and (c) simulation of the transient behavior of Ethernet. Lastly, the lecturer presented the main numerical results; they allow the definition of conditions to be fulfilled for accepting CSMA-CD control process networks.

Automated Performance Prediction
H. Rudin (IBM Zurich Research Laboratory, Switzerland) described some first steps in using a formal definition as the basis for the automated prediction of protocol performance. Rudin considered a simple example and presented a technique for predicting protocol performance directly and automatically from the kind of formal, machine-readable definition now often being used for concise protocol specification. As time goes on, techniques for predicting protocol performance directly from a protocol's formal specification will be increasingly in demand, according to Rudin.
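The probabilistic flavour of the CSMA-CD argument can be illustrated with a very simplified Monte Carlo sketch. This is an editorial toy, not the speakers' model: it assumes a fixed per-attempt collision probability and binary exponential backoff measured in slot times, and simply estimates how often a station's access delay exceeds a deadline; all numbers are invented.

```python
import random

def access_delay_slots(collision_prob, rng, max_attempts=16):
    """Slots spent before one frame is sent, under exponential backoff."""
    delay, attempt = 0, 0
    while attempt < max_attempts:
        if rng.random() >= collision_prob:
            return delay + 1                       # successful transmission
        attempt += 1                               # collision: back off and retry
        delay += 1 + rng.randint(0, 2 ** min(attempt, 10) - 1)
    return float("inf")                            # gave up: excessive delay

def p_deadline_missed(deadline_slots, collision_prob=0.3, trials=100_000):
    rng = random.Random(0)
    misses = sum(access_delay_slots(collision_prob, rng) > deadline_slots
                 for _ in range(trials))
    return misses / trials

print(p_deadline_missed(deadline_slots=50))
```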
Session 6. Protocol Design and Implementation

Design of a Couple Service-Protocol
G. Juanole and B. Algayres (Laboratoire d'Automatique, France) elaborated on the design of
Transport Service-Transport Protocol couples. The combination of service and protocol is emphasized because the design of a protocol is closely related to the service it must provide. A three-level model which provides insight into the design specification, and a method to specify well-designed couples, were presented. The speaker told the conference that an important result of this is the specification of the conditions under which protocols with either a two-way or a three-way handshake scheme have to be used. The lecturer also presented a Petri Net model of a couple which uses a three-way handshake scheme. The speaker went on to say that this model enables one to view the relations between the service and the protocol and to verify the logic of their interactions.
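For readers who want the handshake schemes mentioned above made concrete, here is a generic editorial sketch (not the speakers' Petri Net model): a three-way connection establishment run as two table-driven entities exchanging messages over FIFO queues, where the third message is what lets the responder also reach the open state. State and message names are invented.

```python
from collections import deque

# Each table maps (state, received message) to (new state, reply or None).
INITIATOR = {("closed",  "start"): ("cr_sent", "CR"),    # connection request
             ("cr_sent", "CC"):    ("open",    "ACK")}   # ack the confirm
RESPONDER = {("listen",  "CR"):    ("cc_sent", "CC"),    # connection confirm
             ("cc_sent", "ACK"):   ("open",    None)}

def run():
    i_state, r_state = "closed", "listen"
    to_resp, to_init = deque(), deque(["start"])   # "start" = local open request
    while to_init or to_resp:
        if to_init:
            i_state, out = INITIATOR[(i_state, to_init.popleft())]
            if out:
                to_resp.append(out)
        if to_resp:
            r_state, out = RESPONDER[(r_state, to_resp.popleft())]
            if out:
                to_init.append(out)
    return i_state, r_state

print(run())   # ('open', 'open')
```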
Transport Layer Protocol for a Special Purpose LAN
C. Boccalini (I&O, Genova, Italy), W. Ansaldi, M. Olobardi and A.M. Traverso (Ansaldo S.p.A., Genova, Italy) spoke about the system named MODIAC, a local area network whose stations can be configured as mono- or multi-processor nodes based on the Z-8000 microprocessor. The environment in which the automation system works, the lecturer said, is usually characterized by strong electromagnetic noise affecting message correctness. The speaker went on to say that the message transmission delay is often critical for the safety of the controlled plant. This led to the design of a communication subsystem where transmission interfaces are connected to a serial bus with a token passing policy. The four lower protocol layers reside on the interface boards and the higher ones on the CPU boards. The lecturer described the choices made for the transport layer design and the considerations which led to the decisions.

Implementation in Assembly Language
F.M. Restorick (Plessey Office Systems, Nottingham, UK) described a method used to implement a transport layer protocol in 8086 assembly language. The protocol implementation works under a multi-tasking executive. The method consists of a funnel stepper, a state table and a collection of action modules. This approach, as well as reducing the processing time necessary to interpret the protocol, allows easy implementation
of a trace facility to be included in the system at debug time, and allows coding and programming of the action modules to be pooled between many programmers at the design stage. Restorick described in detail the method used to realize the state tables and the funnel stepper function in 8086. He also explained to the conference the methods used to test the system.
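The report does not reproduce the 8086 code, but the table-driven structure described (a single dispatch loop, a state table, and small action modules) can be sketched in a few lines of Python. This is an editorial illustration of the structure, not the speaker's implementation; the event names and actions are invented.

```python
# Action modules: small routines invoked by the dispatcher.
def send_connect_confirm(ctx): ctx["log"].append("tx CC")
def deliver_data(ctx):         ctx["log"].append(f"deliver {ctx['event_data']}")
def release(ctx):              ctx["log"].append("released")

# State table: (state, event) -> (action module, next state).
STATE_TABLE = {
    ("idle", "CR"): (send_connect_confirm, "open"),
    ("open", "DT"): (deliver_data,         "open"),
    ("open", "DR"): (release,              "idle"),
}

def funnel_stepper(events):
    """Single entry point: look up each event, run its action, step the state."""
    ctx = {"state": "idle", "log": [], "event_data": None}
    for name, data in events:
        action, nxt = STATE_TABLE[(ctx["state"], name)]
        ctx["event_data"] = data
        action(ctx)                    # run the action module
        ctx["state"] = nxt             # step to the next state
    return ctx["log"]

print(funnel_stepper([("CR", None), ("DT", "hello"), ("DR", None)]))
```

A trace facility of the kind mentioned above would hook naturally into the single dispatch point, which is one reason this structure is attractive.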
Session 7. Integrated Systems

Development of Services with CIL
H. Krumm and O. Drobnik (Institut für Informatik III, Karlsruhe, Fed. Rep. of Germany) presented an introduction to the language CIL (Communication Service Implementation Language), together with its underlying theory and the theory's application to specification and verification. The CIL approach to the development of communication services is based on the CIL language and a CIL-compatible theory of program execution. The programming language provides structuring concepts to support the design and implementation of services. The theory contains a logical language to express specifications and axioms of program semantics, an event-oriented model of program execution, and a first-order predicate calculus to perform verification by means of deduction in the calculus.

Secure Communications Systems
R.E. Strom and S. Yemini (IBM T.J. Watson Research Center, New York) discussed those features of the NIL language which make it valuable during the design, implementation, validation and testing phases of communication systems. These features were: (a) a process model in which shared and global data do not exist, thereby supporting concurrency and modularity in a single construct; (b) queued communication, which results in a high degree of uncoupling between modules and permits truly modular verification of NIL systems; (c) run-time operations for loading processes and binding communication channels; (d) full specification of inter-module interfaces and complete compile-time checking of the consistency between code and interfaces; and (e) typestate checking, a subset of program verification performed automatically
by an NIL compiler, which limits the extent to which unvalidated programs can corrupt validated ones through dangerous side effects. The lecturer also discussed experience in using NIL as both a design and an implementation language for SNA.

The LC/1 Language
J.M. Ayache and J.P. Courtiat (Laboratoire d'Automatique, France) described the basic features of the communication language LC/1. It supports a global approach encompassing protocol specification, validation and implementation, and is essentially based on the use of the ISO reference model and Petri Nets. The lecturer told the conference that the overriding concern during its design was to validate the specifications in the most complete way by means of existing tools, such as OGIVE/OVIDE, and to minimize the designer's intervention between the specification and implementation stages. The language compiler is being developed, while the simulator kernel has already been tested, and (the speaker concluded) an implementation of the RHIN transport protocol is also being developed along the same lines.

CUPID
Y. Yemini and N. Nounou (Columbia University, New York) described research conducted towards Columbia's United Protocol Information and Design (CUPID) environment. CUPID research aims at the integration and automation of protocol design and implementation tools. CUPID uses an algebraic representation of protocols based, in part, upon a variant of Milner's Calculus of Communicating Systems (CCS). Communication behaviors are represented as expressions of a universal algebra. A key notion for the automation of protocol development functions is that of a valuation over the algebra of communication behaviors: a valuation maps communication behaviors to expressions in other algebras (e.g., an algebra of delay distributions used for performance analysis). This allows one, explained the lecturer, to compute attributes of communication behaviors over the respective algebras using a formal valuation process. A brief introduction to CCS in the context of modelling protocol behaviors was provided by the
speaker. This was followed by a summary of how the algebraic valuation mechanism may be used to support the different functions of the protocol design environment.

PANDORA
G.J. Holzmann and R.A. Beukers (Delft University of Technology, The Netherlands) told the conference that the protocol design and analysis system named PANDORA, a joint development project of the Netherlands PTT and Delft University of Technology, provides its users with a controlled environment for protocol synthesis and formal analysis. PANDORA also offers both software and hardware tools for protocol assessment. PANDORA can assist the user in the documentation of protocol designs by autonomously extracting SDL diagrams, and has a set of tools for the generation of executable protocol implementations from abstract specifications.

Automated Protocol Development System
P.T. Blumer and D.P. Sidhu (SDC, Pennsylvania, USA) gave the conference an overview of a formal specification technique and implementation method for computer communication protocols. The technique that the lecturer described was developed at Bolt Beranek and Newman. A collection of useful software tools was also discussed. The speaker focused on a tool called the finite state machine (FSM) analyzer, which can be used with this technique to verify certain protocol properties. The speaker described the application of the analyzer to an authentication protocol and gave some interesting results.

From Formal Description to Automated Implementation
J.P. Ansart, V. Chari and D. Simon (Agence de l'Informatique, France) gave a brief overview of the basic concepts of the PDIL language (Protocol Description and Implementation Language) through an example of a description. The basic ideas underlying the PDIL translator were outlined by the speaker. This translator is now available, the conference was told, on a Multics system. The lecturer also explained how to pass from a PDIL formal description to an implementation by dealing with all the choices involved.
The lecturer went on to explain that a formal description in PDIL should (a) be able to describe the protocol clearly and completely without enforcing over-specification, (b) serve for verifying the correctness of the protocol, and (c) make it possible to derive an implementation in as automated a way as possible.
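The report does not say which properties the FSM analyzer mentioned above checks; as a simplified editorial sketch of the kind of analysis such tools perform (not the BBN tool itself), the following Python fragment explores the joint state space of two machines communicating over bounded FIFO channels and reports deadlocked global states. The machines and channel bound are invented for the example.

```python
from collections import deque

# Each machine: state -> list of (("send" | "recv", message), next state).
M1 = {"s0": [(("send", "req"), "s1")], "s1": [(("recv", "ack"), "s0")]}
M2 = {"t0": [(("recv", "req"), "t1")], "t1": [(("send", "ack"), "t0")]}

def deadlocks(m1, m2, bound=2):
    start = ("s0", "t0", (), ())      # (state1, state2, channel 1->2, channel 2->1)
    seen, queue, stuck = {start}, deque([start]), []
    while queue:
        s1, s2, c12, c21 = queue.popleft()
        succs = []
        for (kind, msg), nxt in m1.get(s1, []):
            if kind == "send" and len(c12) < bound:
                succs.append((nxt, s2, c12 + (msg,), c21))
            if kind == "recv" and c21 and c21[0] == msg:
                succs.append((nxt, s2, c12, c21[1:]))
        for (kind, msg), nxt in m2.get(s2, []):
            if kind == "send" and len(c21) < bound:
                succs.append((s1, nxt, c12, c21 + (msg,)))
            if kind == "recv" and c12 and c12[0] == msg:
                succs.append((s1, nxt, c12[1:], c21))
        if not succs:
            stuck.append((s1, s2, c12, c21))   # no machine can move: deadlock
        for g in succs:
            if g not in seen:
                seen.add(g)
                queue.append(g)
    return stuck

print(deadlocks(M1, M2))   # [] : this request/ack exchange never deadlocks
```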
Session 8. Protocol Testing

Layer-Independent Architecture
S. Palazzo, P. Fogliata and G. LeMoli (CREI, Milano, Italy) introduced to the conference an architecture for a system performing the testing of a generic OSI layer. It was shown that the proposed system can be used to test protocol implementations in terms of both protocol testing and service testing, either in the debugging or in the certification phase. The structure of the system, the speaker told the conference, is designed in such a way as to separate out what is independent of the layer in which the protocol being tested lies. Finally, the functional specification of the modules composing the system was described.

Testing and Diagnosis Aids
A. Giebler (Institut für Datenfernverarbeitung, Fed. Rep. of Germany) gave an overview of a special protocol tester which has been developed by the GMD (Gesellschaft für Mathematik und Datenverarbeitung) within the TESDI project (TESting and DIagnosis aid for higher protocols). The lecturer discussed the following subjects: (a) the concept of the protocol tester; (b) the testing method applied; (c) the different testing functions; (d) the implementation concepts used; and (e) an example of a telex (transport layer) test.

User Guided Test Sequence Generation
H. Ural and R.L. Probert (University of Ottawa, Canada) presented a computer-assisted approach for generating test sequences from specifications of communication protocols and services. The approach is based on the use of attributed context-free grammars and is directly applicable in a logic programming environment. The speaker said that the approach involves constructing test sequence
specifications as attributed context-free grammars, implementing these specifications in logic programming as generators, and executing the generators in a controlled fashion to generate test sequences. The lecturer illustrated the approach on transport service and protocol specifications. Benefits include improvements in test design, specification, documentation and management.
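The grammar-based idea can be made concrete with a small editorial sketch (attributes omitted, and Python rather than the speakers' logic-programming generators): a context-free grammar describes admissible test scenarios, and bounded expansion of the grammar enumerates concrete test sequences. The grammar itself is an invented toy.

```python
from itertools import product

GRAMMAR = {
    "session":  [["connect", "transfer", "disconnect"]],
    "transfer": [["data"], ["data", "transfer"]],   # one or more data units
}

def sequences(symbol, depth=4):
    """All test sequences derivable from `symbol` within a bounded depth."""
    if symbol not in GRAMMAR:                        # terminal: a test event
        return [[symbol]]
    if depth == 0:
        return []
    out = []
    for rule in GRAMMAR[symbol]:
        parts = [sequences(s, depth - 1) for s in rule]
        if all(parts):
            for combo in product(*parts):
                out.append([e for part in combo for e in part])
    return out

for seq in sequences("session"):
    print(" ".join(seq))
# connect data disconnect
# connect data data disconnect
# connect data data data disconnect
```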
Requirements for a Test Specification Language
R.L. Probert and H. Ural (University of Ottawa, Canada) examined the application of the notion of a test specification language to various issues in the testing of protocol implementations. Sources of language design constraints, such as limitations imposed by the test session architecture, were discussed. The speaker also discussed the effects of relationships among language features, the degree of distribution of test control, the design properties of test support tools, and test initialization and reporting requirements. The speaker concluded with a progress report on a prototype test specification language for specification-based testing of protocol implementations.

Qualitative Protocols Validation
J.F. Billiard (CAP Sogeti Logiciel, France) presented four basic rules which he has found useful in obtaining significant validation results. Owing to the vast number of tests that can be performed, the speaker proposed a tentative classification according to function. Tests can be divided into two classes: qualitative tests and load acceptance tests. Qualitative tests are designed to check that the "communication machine" - node, network, host or gateway - observes its protocol and that the facilities it supplies, such as routing, billing, statistics, etc., are correct. The speaker told the conference that load acceptance tests are designed to determine the capacity of the machine in terms of data packets per second or the maximum number of simultaneous communications.

Protocol Product Testing
G.W. Cowin, R.W.S. Hale and D. Rayner (National Physical Laboratory, UK) introduced the concept of an Assessment Center for testing Open Systems Interconnection (OSI) protocol products. Physical architectures for assessment were compared by the lecturer, and the general logical architecture was discussed. The speaker also compared different approaches to the design of 'Test Responder' and 'Encoder/Decoder' modules, drawing on practical experience. A comparison was given of the two test definition methods in use at NPL. The speaker concluded that some useful lessons have been learned from the earlier experience of using this philosophy and architecture. The lecturer felt that much more experience can be gained from the NCC using the testing tools in the pilot UK Assessment Center.

The Routing Certification System
G.A. Harvey (Digital Equipment Corporation, Massachusetts, USA) described the design and construction of the Routing Certification System (RCS) for testing the conformance of a node to selected aspects of the Routing Layer protocol, as specified by the Digital Network Architecture (DNA) of Digital Equipment Corporation.

Testing of Protocols in SNA Products
R.M.S. Cork (IBM, England) focused his talk on the evolving specification of IBM's Systems Network Architecture (SNA), on some of the tools which have been developed to exploit the advances in specification, and on the impact these tools have had on the testing and implementation of SNA products. At the present time, the speaker told the conference, a Format And Protocol Language (FAPL) is used for SNA specification. This language is used not only in IBM's external publications which describe the architecture, but also in the production of a machine-readable, executable description of SNA. After consideration of a theoretical approach to product protocol testing involving this executable definition, the speaker went on to describe some of the techniques which have been applied in the real world of IBM products. The lecturer concluded with a look to the future, both within IBM and in non-SNA-related projects.

Objective Understanding of Conformance
D. Rayner (National Physical Laboratory, UK) told the conference that currently all conformance testing of protocol implementations is subjective. Each organization involved, the speaker went on to say, is likely to have its own interpretation of what constitutes conformance to a particular standard. The problem arises from poorly defined standards: the definition of the protocol itself is often confused with additional procurement requirements for an implementation of the protocol. The elimination of this and other sources of ambiguity was discussed by the speaker. A checklist was provided which, the speaker believed, could assist progress towards an objective understanding of conformance and therefore towards the definition of objective conformance tests.
Testing Implementations of OSI Protocols
R.J. Linn and W.H. McCoy (Institute for Computer Sciences and Technology, Washington, D.C.) explored problems associated with protocol test design, semantics and completeness. A linguistic approach, utilizing a generative grammar augmented with probability distributions associated with the production rules and random selection, was used to produce test sequences for the NBS/ICST implementation of the ISO Class 4 Transport protocol. The lecturer also presented the advantages and limitations of the methodology.
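This talk varies the grammar-based generation sketched earlier by attaching probabilities to the productions. As a brief editorial illustration (not the NBS/ICST tool, and reusing the invented toy grammar), a test sequence is derived by selecting productions at random according to their weights.

```python
import random

GRAMMAR = {
    "session":  [(1.0, ["connect", "transfer", "disconnect"])],
    "transfer": [(0.6, ["data"]), (0.4, ["data", "transfer"])],
}

def derive(symbol, rng):
    """Randomly derive one test sequence, weighting productions by probability."""
    if symbol not in GRAMMAR:
        return [symbol]                      # terminal: a test event
    weights, rules = zip(*GRAMMAR[symbol])
    rule = rng.choices(rules, weights=weights)[0]
    return [event for s in rule for event in derive(s, rng)]

rng = random.Random(7)
for _ in range(3):
    print(" ".join(derive("session", rng)))
```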
Testing Tools for OSI Protocol Implementation
R.J. Linn and J.S. Nightingale (Institute for Computer Sciences and Technology, Washington, D.C.) described specific tools within the test architecture which has been developed and refined using a prototype implementation of the ICST Class 4 Transport Protocol. The language used for executing the tests, the speaker said, is based on representations of the service primitives of the layer under test. All possible combinations of service primitives can potentially be specified using this language. Errors are introduced into the protocol under test in a controlled manner by means of an Exception Generator which resides between layers three and four at the Test Center. The language which drives this tool provides the mechanism to edit protocol data units, concluded the lecturer.

The proceedings of this conference have been edited by H. Rudin and C.H. West and published by North-Holland under the title Protocol Specification, Testing, and Verification, III. xiii + 531 pages. ISBN 0-444-86769-4. Price: US $65.00 (USA/Canada), Dfl. 170.00 (rest of the world).