Information and Software Technology 41 (1999) 695–696
Guest editorial
Communications software engineering (CSE)

K. Saleh a,*, R. Probert b

a Department of Electrical and Computer Engineering, Kuwait University, P.O. Box 5969, 13060 Safat, Kuwait
b University of Ottawa, Ottawa, Canada

* Corresponding author.
In the late 1960s, Randell and Buxton introduced the term “Software Engineering” at an international NATO conference, perhaps with the intention of drawing attention to the need for a professional, measurable process for designing, developing and maintaining software. In 1980, Thomas Piatkowski [1] introduced the term “Protocol Engineering” to describe the evolving technology and research domain that involves formal methods and sound engineering principles. Shortly thereafter, a number of researchers recognized the importance of software in communicating systems and defined “Communications Software Engineering” to emphasize a variety of key aspects of the software development component of the development of networks, communicating systems, communication services, web-based systems, and of course protocols [2–15]. Thus, we include Protocol Engineering in this special issue as an extremely important component of Communications Software Engineering.

From the beginning, the two attributes of the CSE process which have garnered the greatest attention are “quality” and “productivity”. Just as Kan and others introduced terms and processes for Software Quality Engineering (SQE) in the 1990s, Probert and Lew introduced Protocol Quality Engineering. In both fields the emphasis was on the quality (capability, usability, reliability, robustness, performance, etc.) of the designs and products, and on the productivity of the design, development, and quality engineering processes.

While the desire for quality led to the development of Formal Description Techniques (FDTs) and associated standards, the acceptance of FDTs into the workplace had been very slow, at best, until quite recently, when industrial-strength CASE (Computer-Aided Software Engineering) tool sets (also denoted environments) began to make inroads. The major internationally standardized FDTs include ESTELLE, LOTOS, and SDL (the Specification and Description Language). SDL is by far the best supported and (consequently) the most widely adopted of the major FDTs. The FDTs are intended to provide a useful syntax and
a precise semantic model for rigorously representing communicating systems. The other half of the design and development process is validation and verification, implemented most of the time by requirements reviews, design inspections, component testing, and system testing. A major productivity concern was the high cost of this quality assurance activity, to the degree that product quality and software engineering productivity seemed to be conflicting requirements. One particularly costly area was manual component and system testing, and especially regression testing: verifying that fixes and enhancements to a released product did not degrade the performance or reliability of the product, i.e. that the product had not regressed (gotten worse).

Fortunately, a group of testing experts started work in 1984 on a means of automating a key part of this testing, called conformance testing. The result was a very practical, system-independent test description language called Tree and Tabular Combined Notation (TTCN), which could be used in conjunction with Abstract Syntax Notation One (ASN.1) to specify test suites in a way that could be shared among all practitioners, both designers and quality engineers. TTCN (as SDL would be) was practical because it was well supported by commercial CASE tools such as ITEX for test design, development, automated execution and test management. TTCN became an international standard language for testing in 1994.

At the same time, the object-oriented paradigm was beginning to mature, and a small number of scalable, industrial-strength CASE tools began to win significant acceptance in industry. Among these are TAU from Telelogic (which includes the SDL Design Tool (SDT) and ITEX, the TTCN test engineering tool), ObjectGEODE from Verilog, UML-RT from ObjecTime, and Rational ROSE from Rational Software. MSCs (Message Sequence Charts) and UML (Unified Modelling Language) diagrams have become standard methods for representing requirements and are supported by these industrial-strength CASE tools. Thus, the discipline has matured to the point where both quality and productivity objectives can be met. The articles in this issue present both a broadening and a deepening of
the theory and industrial practice of Communications Software Engineering.

The first two articles deal with requirements specification and capture. The first, ‘A service creation environment based on scenarios’, by Dssouli et al., presents a service creation environment for the elicitation, integration, verification, and validation of scenarios during the requirements capture phase of the software development process. The second, ‘A model-based authorware for the construction of distributed multimedia systems’, by Cheung and Chanson, proposes a requirements capture approach for distributed multimedia systems based on the Java and CORBA technologies.

The next article deals with protocol specification and its relationship to testing. The article, ‘Communications software design for testability: specification transformations and testability measures’, by Dssouli et al., proposes the integration and application of transformations and testability measures, based on the finite state machine model, early in the communications software development process.

The next two articles deal with protocol design issues. The first, ‘A protocol synthesis method for fault-tolerant multipath routing’, by Ishida et al., proposes a method for the synthesis of fault-tolerant multipath routing protocols starting from service specifications, all described using the finite state machine model. The second, ‘A medium access control protocol for voice and data integration in receiver-oriented DS-CDMA PCNs’, by Yang et al., proposes a medium access control protocol for voice and data integration, along with its analysis and simulation results.

The next four papers deal with protocol testing. The first, ‘Controllability and observability in distributed testing’, by Cacciari and Rafiq, discusses ideas for the development of a framework for testing distributed systems based on controllability and observability features. The second, ‘A framework for the specification of test cases for real-time distributed systems’, by Walter and Grabowski, presents an extension of the conformance testing methodology and framework to specify test cases for testing real-time requirements of distributed systems. The third, ‘Efficient checking sequences for testing finite state machines’, by Inan and Ural, proposes a general model for constructing minimal-length checking sequences for finite state machine descriptions of protocols, and presents heuristic algorithms for the construction of such sequences. The fourth, ‘Passive testing and application to the GSM–MAP protocol’, by Tabourier and Cavalli, proposes algorithms for checking whether protocol traces satisfy protocol specifications described as finite state machines.

The next article deals with implementation issues and tool support. The article, ‘Extending EASE with new ASN.1 encoding rules’, by Lai and Cheong, extends EASE, a tool
for producing specifications based on Estelle and ASN.1, with two new encoding rules for the ASN.1 standard.

The last article deals with protocol design recovery and reverse engineering. The article, ‘Recovery of CFSM-based protocol and service design from protocol execution traces’, by Saleh et al., proposes reverse engineering algorithms that generate finite state machine descriptions of protocol and service designs from collected run-time execution traces.

Acknowledgements

We would like to thank the authors who contributed to this special issue, and the referees who reviewed the articles and whose comments enhanced the content and presentation of the invited articles. We would also like to thank the Journal Editor, Mr. Michael Dyer, for hosting this special issue and for his encouragement and support during the preparation of this issue.
References

[1] T.F. Piatkowski, An engineering discipline for distributed protocol systems, Proceedings of the NATO Advanced Study Institute: New Concepts in Multi-User Communications, Norwich, UK, August 1980.
[2] T.F. Piatkowski, The state of the art in protocol engineering, ACM SIGCOMM’86 Symposium, Stowe, VT, August 1986, pp. 13–18.
[3] H. Rudin, Protocol engineering: a critical assessment, Proceedings of the 8th IFIP Symposium on Protocol Specification, Testing and Verification, Atlantic City, NJ, June 1988, pp. 1–18.
[4] M.T. Liu, Protocol engineering, in: M.C. Yovits (Ed.), Advances in Computers, vol. 27, Academic Press, New York, 1989, pp. 79–195.
[5] Special Issue on Protocol Engineering, IEEE Transactions on Computers 40 (4) (1991).
[6] Special Issue on The Rise of Protocol Engineering, IEEE Software, January 1992.
[7] B. Sarikaya, Principles of Protocol Engineering and Conformance Testing, Ellis Horwood, NJ, 1993.
[8] G. Holzmann, Design and Validation of Computer Protocols, Prentice Hall, Englewood Cliffs, NJ, 1991.
[9] K. Tarnay, Protocol Specification and Testing, Plenum Press, New York, 1991.
[10] IFIP Symposium on Protocol Specification, Testing and Verification (PSTV) (yearly since 1980).
[11] IFIP Symposium on Formal Description Techniques for Distributed Systems and Protocols (FORTE) (yearly since 1989).
[12] IFIP International Workshop on Protocol Test Systems (IWPTS) (yearly since 1988).
[13] International Conference on Network Protocols (ICNP) (yearly since 1993).
[14] International Conference on Computer Communications and Networks (IC3N) (yearly since 1993).
[15] Special Issue on Protocol Engineering, Computer Communications 19 (14) (1996).