Book Reviews
Chapter 5 deals with the application of learning automata to multimodal function optimization when observations are corrupted by noise. The use of a single stochastic automaton and of a hierarchical structure of learning automata is considered. The accuracy can be increased by using a large number of control actions, or by using a hierarchical structure with more levels. In Chapter 6, several industrial applications of learning automata are presented. In this chapter, a learning automaton is considered as a stochastic automaton connected in a feedback loop with a random environment and a performance-evaluation unit. The following applications are discussed in some detail: multilevel learning control of a drying furnace, hierarchical learning control of an absorption column, learning control of an evaporator, adaptive choice of a cyclic code in communication systems, and application of a learning automaton to neural-network synthesis. The book may be of interest to researchers and specialists in adaptive and learning optimization systems. It could also be of value to computer scientists with an interest in artificial intelligence. The material is presented, in all the chapters, in a predominantly formal mathematical way, which may restrict its readership to specialists interested in the theory.
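As an illustration of the feedback configuration described above (an editorial sketch, not taken from the book), the following Python fragment implements a standard linear reward-inaction automaton interacting with a simulated random environment; the reward probabilities and learning rate are assumed purely for illustration.

```python
import random

# Minimal sketch of a linear reward-inaction (L_R-I) learning automaton in a
# feedback loop with a random environment. The reward probabilities and the
# learning rate below are illustrative assumptions, not values from the book.

REWARD_PROB = [0.2, 0.5, 0.8]   # hypothetical environment: P(reward | action)
LEARNING_RATE = 0.05            # step size of the reward-inaction update
N_ACTIONS = len(REWARD_PROB)

p = [1.0 / N_ACTIONS] * N_ACTIONS  # start with equal action probabilities

for step in range(20000):
    # The automaton selects an action according to its probability vector.
    a = random.choices(range(N_ACTIONS), weights=p)[0]
    # The environment responds with a binary reinforcement (1 = reward).
    rewarded = random.random() < REWARD_PROB[a]
    if rewarded:
        # Reward: move probability mass towards the chosen action.
        p = [(1 - LEARNING_RATE) * pj for pj in p]
        p[a] += LEARNING_RATE
    # Inaction on penalty: the probability vector is left unchanged.

print("final action probabilities:", [round(pj, 3) for pj in p])
```

With these assumed values, the probability of the best-rewarded action typically converges towards one, illustrating how the automaton learns a good action from reinforcement alone.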
Robust Control: The Parametric Approach, by S.P. BHATTACHARYYA, H. CHAPELLAT and L.H. KEEL. Prentice Hall Information and System Sciences Series; Prentice Hall; Upper Saddle River, NJ, USA; 1995; 648 pp.; $73; ISBN: 0-13-781576-X
Reviewed by: Tadeusz KACZOREK, Warsaw University of Technology, Warsaw, Poland
The objective of this book is to present the parametric theory of robust control in a self-contained manner, using mathematical language. The book presents a unified, elegant approach to the robust stability theory of linear and nonlinear systems. The authors of the book are well known, and are very active in the field of robust control. They have contributed to the development of the generalized Kharitonov theorem, the theory of disc polynomials, the extremal properties of interval systems, the calculation of the real parametric stability margin, and design problems under simultaneous parametric and nonparametric uncertainty. The book consists of a preface, fifteen chapters and references. Chapter 0 presents some basic aspects of control systems, uncertainty models, robustness issues and a brief historical sketch of control theory. Chapter 1 provides a new look at classical stability
criteria for a single polynomial. A family of polynomials whose coefficients depend continuously on a set of parameters is considered. The boundary crossing theorem serves as the unifying idea for the entire subject of robust parametric stability. The Routh and Jury stability tests and the Hermite-Biehler theorem are derived. In Chapter 2, the stability of a line segment of polynomials joining two fixed endpoint polynomials is considered; such a segment is a convex combination of the two endpoints. This kind of problem arises in robust control problems containing a single uncertain parameter. A complete analysis of this problem for both the Hurwitz and Schur cases is given, and the results are summarized as the segment lemma. The vertex lemma and the real and complex convex direction lemmas are also proved. In Chapter 3, the problem of determining the robust stability of a parametrized family of polynomials, where the parameter is the set of polynomial coefficients, is considered. Procedures are developed for the determination of maximal stability regions in the space of coefficients of a polynomial. The largest l2 stability ball centered at a given point in the coefficient space is derived, and explicit formulas are developed for the Schur and Hurwitz cases. The graphical approach of Tsypkin and Polak for calculating the largest lp stability ball (for arbitrary p) in the coefficient space is given. The robust Hurwitz and Schur stability of a family of disc polynomials is also considered. In Chapter 4, the stability ball calculations developed in Chapter 3 are extended to accommodate interdependent perturbations among the polynomial coefficients. The radius of the largest stability ball in the space of real parameters is derived under the assumption that the uncertain parameters enter the characteristic polynomial coefficients linearly or affinely. Both ellipsoidal and polytopic uncertainty regions are considered. In the polytopic case, the stability testing property of the exposed edges, and some extremal properties of edges and vertices, are established. The graphical Tsypkin-Polak plot for stability margin calculation and the theory of linear disc polynomials are also presented. Chapter 5 is devoted to the robust stability of interval polynomial families and to Kharitonov's theorem, which is interpreted as a generalization of the Hermite-Biehler interlacing theorem. An important extremal property of the Kharitonov polynomials is established, and an application to robust state-feedback stabilization is given. The problem of Schur stability of interval polynomials is also considered. In Chapter 6, the edge theorem is stated and proved. This theorem allows the root space of a family of linearly parametrized systems to be determined constructively. The stability testing property of edges is also extended to nested polytopic families.
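For reference, Kharitonov's theorem discussed in Chapter 5 admits the following standard statement (an editorial illustration, not quoted from the book): for the interval family delta(s) = delta_0 + delta_1 s + ... + delta_n s^n with delta_i in [x_i, y_i], every member is Hurwitz stable if and only if the four Kharitonov polynomials formed from the extreme coefficient values are Hurwitz stable; a short numerical sketch applying this vertex test is given at the end of this review.

```latex
% Standard statement of Kharitonov's theorem (editorial illustration, not quoted from the book).
% Interval family: \delta(s) = \sum_{i=0}^{n} \delta_i s^i, with \delta_i \in [x_i, y_i].
\begin{align*}
K_1(s) &= x_0 + x_1 s + y_2 s^2 + y_3 s^3 + x_4 s^4 + x_5 s^5 + \cdots \\
K_2(s) &= x_0 + y_1 s + y_2 s^2 + x_3 s^3 + x_4 s^4 + y_5 s^5 + \cdots \\
K_3(s) &= y_0 + x_1 s + x_2 s^2 + y_3 s^3 + y_4 s^4 + x_5 s^5 + \cdots \\
K_4(s) &= y_0 + y_1 s + x_2 s^2 + x_3 s^3 + y_4 s^4 + y_5 s^5 + \cdots
\end{align*}
% Every member of the family is Hurwitz stable iff K_1, K_2, K_3, K_4 are Hurwitz stable.
```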
Chapter 7 deals with the Hurwitz stability of a family of polynomials consisting of a linear combination, with fixed polynomial coefficients, of interval polynomials. The generalized Kharitonov theorem provides a constructive solution to this problem by reducing it to the Hurwitz stability of a prescribed set of extremal line segments. It is shown that for a compensator to robustly stabilize the system, it is sufficient that it stabilizes a prescribed set of line segments in the plant parameter space. Under special conditions on the compensator, it suffices to stabilize the Kharitonov vertices. In Chapter 8, some extremal frequency-domain properties of linear interval control systems are developed. It is shown that the extremal segments possess boundary properties that are useful for generating the frequency-domain templates and the Nyquist, Bode and Nichols envelopes of linear interval systems. It is also proved that the worst-case gain, phase and parametric stability margins of control systems containing such a plant occur over this extremal set. Chapter 9 considers the robust stability and performance of control systems subjected to parametric uncertainty as well as unstructured perturbations. The parameter uncertainty is modelled through a linear interval system. Two types of unstructured uncertainty are considered: H∞ norm-bounded uncertainty and nonlinear sector-bounded perturbations. Robust versions of the small gain theorem and the absolute stability problem are presented. The Popov criterion and the circle criterion for nonlinear control systems are given. Chapters 10 and 11 deal with the robust stability of polynomials containing uncertain interval parameters that enter the coefficients multilinearly (affinely in each parameter). In Chapter 10, the mapping theorem is stated and proved. A computationally efficient solution to the robust stability problem is obtained by replacing the multilinear interval family with a test set consisting of a polytopic family. In Chapter 11, results on multilinear interval systems are developed through an extension of the generalized Kharitonov theorem, and the frequency-domain properties discussed in Chapters 7, 8 and 9 are applied to the multilinear case. Chapter 12 deals with parameter perturbations in state-space models. The mapping theorem is used to give an efficient solution to the robust stability of state-space systems under real parametric interval uncertainty. Some techniques for the calculation of robust parametric stability regions using Lyapunov theory are presented. The calculation of the real and complex stability radius, defined in terms of the operator norm of a feedback matrix, and some results on the Schur stability of nonnegative interval matrices are also given. Chapter 13 is devoted to some synthesis techniques. It is shown that a minimum-phase
interval plant can always be stabilized by a stable controller, regardless of the magnitude of the perturbations. By means of examples, it is shown how the Riccati equation approach can be used to deal with parametric perturbations. Chapter 14 deals with interval modelling, identification and control. Some examples of interval identification and design, applied to two experimental space structures, are described as an application demonstrating the practical use of the theory. In general, many well-selected examples illustrate the exposition of the theory. The proofs of the theorems are elegant, simple and insightful. The book is written with great clarity and expertise. It is addressed primarily to graduate students, but can also be recommended to practising engineers and applied scientists who are interested in robust control. The book makes an excellent fundamental contribution to the development of robust control theory.
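As a complement to the Kharitonov polynomials written out earlier in this review (again an editorial sketch, not material from the book), the following Python fragment checks the Hurwitz stability of the four Kharitonov polynomials for a hypothetical third-order interval family; the coefficient bounds are assumed purely for illustration.

```python
import numpy as np

# Editorial sketch (not from the book): check Hurwitz stability of the four
# Kharitonov polynomials of a hypothetical interval family
#   delta(s) = d0 + d1 s + d2 s^2 + d3 s^3,  with d_i in [lo_i, hi_i].
lo = np.array([1.0, 2.0, 3.0, 1.0])   # assumed lower coefficient bounds
hi = np.array([2.0, 3.0, 4.0, 1.5])   # assumed upper coefficient bounds

# Kharitonov coefficient patterns over the powers s^0, s^1, s^2, s^3, ...
# ('l' = lower bound, 'h' = upper bound, repeating with period four).
patterns = ["llhh", "lhhl", "hllh", "hhll"]

def kharitonov_poly(pattern):
    """One Kharitonov polynomial, coefficients ordered by ascending power."""
    return np.array([lo[i] if pattern[i % 4] == "l" else hi[i]
                     for i in range(len(lo))])

def is_hurwitz(coeffs_ascending):
    """A polynomial is Hurwitz if all of its roots have negative real parts."""
    roots = np.roots(coeffs_ascending[::-1])  # np.roots expects descending powers
    return bool(np.all(roots.real < 0))

stable = [is_hurwitz(kharitonov_poly(p)) for p in patterns]
print("Kharitonov polynomials Hurwitz:", stable)
print("Entire interval family Hurwitz:", all(stable))
```

With these assumed bounds all four vertex polynomials are Hurwitz, so, by Kharitonov's theorem, the entire interval family is Hurwitz stable.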
Adaptive Filter Theory, by Simon HAYKIN. Prentice Hall; Upper Saddle River, NJ, USA; 1996; 989 pp.; $84; ISBN: 0-13-322760-X
Reviewed by: Steve ROGERS, Boeing Missiles and Space Division, Huntsville, AL, USA
Over the years, adaptive filtering techniques have been successfully applied in many areas, including adaptive antennas, radar, sonar, seismology, biomedical engineering, and communications. With the advent of faster embedded processors, the use of advanced signal-processing techniques, including adaptive filtering, is becoming more common. Dr. Haykin, in this edition, has written the best text on adaptive filtering available today. His objectives are to present adaptive filtering in the context of finite impulse response (FIR) filters, and to provide a discussion of neural networks applied to nonlinear adaptive filtering. This well-organized text is divided into four parts: background material (Chapters 1 to 4), linear optimum filters (Chapters 5 to 7), linear adaptive filters (Chapters 8 to 17), and nonlinear adaptive filters (Chapters 18 to 20). The background material reviews digital signal-processing design and analysis as applied to adaptive filters. The linear optimum filter section covers Wiener filters, linear prediction, and Kalman filters. The section on linear adaptive filters discusses least-mean-square (LMS) algorithms and recursive least-squares (RLS) algorithms. The linear adaptive algorithms are presented from both time- and frequency-domain perspectives. Wiener filtering is associated with the LMS algorithm, and Kalman