Backpropagation: Theory, architectures, applications

Book Reviews

As already mentioned, a substantial portion of the book is devoted to fuzzy neurocomputation. Chapter 3 introduces the basic constructs (processing units) of logic neurons and discusses various architectures, including those of logic processors. Subsequently, Chapter 4, entitled "Fuzzy Neurocomputing", provides the reader with a wealth of interesting and diverse problems formulated in the framework of fuzzy neural networks: optimal vector quantization, fuzzy computational memories, and decomposition of relations, to name a few. Further examples of fuzzy neural networks are covered in the subsequent chapters, such as fuzzy flip-flops, Petri nets and learning processes, and fuzzy controllers.

The book treats the introductory material on fuzzy sets differently than most (if not all) other books in the fuzzy sets area by organizing all the pertinent fundamentals into a series of three appendices (A. Fuzzy Sets: Notions, Operations; B. Relations and Fuzzy Relations, Fuzzy Relational Equations, Extension Principle; C. Fuzzy Sets and Probability). While somewhat unusual, this is an ideal way of exposing the reader to the necessary fundamentals of fuzzy sets while avoiding much of the esoteric content not essential to the fundamentals or applications of fuzzy sets. In fact, to gain a good understanding of what fuzzy sets are all about and how they work, it is usually enough to go through a brief introduction to the subject, provided it is as carefully prepared and well thought out as the one presented in this book.

In summary, the book is an outstanding and highly valuable contribution to fuzzy neural networks; it is strongly recommended to any reader interested in this fascinating realm of knowledge-based neurocomputation. On the whole, the versatile format of the book makes it an ideal textbook or reference source.

Krzysztof Cios

Backpropagation: Theory, Architectures, Applications, by Yves Chauvin and David E. Rumelhart (eds). Lawrence Erlbaum, Hillsdale, NJ, and Hove, UK, 1995. ISBN 0-8058-1258-X, pp. 561.

In 1986 McClelland and Rumelhart published the famous PDP book. Its chapter eight on backpropagation made history, inspiring a wave of interest and research in neural networks (NN). Almost a decade later, Chauvin and Rumelhart, acting both as editors and contributors, have published a volume devoted to backpropagation. This time there is no breakthrough; the goal of the book is much more modest. As set forth in the preface, it is a progress report on the theory, architectures, and applications of backpropagation.

The book consists of 15 chapters, in fact a selection of articles prepared by various authors. The first chapter is an introduction. The remaining chapters vary in their contents and scope of discussion and can be roughly classified into three categories. Chapters in the first category discuss a single application, such as phoneme recognition (chapter 2), flare-phase control of a landing aircraft (chapter 3), representation of a finite-state environment (chapter 11), or fingerprint matching (chapter 14). Chapters in the second category present specific algorithms or architectures and illustrate them using one or several case studies. Papers in this category address the following issues: recurrent backpropagation networks (chapter 4), the focused backpropagation algorithm (chapter 5), networks for nonlinear control (chapter 6), distal supervised learning (chapter 7), simple recurrent networks (chapter 9), and the use of self-organization as a first phase of the learning process (chapter 10). Finally, there is a third category of chapters which I would call theoretical.
These chapters either present a generalized view of backpropagation, like chapter 8 (Backpropagation: comments and variations) and chapter 15 (Unified perspective on gradient descent learning), or are devoted to specific theoretical issues, like chapter 12, which presents a mathematical analysis of learning in linear networks, or chapter 13, devoted to the computational complexity of learning in recurrent networks. Most of the articles making up the volume have never been published before. However, there are also chapters that are either modifications of earlier publications (chapters 9, 12 and 15) or reprints (chapters 2 and 8). The chapters present more or less the same level of discussion. An exception is chapter 2, where the authors, concentrating their discussion on secondary technical details, neglect to give the reader a deeper and broader view of the ideas behind the adopted solution (TDNN).

One of the usual flaws of books that are selections of articles is notational and terminological incoherence. Unfortunately, in this respect the book is no exception. Obviously this is not a major issue, but it is definitely a nuisance. Also a nuisance are the numerous typing errors, especially those found in the mathematical formulas. Another, more serious flaw emerges when you look at the references at the end of each chapter: hardly any of them goes beyond 1991! As progress in NN did not stop in 1991, one can hardly avoid the impression that this progress report is somewhat outdated.

For many readers, ploughing through the book will be a challenge. The preface lists two types of potential readers: "...students in the field of Artificial Neural Networks..." and "...professionals who are looking for concrete applications...". This might be somewhat misleading unless one adds that, in order to understand and benefit from the book, an excellent background in mathematics and neural networks is a must. Even so, reading at least some of the chapters (like chapter 3) may still require additional study of the subject matter discussed. If you think that this is not a problem, you will definitely find the book an interesting and valuable source of knowledge, ideas, and inspiration. After all, the authors are recognized experts in the field.

Dr. Jacek Witaszek
Warsaw Technical University