Neurocomputing 2 (1990/91) 131-133, Elsevier
Conference report

International Joint Conference on Neural Networks - IJCNN 90
June 17-21, 1990, San Diego, CA, USA

R.C. Johnson
Cognizer Connection, 333 S. State Street, Suite V 141, Portland, OR 97034, USA
The International Joint Conference on Neural Networks in Summer 1990 witnessed new microchips, new systems and a hybrid approach that enlists the help of fuzzy logic.

Perhaps the most interesting development at the conference was a new microchip architecture from Adaptive Solutions Inc. (Beaverton, OR). This chip set is the first general-purpose microprocessor for neural applications. Adaptive Solutions has taken an all-digital approach to simulating any neural-network model with downloadable microcode. The two-chip set runs at 12.8 billion connections per second (CPS) using one-bit synaptic weights and at 1.6 billion CPS using 16-bit weights. It learns at from 250 million to 300 million connection updates per second (CUPS), about 100 times faster than current neural-accelerator boards.

Adaptive Solutions uses a virtual-node concept to accomplish on-chip learning. A single layer of 64 neurons is time-division multiplexed among the various layers in a particular neural architecture. The user's chosen learning method is downloaded into a writable control store on the microsequencer chip. That one chip controls any number of processing chips by issuing 31-bit-wide microcode to them. Each neuron has 4 kbytes of memory, yielding a total of 2M 1-bit connections, 256k 8-bit connections or 128k 16-bit connections per processing chip. Adaptive Solutions is using a standard CMOS fabrication technology, but will not reveal the identity of the company manufacturing the chips for it (although both Mitsubishi and Sharp have resident engineers in its offices, reportedly working on 'applications' of the chip set).
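The virtual-node idea can be illustrated with a short sketch. This is a hypothetical simulation of the concept, not Adaptive Solutions' actual microcode: a single bank of 64 physical processing elements is reused, time slice by time slice, to evaluate each layer of a larger network.

```python
import numpy as np

PHYSICAL_NEURONS = 64  # one physical layer, as on the chip

def forward(layer_weights, x):
    """Evaluate a multi-layer net on a fixed bank of 64 processing elements.

    Layers wider than 64 neurons are handled by time-division
    multiplexing: the bank is reused in slices of 64 virtual nodes.
    """
    for W in layer_weights:          # W has shape (n_out, n_in)
        n_out = W.shape[0]
        y = np.empty(n_out)
        for start in range(0, n_out, PHYSICAL_NEURONS):
            stop = min(start + PHYSICAL_NEURONS, n_out)
            # one time slice: the 64 physical neurons compute
            # up to 64 virtual nodes' weighted sums
            y[start:stop] = np.tanh(W[start:stop] @ x)
        x = y
    return x

rng = np.random.default_rng(0)
# a 128-64-10 network: the 128-wide layer takes two time slices
weights = [rng.normal(size=(128, 32)), rng.normal(size=(64, 128)),
           rng.normal(size=(10, 64))]
out = forward(weights, rng.normal(size=32))
print(out.shape)  # (10,)
```

The same trick lets the fixed silicon simulate arbitrarily wide layers at a proportional cost in time rather than hardware.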
Accelerator cards

At the conference several new neural-network accelerator boards were introduced. Neurocomputing Ltd. showed its Astracards for the PC. For the Macintosh there was a system from Ornicon Corp., and for Compaq there was one from Ford Aerospace Corp.

Neurocomputing Ltd.'s (Alresford, England) PC-based boards were developed in conjunction with Cambridge University. The resulting line of Astracards acts as neural-network accelerator boards for AT-bus PCs. The neural accelerators process from 16 to 33 million CPS in feedforward mode, or 3 to 5 million CUPS while learning.
The Astracard works with Nestor's software, NeuralWare's NeuralWorks, Olmstead & Watkins' C-language libraries and Neurun, a graphical backpropagation network development environment. Astracards can also be linked to traditional expert-system tool-kits as a prototyping aid or for final deployment.

So far there are two Astracards, one using TI's 320C30 digital-signal processor (DSP) and one using Intel's i860 processor; Neurocomputing is also working on a board using Neural Semiconductor's microchips. The TI-based card achieves 33 MFLOPS while the Intel-based card tops out at 66 MFLOPS. Both plug into the AT bus and run under MS-DOS, Windows or Unix. The TI-based card processes 16 million CPS, or 3 million CUPS while learning; its total capacity is 3 million neurons and connections. The Intel-based card can process 33 million CPS (5 million CUPS) and control a total net size of 8 million neurons and connections. Both Astracards support Microsoft's Fortran and C-language compilers, and both can lend their on-board memories to the host PC, for RAM disks or memory expansion, when they are not being used for neural simulations.

Ornicon Corp. (San Diego, CA) offered its Sonnet, a System for Operating Neural Nets. Sonnet is based on a Mac II host with neural- and signal-processing cards plugged into it. Ornicon recommends the system for signal detection and classification, industrial inspection, diagnostic imaging, reconnaissance, surveillance and process control. Sonnet can perform pattern classification using a realtime front-end signal preprocessor and feature extractor plus a back-end neural-network simulation, also running in realtime. A graphical editor and on-screen electronic notebook help define network topology and direct the learning process. The training method is backpropagation of errors with user-variable training rates that can be used like simulated annealing. Training can terminate after a certain number of passes or when a mean-squared error level has been reached.
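The training controls described here, a gradually reduced rate used like simulated annealing plus a pass-count or mean-squared-error stopping criterion, can be sketched generically. The names, the decay schedule and the toy single-layer model below are assumptions for illustration, not Ornicon's implementation:

```python
import numpy as np

def train(X, t, max_passes=500, mse_target=1e-3,
          rate0=0.5, decay=0.995):
    """Gradient-descent training with an annealing-style rate schedule.

    Stops after max_passes, or once the mean-squared error falls
    below mse_target - the two criteria the report describes.
    """
    rng = np.random.default_rng(1)
    w = rng.normal(scale=0.1, size=X.shape[1])
    rate = rate0
    for epoch in range(max_passes):
        y = X @ w                          # linear output layer
        err = y - t
        mse = float(np.mean(err ** 2))
        if mse < mse_target:               # error criterion met
            return w, mse, epoch
        w -= rate * (X.T @ err) / len(t)   # gradient step
        rate *= decay                      # gradually cool the rate
    return w, mse, max_passes              # pass-count criterion met

# learn the target t = 2*x0 - x1 from sampled data
X = np.random.default_rng(2).normal(size=(200, 2))
t = 2 * X[:, 0] - X[:, 1]
w, mse, passes = train(X, t)
```

Starting with a large rate and decaying it lets the net take big steps early and settle gently later, which is what makes the user-variable rate behave like an annealing temperature.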
After training, Sonnet can classify incoming signals in realtime, issuing alarms when it detects certain patterns. Systems designed with Sonnet can also be accelerated further with Neural Semiconductor's microchips. The base system comes with a signal-processing board sporting a 30 kHz sample rate. It can perform preprocessing tasks, such as 64- to 8192-point FFTs, continuously in realtime. It can accept traditional analog inputs as well as digital inputs from a file, TCP/IP network or RS-422 port.

Ford Aerospace Corp. (Houston, TX) offered its Neural Emulation Tool (NET), a hardware simulator based on a Compaq host computer with Texas Instruments TMS320C30 digital signal processors (DSPs) on boards in its backplane. The five-slot backplane can be populated with from 5 to 17 DSPs. The system architecture provides both private and common memory areas for easy construction of neural simulations.

In operation the Ford neural simulator runs at 147 million CPS doing backpropagation of errors in feedforward mode with four processor boards in parallel; a single board achieves 40 million CPS. In training, a single processor board learns at 4.2 million CUPS on a large backpropagation network, and four boards learn at 15.6 million CUPS. The Compaq host can be used for foreground processing (wordprocessing, spreadsheets or other work) while a large network is training in the background.

Ford supplies the source code for a backpropagation-of-errors simulation. It also supports other methods, including counterpropagation, adaptive resonance (ART-1 and -2), a feature-map classifier and the Barto ARP, with spatio-temporal networks planned. A C compiler is also available for its boards.
Fuzzy logic

At the conference, one company was demonstrating a hybrid system using both neural networks and fuzzy logic. Togai InfraLogic Inc. (Irvine, CA) used the fuzzy associative memory (FAM) concept invented by University of Southern California's Professor Bart Kosko.

Geometrically, a fuzzy set maps to a point in the unit hypercube. The corners of the hypercube are the crisp points of traditional logic, whereas the points inside the cube correspond to fuzzy sets. Each dimension of a given hypercube corresponds to a parameter of the problem. Usually, a problem has a given input space and an output space, defined geometrically as two fuzzy-system hypercubes. A given fuzzy system is a mapping between these hypercubes, and FAMs contain the maps between these fuzzy hypercubes.

The FAM operates like a trained neural network in the sense that it maps points between an input space and an output space with a network operating in parallel. But a neural network learns its mapping from example input-output pairs, whereas a fuzzy system is specified manually. Kosko has modified the FAM's transformation maps to enlist the help of a neural learning method. In effect, he can reverse-engineer a fuzzy system's rules with a neural-network front end.

FAMs solve the neural training dilemma: too little training and a neural network is unreliable when dealing with items outside its training set, but too much training and it might as well be a look-up table of every possible input-output pair. FAMs learn in one pass over the training data, but not like a look-up table - rather like a set of weighted principles that can be inspected and perfected one by one.

For control applications, human or automatic controllers generate a typical stream of input-output data. Adaptive FAMs convert this data to weighted FAM rules using a neural-network learning method. The fuzzy system learns which control inputs cause which control outputs, and the number of examples of each transformation in the training data set determines the weight accorded to each FAM rule.
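The adaptive-FAM idea can be sketched in one pass over data: each input-output example votes for the rule linking its best-matching input and output categories, and the vote counts become the rule weights. This is a toy illustration with invented category names and triangular membership functions, not Togai InfraLogic's product:

```python
from collections import Counter

def grade(x, lo, mid, hi):
    """Triangular membership of x in a fuzzy category."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (mid - lo) if x < mid else (hi - x) / (hi - mid)

# fuzzy categories (adjectives) over a 0-100 input and output range,
# overlapping so each category blends into its neighbours
IN_SETS = {"cold": (-1, 0, 50), "warm": (0, 50, 100), "hot": (50, 100, 101)}
OUT_SETS = {"low": (-1, 0, 50), "medium": (0, 50, 100), "high": (50, 100, 101)}

def best(sets, x):
    """Category in which x has the highest membership grade."""
    return max(sets, key=lambda name: grade(x, *sets[name]))

def learn_fam_rules(pairs):
    """One pass over (input, output) data: count how often each
    input-category/output-category pairing occurs; the normalized
    counts are the FAM rule weights."""
    votes = Counter((best(IN_SETS, x), best(OUT_SETS, y)) for x, y in pairs)
    total = sum(votes.values())
    return {rule: n / total for rule, n in votes.items()}

# controller data: hotter inputs drive higher outputs
data = [(10, 5), (15, 10), (45, 40), (55, 60), (90, 95), (85, 90)]
rules = learn_fam_rules(data)
```

Unlike a look-up table, the result is a handful of weighted, human-readable rules, such as ('hot', 'high'), that can be inspected and perfected one by one.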
Fuzzy only

At the conference, HyperLogic Corp. (Escondido, CA) showed a decision system for natural-language users that uses fuzzy logic. Like a fuzzy expert-system shell, CubiCalc allows a set of rules about an application to be written and then evaluated against realtime inputs, but in ordinary language rather than by a knowledge engineer.

HyperLogic, founded by Fred Watkins, the principal in the Olmstead and Watkins neural-network company, designed CubiCalc for ordinary users. To all appearances it functions as a user-programmable generic application. But unlike other such generic applications, like spreadsheets, fuzzy logic can be used to specify approximate rules of thumb rather than specific formulae.

A CubiCalc system consists of a set of fuzzy rules about an application, written in a language customized for the specific application. To create one, you first break an application into variables, each with a set of adjective modifiers describing the range of values it may take on. You specify the variables that can be measured by typing in their names and ranges on a fill-in-the-blank screen. Then you define the adjectives, which are the fuzzy categories into which the variables will be cast. Each range gradually blends into the adjacent one with a user-specifiable speed. After the variables and adjectives are defined, you enter as many rules about the situation as you know. Any number of such rules can be entered and simultaneously compared against the current situation in realtime.

A CubiCalc fuzzy system gets its inputs from disk files of historical data, such as the past movement of stock indices (useful for fine-tuning rules), or from realtime inputs from the outside world, such as temperature sensors. As a precursor to using live data, a simulation mode is provided so that an artificial environment can test out a newly formed rule set.
In the artificial environment, all the variables of the real environment are modeled over their full ranges in the fuzzy system.
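The rule-evaluation step described above, many approximate rules fired simultaneously against the current inputs, amounts to standard fuzzy inference. The sketch below is generic; the variable names, adjectives and rules are invented for illustration and are not CubiCalc's actual language:

```python
def trapmf(x, a, b, c, d):
    """Trapezoidal membership: ramps up a..b, flat b..c, ramps down c..d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# variable 'pressure' (0-10 bar) with adjectives that blend into each other
PRESSURE = {"low": (-1, 0, 2, 4), "normal": (2, 4, 6, 8), "high": (6, 8, 10, 11)}

# each rule pairs a pressure adjective with a suggested valve setting
RULES = [("low", 9.0),     # IF pressure IS low THEN valve IS wide-open
         ("normal", 5.0),  # IF pressure IS normal THEN valve IS half
         ("high", 1.0)]    # IF pressure IS high THEN valve IS nearly-shut

def infer(pressure):
    """Fire every rule at once; combine the suggestions weighted by
    each rule's firing strength (its membership grade)."""
    fired = [(trapmf(pressure, *PRESSURE[adj]), out) for adj, out in RULES]
    total = sum(s for s, _ in fired)
    return sum(s * out for s, out in fired) / total if total else None

valve = infer(3.0)  # pressure between 'low' and 'normal'
```

Because the adjectives overlap, an in-between input fires two rules partially, and the output blends their recommendations rather than jumping between crisp cases.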