Relaxation constants in neural nets: Theoretical results
E. Barnard and D. Casasent. Center for Excellence in Optical Data Processing, Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA 15213 USA.

Many neural paradigms involve at least one relaxation constant, e.g., the "learning" and "momentum" constants of Backward Error Propagation, or the step-size constant used for gradient descent in the Hopfield model. These constants are usually chosen heuristically, and neural modellers have made a variety of suggestions on what their values should be. We extend the standard analysis of the convergence properties of gradient descent to obtain additional theoretical insight into the role of these constants. This allows us to make new suggestions concerning the proper way of choosing them. Our analysis is tested by way of simulation; we find that an appreciable increase in performance results from a proper choice of relaxation constants.
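As a minimal illustrative sketch (not taken from the paper), the snippet below shows the kind of dependence on relaxation constants that the standard convergence analysis describes: gradient descent with a step-size ("learning") constant eta and a heavy-ball "momentum" constant beta on a simple quadratic whose Hessian eigenvalues determine stability. The quadratic, the eigenvalues, and the specific constant values are assumptions chosen only for demonstration.

import numpy as np

# Hypothetical quadratic objective f(w) = 0.5 * w^T A w, minimum at the origin.
# Its Hessian eigenvalues (1 and 10) govern the stability of gradient descent.
A = np.diag([1.0, 10.0])

def grad(w):
    return A @ w

def descend(eta, beta, steps=200):
    """Gradient descent with momentum: v <- beta*v - eta*grad(w); w <- w + v."""
    w = np.array([1.0, 1.0])
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v - eta * grad(w)
        w = w + v
    return np.linalg.norm(w)  # distance from the true minimum

# For this quadratic, plain gradient descent (beta = 0) is stable only if
# eta < 2 / lambda_max = 0.2; adding momentum shifts that trade-off.
for eta, beta in [(0.19, 0.0), (0.21, 0.0), (0.19, 0.5)]:
    print(f"eta={eta}, beta={beta}: final error {descend(eta, beta):.2e}")

Running the loop shows convergence at eta = 0.19, divergence just past the stability bound at eta = 0.21, and markedly faster convergence when the momentum constant is added, which is the qualitative behavior a principled choice of relaxation constants aims to exploit.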