Copyright © IFAC Artificial Intelligence in Real-Time Control, Arizona, USA, 1998
INITIAL WEIGHTS AND TRAINING OF TIME SERIES NEURAL NETWORK IN BO PREDICTION SYSTEM

PeiFeng Hao, XinHe Xu and DingYu Xue
Control and Simulation Research Centre, Northeastern University, Shenyang 110006, P.R. China
Email:
[email protected]
Extended Abstract

In the steelmaking industry, a breakout (BO) event during casting is very destructive. In recent years, neural network (NN) technology has been used to predict BO events at several institutes and steelmaking companies, and some progress has been reported. An existing prediction system at the BAO Steel Company in China has had problems with the training of its time series network, e.g., convergence speed (and hence training time), manipulation of the sample space, and so on. In this paper four particular problems are investigated extensively:

1) The use of a simulated annealing algorithm. A power factor theta is introduced into the sigmoidal function, such that f(x) = 1/(1 + e^{-θx}), which can be used to speed up the training process.
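For illustration only, the following minimal MATLAB fragment shows a sigmoid with a power factor theta; the exponential cooling schedule applied to theta here is an assumption made for the sketch, not the schedule used by the paper's simulated annealing algorithm.

    % Sigmoid with a power factor theta: f(x) = 1/(1 + exp(-theta*x)).
    % The exponential cooling of theta below is assumed for illustration.
    theta0 = 4;                                  % assumed initial power factor
    tau    = 30;                                 % assumed cooling constant (in epochs)
    f = @(x, theta) 1 ./ (1 + exp(-theta .* x)); % steeper than the standard Logsig when theta > 1
    theta_at = @(epoch) 1 + (theta0 - 1) * exp(-epoch / tau);  % anneals towards theta = 1
    y = f(0.5, theta_at(10));                    % example evaluation at epoch 10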
2) The initial weight assignment for the time series network, using the method described in thesis [8]. The initial weight matrices W1, B1 and B2 are assigned randomly, and the second-layer weight matrix W2 is evaluated analytically. With this method the number of training cycles of the BP neural network can be reduced significantly, and better local or even global minima of the BP network can be reached. The algorithm is easy to implement on a computer (a sketch is given at the end of this abstract).

3) The manipulation of the sample space. Samples with identical characteristics were deleted, so that only a few such samples are used in training.

4) The convergence of the training process for the time series neural network. Two training programs were developed, one written in C and the other in Matlab. Two kinds of sigmoid functions were used, the Logsig function and the Tansig function.

In the construction of the time series neural network, all of the above proved successful and good results have been obtained. The real system performs satisfactorily.
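As a rough illustration of the initial assignment in point 2, the MATLAB sketch below assigns W1, B1 and B2 randomly and then computes W2 from the hidden-layer outputs by a least-squares fit; the least-squares step stands in for the analytic evaluation of thesis [8], whose exact form is not reproduced here, and the data sizes are placeholders.

    % Sketch of the initial-weight assignment (point 2).
    % ASSUMPTION: W2 is obtained by a least-squares fit of the hidden-layer
    % outputs to the targets; the analytic method of [8] is not reproduced.
    P = rand(6, 200);                    % placeholder inputs (6 features x 200 samples)
    T = rand(1, 200);                    % placeholder targets (BO / no BO)
    nh = 10;                             % assumed number of hidden nodes
    W1 = rand(nh, size(P,1)) - 0.5;      % random first-layer weights
    B1 = rand(nh, 1) - 0.5;              % random first-layer biases
    B2 = rand(size(T,1), 1) - 0.5;       % random second-layer biases
    H  = 1 ./ (1 + exp(-(W1*P + B1*ones(1, size(P,2)))));  % Logsig hidden outputs
    W2 = (T - B2*ones(1, size(T,2))) * pinv(H);            % second-layer weights by least squares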
References

[1] Lodewyk F.A. Wessels and Etienne Barnard, "Avoiding False Local Minima by Proper Initialization of Connections," IEEE Trans. on Neural Networks, 3(6): 899-905, 1992.
[2] Jacques de Villiers and Etienne Barnard, "Backpropagation Neural Nets with One and Two Hidden Layers," IEEE Trans. on Neural Networks, 4(1): 136-141, 1993.
[3] Roberto Battiti, "Using Mutual Information for Selecting Features in Supervised Neural Learning," IEEE Trans. on Neural Networks, 5(4): 537-550, 1994.
[4] L.F.A. Wessels, E. Barnard and E. Van Rooyen, "The Physical Correlates of Local Minima," in Proc. Int. Neural Network Conf. (Paris), July 1990.
[5] D.E. Rumelhart, G.E. Hinton and R.J. Williams, "Learning Internal Representations by Error Propagation," in Parallel Distributed Processing, Cambridge, MA: MIT Press, 1986, ch. 8, pp. 318-362.
[6] Hao Peifeng and Xu Xinhe, "The Number of Hidden Nodes and the Initial Weights of BP Neural Network," Proceedings of the '96 International Conference on High & New Technology and Traditional Industry, Dandong, China, July 1996.
[7] Hao Peifeng, Huang Taishan and Xu Xinhe, "The Study on Both BP Algorithm and OLL Algorithm," Proceedings of ASCC, Seoul, Korea, July 1997.
[8] Hao Peifeng, "The Application Study with Neural Network in BO Prediction System Construction," Ph.D. thesis, March 1997.