Abstract
Error Back Propagation (EBP) is now the most widely used training algorithm for Feedforward Artificial Neural Networks (FFANNs). However, it is very slow when it does converge, especially if the network size is not large compared to the problem at hand. In this work we propose certain simple but effective modifications to the EBP algorithm to improve its convergence. The methods proposed here watch for oscillation of weights as the training algorithm proceeds. When such an oscillation is observed, the learning rate for only that weight is temporarily reduced. We have found a few novel and effective methods for temporary reduction of the learning rate of an individual oscillating weight. To study the performance of our modified learning algorithm we have done extensive simulation with one complex learning problem, and the simulation results compare favorably for our modified algorithms. Our modified EBP algorithms, like the EBP algorithm, monotonically reduce the sum of squared errors as the training proceeds.
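The core idea in the abstract — detect per-weight oscillation and temporarily shrink only that weight's learning rate — can be sketched as follows. This is an illustrative sketch, not the paper's exact rule: it treats a sign change in a weight's gradient between successive steps as oscillation, and the function name, the `reduce_factor`, and the restore-to-base behavior are assumptions made here for clarity.

```python
import numpy as np

def damped_ebp_step(w, grad, prev_grad, lr, base_lr=0.1, reduce_factor=0.5):
    """One gradient-descent step with per-weight oscillation damping.

    A sign flip between a weight's current and previous gradient is
    taken as oscillation; that weight's learning rate is temporarily
    reduced, while non-oscillating weights keep the base rate.
    (Sketch only; the paper proposes several reduction rules.)
    """
    oscillating = grad * prev_grad < 0            # sign flipped -> oscillation
    lr = np.where(oscillating, lr * reduce_factor, base_lr)
    w = w - lr * grad                             # standard EBP weight update
    return w, lr
```

For example, if one component's gradient flips sign from +0.5 to -0.5 between steps, only that component's rate is halved while the other weights continue at the base rate.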
Original language | English (US) |
---|---|
Title of host publication | 1993 IEEE International Conference on Neural Networks, ICNN 1993 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 21-26 |
Number of pages | 6 |
Volume | 1993-January |
ISBN (Electronic) | 0780309995 |
DOIs | |
State | Published - Jan 1 1993 |
Event | IEEE International Conference on Neural Networks, ICNN 1993 - San Francisco, United States; Duration: Mar 28 1993 → Apr 1 1993 |
Other
Other | IEEE International Conference on Neural Networks, ICNN 1993 |
---|---|
Country/Territory | United States |
City | San Francisco |
Period | 3/28/93 → 4/1/93 |
ASJC Scopus subject areas
- Software