Back-propagation algorithm with controlled oscillation of weights

Yolanda M. Pirez, Dilip Sarkar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Scopus citations


Error Back Propagation (EBP) is now the most widely used training algorithm for Feedforward Artificial Neural Networks (FFANNs). However, it is very slow when it does converge, especially if the network is not large compared to the problem at hand. In this work we propose simple but effective modifications to the EBP algorithm to improve its convergence. The methods proposed here watch for oscillation of weights as training proceeds. When such an oscillation is observed, the learning rate for that weight alone is temporarily reduced. We have found a few novel and effective methods for temporarily reducing the learning rate of an individual oscillating weight. To study the performance of our modified learning algorithms, we have carried out extensive simulation on one complex learning problem. The simulation results compare our modified algorithms favorably with standard EBP. Our modified EBP algorithms, like the EBP algorithm, monotonically reduce the sum-of-squared error as training proceeds.
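The core idea described in the abstract can be sketched as follows: detect per-weight oscillation as a sign change of a weight's gradient between consecutive steps, and temporarily shrink that weight's individual learning rate while it oscillates. This is only an illustrative sketch under that assumption; the function name `oscillation_damped_step`, the sign-change test, and the halving factor `damp` are hypothetical choices, not the paper's exact reduction rules.

```python
import numpy as np

def oscillation_damped_step(w, grad, prev_grad, lr, base_lr=0.1, damp=0.5):
    """One gradient-descent step with per-weight oscillation damping.

    A weight is treated as oscillating when its gradient has changed
    sign since the previous step (a common heuristic; the paper's own
    oscillation test may differ). An oscillating weight's learning
    rate is temporarily reduced by the factor `damp`; a non-oscillating
    weight has its rate restored to `base_lr`.
    """
    oscillating = grad * prev_grad < 0          # sign flip => oscillation
    lr = np.where(oscillating, lr * damp, base_lr)
    w = w - lr * grad                           # ordinary descent update
    return w, lr

# Example: weight 0 oscillates (gradient flipped sign), weight 1 does not.
w = np.array([1.0, 1.0])
grad = np.array([0.5, -0.5])
prev_grad = np.array([-0.5, -0.5])
lr = np.full(2, 0.1)
w_new, lr_new = oscillation_damped_step(w, grad, prev_grad, lr)
# lr_new[0] is halved to 0.05; lr_new[1] stays at the base rate 0.1
```

Because only the oscillating weight's rate is reduced, the remaining weights continue to learn at full speed, which is what distinguishes this scheme from simply lowering a single global learning rate.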

Original language: English (US)
Title of host publication: 1993 IEEE International Conference on Neural Networks, ICNN 1993
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 0780309995
State: Published - Jan 1 1993
Event: IEEE International Conference on Neural Networks, ICNN 1993 - San Francisco, United States
Duration: Mar 28 1993 to Apr 1 1993


Other: IEEE International Conference on Neural Networks, ICNN 1993
Country/Territory: United States
City: San Francisco

ASJC Scopus subject areas

  • Software


