Back-propagation algorithm with controlled oscillation of weights

Yolanda M. Pirez, Dilip Sarkar

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

4 Citations (Scopus)

Abstract

Error Back Propagation (EBP) is now the most widely used training algorithm for Feedforward Artificial Neural Networks (FFANNs). However, it converges slowly, if it converges at all, especially when the network size is not large relative to the problem at hand. In this work we propose simple but effective modifications to the EBP algorithm to improve its convergence. The methods proposed here watch for oscillation of weights as the training algorithm proceeds. When such an oscillation is observed, the learning rate for only that weight is temporarily reduced. We have found a few novel and effective methods for temporarily reducing the learning rate of an individual oscillating weight. To study the performance of our modified learning algorithms we have performed extensive simulations on a complex learning problem. The simulation results compare our modified algorithms favorably. Our modified EBP algorithms, like the EBP algorithm, monotonically reduce the sum of squared errors as the training proceeds.
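The per-weight rule described in the abstract, detecting oscillation of a weight and temporarily reducing the learning rate for that weight alone, can be sketched as below. The paper's specific reduction methods are not reproduced here; the sign-flip oscillation test and the multiplicative reduce/recover factors are illustrative assumptions.

```python
import numpy as np

def ebp_step(w, grad, lr, prev_grad,
             reduce_factor=0.5, recover_factor=1.05, max_lr=0.19):
    """One back-propagation weight update with per-weight oscillation control.

    A weight counts as oscillating when its gradient changes sign between
    consecutive steps; that weight's learning rate is then temporarily
    reduced, and drifts back toward max_lr otherwise.  (The multiplicative
    reduce/recover factors are illustrative assumptions, not the paper's
    specific reduction methods.)
    """
    oscillating = grad * prev_grad < 0                  # sign flip, per weight
    lr = np.where(oscillating,
                  lr * reduce_factor,
                  np.minimum(lr * recover_factor, max_lr))
    return w - lr * grad, lr

# Toy demonstration: a 1-D quadratic E(w) = 5 w**2 with gradient 10 w.
# At the fixed rate lr = 0.19, plain gradient descent oscillates
# (update factor 1 - 10 * 0.19 = -0.9); the controlled version detects
# the sign flip and halves that weight's rate, damping the oscillation.
w = np.array([1.0])
lr = np.array([0.19])
prev_grad = np.zeros(1)
for _ in range(50):
    grad = 10.0 * w
    w, lr = ebp_step(w, grad, lr, prev_grad)
    prev_grad = grad
print(float(np.abs(w[0])))
```

Because the rate is reduced only for the oscillating weight, well-behaved weights keep their larger step sizes, which is what lets the modified algorithm converge faster than uniformly shrinking the global learning rate.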

Original language: English (US)
Title of host publication: 1993 IEEE International Conference on Neural Networks, ICNN 1993
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 21-26
Number of pages: 6
Volume: 1993-January
ISBN (Electronic): 0780309995
DOI: 10.1109/ICNN.1993.298537
State: Published - Jan 1 1993
Event: IEEE International Conference on Neural Networks, ICNN 1993 - San Francisco, United States
Duration: Mar 28 1993 - Apr 1 1993

Other

Other: IEEE International Conference on Neural Networks, ICNN 1993
Country: United States
City: San Francisco
Period: 3/28/93 - 4/1/93


ASJC Scopus subject areas

  • Software

Cite this

Pirez, Y. M., & Sarkar, D. (1993). Back-propagation algorithm with controlled oscillation of weights. In 1993 IEEE International Conference on Neural Networks, ICNN 1993 (Vol. 1993-January, pp. 21-26). [298537] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICNN.1993.298537

@inproceedings{b9f37416c36545e695a1ad287ba456c6,
title = "Back-propagation algorithm with controlled oscillation of weights",
abstract = "Error Back Propagation (EBP) is now the most widely used training algorithm for Feedforward Artificial Neural Networks (FFANNs). However, it converges slowly, if it converges at all, especially when the network size is not large relative to the problem at hand. In this work we propose simple but effective modifications to the EBP algorithm to improve its convergence. The methods proposed here watch for oscillation of weights as the training algorithm proceeds. When such an oscillation is observed, the learning rate for only that weight is temporarily reduced. We have found a few novel and effective methods for temporarily reducing the learning rate of an individual oscillating weight. To study the performance of our modified learning algorithms we have performed extensive simulations on a complex learning problem. The simulation results compare our modified algorithms favorably. Our modified EBP algorithms, like the EBP algorithm, monotonically reduce the sum of squared errors as the training proceeds.",
author = "Pirez, {Yolanda M.} and Dilip Sarkar",
year = "1993",
month = "1",
day = "1",
doi = "10.1109/ICNN.1993.298537",
language = "English (US)",
volume = "1993-January",
pages = "21--26",
booktitle = "1993 IEEE International Conference on Neural Networks, ICNN 1993",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY - GEN

T1 - Back-propagation algorithm with controlled oscillation of weights

AU - Pirez, Yolanda M.

AU - Sarkar, Dilip

PY - 1993/1/1

Y1 - 1993/1/1

N2 - Error Back Propagation (EBP) is now the most widely used training algorithm for Feedforward Artificial Neural Networks (FFANNs). However, it converges slowly, if it converges at all, especially when the network size is not large relative to the problem at hand. In this work we propose simple but effective modifications to the EBP algorithm to improve its convergence. The methods proposed here watch for oscillation of weights as the training algorithm proceeds. When such an oscillation is observed, the learning rate for only that weight is temporarily reduced. We have found a few novel and effective methods for temporarily reducing the learning rate of an individual oscillating weight. To study the performance of our modified learning algorithms we have performed extensive simulations on a complex learning problem. The simulation results compare our modified algorithms favorably. Our modified EBP algorithms, like the EBP algorithm, monotonically reduce the sum of squared errors as the training proceeds.

AB - Error Back Propagation (EBP) is now the most widely used training algorithm for Feedforward Artificial Neural Networks (FFANNs). However, it converges slowly, if it converges at all, especially when the network size is not large relative to the problem at hand. In this work we propose simple but effective modifications to the EBP algorithm to improve its convergence. The methods proposed here watch for oscillation of weights as the training algorithm proceeds. When such an oscillation is observed, the learning rate for only that weight is temporarily reduced. We have found a few novel and effective methods for temporarily reducing the learning rate of an individual oscillating weight. To study the performance of our modified learning algorithms we have performed extensive simulations on a complex learning problem. The simulation results compare our modified algorithms favorably. Our modified EBP algorithms, like the EBP algorithm, monotonically reduce the sum of squared errors as the training proceeds.

UR - http://www.scopus.com/inward/record.url?scp=84943278019&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84943278019&partnerID=8YFLogxK

U2 - 10.1109/ICNN.1993.298537

DO - 10.1109/ICNN.1993.298537

M3 - Conference contribution

AN - SCOPUS:84943278019

VL - 1993-January

SP - 21

EP - 26

BT - 1993 IEEE International Conference on Neural Networks, ICNN 1993

PB - Institute of Electrical and Electronics Engineers Inc.

ER -