Are modified back-propagation algorithms worth the effort?

D. Alpsan, M. Towsey, Ozcan Ozdamar, A. Tsoi, D. N. Ghista

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



A wide range of modifications and extensions to the backpropagation (BP) algorithm have been tested on a real-world medical problem. Our results show that 1) proper tuning of the learning parameters of standard BP not only increases the speed of learning (which is well known) but also has a significant effect on generalisation; 2) parameter combinations and training options that lead to fast learning do not usually yield good generalisation, and vice versa; 3) standard BP may be fast enough when its parameters are finely tuned; 4) modifications developed on artificial problems for faster learning do not necessarily give faster learning on real-world problems, and when they do, it may be at the expense of generalisation; 5) even when modified BP algorithms perform well, they may require extensive fine-tuning to achieve this performance. For our problem, none of the modifications could justify the effort of implementing them.
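To make the "learning parameters" in finding 1) concrete: standard BP is typically tuned via the learning rate (eta) and momentum (alpha). The sketch below is a minimal, hypothetical illustration in pure Python, training a 2-2-1 sigmoid network on XOR as a stand-in toy task; the paper's actual medical task, architectures, and parameter values are not reproduced here.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR as a stand-in toy task; the paper's real task was a medical
# classification problem not reproduced here.
DATA = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def train(eta=0.3, alpha=0.8, epochs=4000, seed=0):
    """On-line standard BP for a 2-2-1 net. eta (learning rate) and
    alpha (momentum) are the tunable 'learning parameters'."""
    rng = random.Random(seed)
    # w1[j] = [w_from_x0, w_from_x1, bias] for hidden unit j
    w1 = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    w2 = [rng.uniform(-1, 1) for _ in range(3)]   # output weights + bias
    dw1 = [[0.0] * 3 for _ in range(2)]           # previous updates
    dw2 = [0.0] * 3                               # (for momentum)
    for _ in range(epochs):
        for x, t in DATA:
            # forward pass
            h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
            y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + w2[2])
            # backward pass (sigmoid derivative is y * (1 - y))
            d_out = (t - y) * y * (1 - y)
            d_hid = [d_out * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
            # weight updates: delta_w = eta * grad + alpha * previous delta_w
            for j, g in enumerate(h + [1.0]):
                dw2[j] = eta * d_out * g + alpha * dw2[j]
                w2[j] += dw2[j]
            for j in range(2):
                for i, g in enumerate(x + [1.0]):
                    dw1[j][i] = eta * d_hid[j] * g + alpha * dw1[j][i]
                    w1[j][i] += dw1[j][i]
    # mean squared error over the training set after training
    err = 0.0
    for x, t in DATA:
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
        y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + w2[2])
        err += (t - y) ** 2
    return err / len(DATA)
```

Re-running `train` over a grid of `eta` and `alpha` values is the kind of fine-tuning the abstract refers to: some combinations learn quickly but, on real data, fast-learning settings need not generalise best.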

Original language: English
Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
Place of publication: Piscataway, NJ, United States
Number of pages: 5
State: Published - Dec 1 1994
Externally published: Yes
Event: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration: Jun 27 1994 - Jun 29 1994


ASJC Scopus subject areas

  • Software


