Are modified back-propagation algorithms worth the effort?

D. Alpsan, M. Towsey, Ozcan Ozdamar, A. Tsoi, D. N. Ghista

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

A wide range of modifications and extensions to the backpropagation (BP) algorithm have been tested on a real-world medical problem. Our results show that 1) proper tuning of learning parameters of standard BP not only increases the speed of learning (which is well known) but also has a significant effect on generalisation; 2) parameter combinations and training options which lead to fast learning do not usually yield good generalisation, and vice versa; 3) standard BP may be fast enough when its parameters are finely tuned; 4) modifications developed on artificial problems for faster learning do not necessarily give faster learning on real-world problems, and when they do, it may be at the expense of generalisation; 5) even when modified BP algorithms perform well, they may require extensive fine-tuning to achieve this performance. For our problem, none of the modifications could justify the effort to implement them.
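The "learning parameters of standard BP" the abstract refers to are typically the learning rate and momentum term of the gradient-descent weight update. A minimal sketch of that update is below; the tiny 2-2-1 sigmoid network and the XOR task are illustrative choices only, not the paper's medical data, and the function and variable names are hypothetical.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_xor(eta, alpha, epochs=2000, seed=0):
    """Standard BP with the two tunable parameters the paper studies:
    learning rate (eta) and momentum (alpha).
    Weight update: dw(t) = eta * gradient + alpha * dw(t-1)."""
    rng = random.Random(seed)
    # 2 inputs -> 2 hidden units -> 1 output; each unit has a bias (index 2)
    w_ih = [[rng.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
    w_ho = [rng.uniform(-0.5, 0.5) for _ in range(3)]
    dw_ih = [[0.0] * 3 for _ in range(2)]   # previous weight changes
    dw_ho = [0.0] * 3
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    for _ in range(epochs):
        for x, t in data:
            # forward pass
            h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_ih]
            y = sigmoid(w_ho[0] * h[0] + w_ho[1] * h[1] + w_ho[2])
            # backward pass: deltas use the sigmoid derivative y * (1 - y)
            d_o = (t - y) * y * (1 - y)
            d_h = [d_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
            # momentum update for hidden->output weights and bias
            for j in range(2):
                dw_ho[j] = eta * d_o * h[j] + alpha * dw_ho[j]
                w_ho[j] += dw_ho[j]
            dw_ho[2] = eta * d_o + alpha * dw_ho[2]
            w_ho[2] += dw_ho[2]
            # momentum update for input->hidden weights and biases
            for j in range(2):
                for i in range(2):
                    dw_ih[j][i] = eta * d_h[j] * x[i] + alpha * dw_ih[j][i]
                    w_ih[j][i] += dw_ih[j][i]
                dw_ih[j][2] = eta * d_h[j] + alpha * dw_ih[j][2]
                w_ih[j][2] += dw_ih[j][2]
    # report mean squared error after training
    err = 0.0
    for x, t in data:
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_ih]
        y = sigmoid(w_ho[0] * h[0] + w_ho[1] * h[1] + w_ho[2])
        err += (t - y) ** 2
    return err / len(data)
```

The paper's point is that the final error and training speed of even this plain update depend strongly on the (eta, alpha) pair chosen, e.g. comparing `train_xor(0.5, 0.5)` against `train_xor(0.05, 0.0)`.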

Original language: English
Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
Place of publication: Piscataway, NJ, United States
Publisher: IEEE
Pages: 567-571
Number of pages: 5
Volume: 1
State: Published - Dec 1 1994
Externally published: Yes
Event: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration: Jun 27 1994 - Jun 29 1994



ASJC Scopus subject areas

  • Software

Cite this

Alpsan, D., Towsey, M., Ozdamar, O., Tsoi, A., & Ghista, D. N. (1994). Are modified back-propagation algorithms worth the effort? In IEEE International Conference on Neural Networks - Conference Proceedings (Vol. 1, pp. 567-571). Piscataway, NJ, United States: IEEE.
