- Newsgroups: comp.ai.neural-nets
- Path: sparky!uunet!spool.mu.edu!yale.edu!ira.uka.de!Germany.EU.net!infko!inews
- From: evol@infko.uni-koblenz.de
- Subject: Technical Report on many bp speed up techniques available
- Message-ID: <1993Jan07.061015.9222@infko.uucp>
- Sender: inews@infko.uucp (inews)
- Organization: University of Koblenz, Germany
- Date: Thu, 07 Jan 1993 06:10:15 GMT
- Lines: 40
-
- Our new Technical Report "Optimization of the Backpropagation Algorithm
- for Training Multilayer Perceptrons" is now available via ftp:
- 
- ftp archive.cis.ohio-state.edu or ftp 128.146.8.52
- cd pub/neuroprose
- binary
- get schiff.bp_speedup.ps.Z
- quit
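- 
- The file is compressed PostScript; on a typical Unix system it can be
- unpacked (and printed) after the transfer with, e.g.:
- 
-     uncompress schiff.bp_speedup.ps.Z
-     lpr schiff.bp_speedup.ps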
-
- The report is an overview of many different backprop speedup techniques.
- Fifteen algorithms are described in detail and compared on a large,
- very hard practical data set. Learning speed and the network's
- classification performance on both the training set and a separate
- test set are discussed. The tested algorithms are listed below; a
- rough code sketch of one of the update rules follows the list:
-
- backprop
- backprop (batch mode)
- backprop + learning rate calculated by Eaton and Oliver's formula
- backprop + decreasing learning rate (Darken and Moody)
- backprop + learning rate adaptation for each training pattern (J. Schmidhuber)
- backprop + evolutionary learning rate adaptation (R. Salomon)
- backprop + angle-driven learning rate adaptation (Chan and Fallside)
- Polak-Ribiere + line search (Kramer and Vincentelli)
- conjugate gradient + line search (Leonard and Kramer)
- backprop + learning rate adaptation by sign changes (Silva and Almeida)
- SuperSAB (T. Tollenaere)
- Delta-Bar-Delta (Jacobs)
- RPROP (Riedmiller and Braun)
- Quickprop (Fahlman)
- Cascade correlation (Fahlman)
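- 
- As a taste of how the sign-based methods differ from plain backprop
- (which updates each weight by w := w - learning_rate * gradient), here
- is a minimal sketch of one RPROP update step, the simple "minus"
- variant, in Python/NumPy. It is only an illustration written for this
- announcement, not code from the report; the function name and the
- parameter defaults (eta_plus, eta_minus and the step bounds) are
- common published values, not necessarily the ones tested there.
- 
-     import numpy as np
- 
-     def rprop_update(w, grad, prev_grad, step,
-                      eta_plus=1.2, eta_minus=0.5,
-                      step_min=1e-6, step_max=50.0):
-         # Per-weight update: only the SIGN of the gradient is used;
-         # every weight carries its own adaptive step size.
-         sign_change = grad * prev_grad
-         # Same sign as last iteration: direction is stable, grow the step.
-         step = np.where(sign_change > 0,
-                         np.minimum(step * eta_plus, step_max), step)
-         # Sign flipped: a minimum was overshot, shrink the step ...
-         step = np.where(sign_change < 0,
-                         np.maximum(step * eta_minus, step_min), step)
-         # ... and skip the update for those weights (sign(0) == 0).
-         grad = np.where(sign_change < 0, 0.0, grad)
-         w = w - np.sign(grad) * step
-         return w, grad, step
- 
- Because only the gradient sign enters the update, the step size is not
- disturbed by the badly scaled gradient magnitudes that slow plain
- backprop down on hard data sets.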
- 
- Randolf Werner