- Path: sparky!uunet!spool.mu.edu!sdd.hp.com!cs.utexas.edu!sun-barr!ames!saimiri.primate.wisc.edu!usenet.coe.montana.edu!news.u.washington.edu!ogicse!das-news.harvard.edu!cantaloupe.srv.cs.cmu.edu!crabapple.srv.cs.cmu.edu!news
- From: sef@sef1.slisp.cs.cmu.edu
- Newsgroups: comp.ai.neural-nets
- Subject: Re: Questions about sigmoids etc.
- Message-ID: <Bz419p.MDL.1@cs.cmu.edu>
- Date: 11 Dec 92 19:39:19 GMT
- Article-I.D.: cs.Bz419p.MDL.1
- Sender: news@cs.cmu.edu (Usenet News System)
- Organization: School of Computer Science, Carnegie Mellon
- Lines: 25
- Nntp-Posting-Host: sef1.slisp.cs.cmu.edu
-
-
- From: crwth@Merlin.DoCS.UU.SE (Olle Gallmo)
-
- > For backpropagation networks (i.e., Rumelhart, McClelland, and Williams), it
- > is necessary to have a monotonically increasing, DIFFERENTIABLE function as
- > the output function.
-
- Differentiable, yes, but to my knowledge there is nothing in the algorithm that
- requires the transfer function to be monotonically increasing.
-
- Right. I frequently use Gaussian activation functions for hidden units in
- Cascor nets (and reported this in the original Cascor paper). If you use
- something more complex, such as Bessel functions, there's the problem of
- multiple minima or maxima where you can get hung up.
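- 
- To make that concrete, here is a minimal sketch in plain NumPy (not Cascor
- code; the network sizes and learning rate are arbitrary choices for
- illustration) of a single backprop step in which the hidden-unit transfer
- function is passed in as a parameter. The chain rule only ever asks for f and
- its derivative f'; monotonicity never enters, so the non-monotonic Gaussian is
- handled exactly like the sigmoid.
- 
-     import numpy as np
- 
-     def sigmoid(x):
-         return 1.0 / (1.0 + np.exp(-x))
- 
-     def sigmoid_prime(x):
-         s = sigmoid(x)
-         return s * (1.0 - s)
- 
-     def gaussian(x):                      # non-monotonic, peaks at x = 0
-         return np.exp(-x * x)
- 
-     def gaussian_prime(x):
-         return -2.0 * x * np.exp(-x * x)
- 
-     def backprop_step(x, t, W1, W2, f, fprime, lr=0.1):
-         """One squared-error gradient step for a 2-layer net (linear output)."""
-         h_in = W1 @ x                      # hidden pre-activations
-         h    = f(h_in)                     # hidden activations
-         y    = W2 @ h                      # linear output unit
-         err  = y - t
-         dW2  = np.outer(err, h)
-         dh   = (W2.T @ err) * fprime(h_in) # chain rule: only f' is needed
-         dW1  = np.outer(dh, x)
-         return W1 - lr * dW1, W2 - lr * dW2
- 
-     rng = np.random.default_rng(0)
-     x, t = rng.normal(size=3), np.array([1.0])
-     W1 = 0.1 * rng.normal(size=(4, 3))
-     W2 = 0.1 * rng.normal(size=(1, 4))
-     for f, fp in [(sigmoid, sigmoid_prime), (gaussian, gaussian_prime)]:
-         W1_new, W2_new = backprop_step(x, t, W1.copy(), W2.copy(), f, fp)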
-
- -- Scott
-
- ===========================================================================
- Scott E. Fahlman Internet: sef+@cs.cmu.edu
- Senior Research Scientist Phone: 412 268-2575
- School of Computer Science Fax: 412 681-5739
- Carnegie Mellon University Latitude: 40:26:33 N
- 5000 Forbes Avenue Longitude: 79:56:48 W
- Pittsburgh, PA 15213
- ===========================================================================
-