Path: sparky!uunet!dtix!darwin.sura.net!ua1ix!aramis.cs.ua.edu!eduardo
From: eduardo@aramis.cs.ua.edu (Eduardo Kortright)
Newsgroups: comp.ai.neural-nets
Subject: Non-binary output Back Prop
Message-ID: <1992Jul31.180618.129705@ua1ix.ua.edu>
Date: 31 Jul 92 18:06:18 GMT
Sender: eduardo@cs.ua.edu (Eduardo Kortright)
Organization: Department of Computer Science, University of Alabama
Lines: 22
Nntp-Posting-Host: aramis.cs.ua.edu

I am currently trying to implement a BPN with outputs that do not
necessarily lie between 0 and 1. As the output function I use the
straight weighted sum of the inputs instead of passing the sum through
a sigmoid, and when backpropagating the error I use the derivative of
that output function (= 1) instead of the derivative of the sigmoid.
My problem is that unless the learning rate is extremely small,
everything blows up (the numbers get so large that I get floating-point
overflows), and with a rate small enough to stay stable the network
takes forever to converge.
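
To make the setup concrete, here is a stripped-down sketch of the
arrangement (in Python/NumPy, purely for illustration; this is not my
actual code, and the single hidden layer, toy data, and learning rate
are all made up):

import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 2, 5, 1
W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))    # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))   # hidden -> output weights
b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data with targets well outside [0, 1].
X = rng.uniform(-1.0, 1.0, size=(200, n_in))
T = 10.0 * X[:, :1] - 5.0 * X[:, 1:] + 3.0

lr = 0.01   # with the linear output, a much larger rate makes the weights diverge
for epoch in range(200):
    for x, t in zip(X, T):
        # forward pass
        h = sigmoid(W1 @ x + b1)     # hidden activations
        y = W2 @ h + b2              # output is the plain weighted sum, no sigmoid

        # backward pass
        delta_out = y - t                               # identity output, derivative = 1
        delta_hid = (W2.T @ delta_out) * h * (1.0 - h)  # sigmoid derivative for hidden layer

        # gradient-descent weight updates
        W2 -= lr * np.outer(delta_out, h)
        b2 -= lr * delta_out
        W1 -= lr * np.outer(delta_hid, x)
        b1 -= lr * delta_hid

Even in this toy case the updates only stay bounded because lr is
small: since the linear output is unbounded, the output-layer error,
and hence the size of the weight changes, can be arbitrarily large.
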
Is there anyone out there who has experience with this? I have
temporarily gone back to using the sigmoid function and scaling the
outputs to the proper range (roughly the workaround sketched below).
If this question has been asked before or is not of interest to the
group, please feel free to e-mail:

eduardo@cs.ua.edu
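
In case it is unclear, the scaling I mean is just a linear map into
the sigmoid's (0, 1) range and back again (the bounds below are made
up; in practice they would come from the training targets):

t_min, t_max = -12.0, 18.0    # assumed bounds of the raw targets

def scale_target(t):
    # squash a raw target into (0, 1) so a sigmoid output unit can reach it
    return (t - t_min) / (t_max - t_min)

def unscale_output(y):
    # map the sigmoid output back to the original target range
    return y * (t_max - t_min) + t_min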

Thanks in advance.

--
=-= Eduardo =-= eduardo@cs.ua.edu =-=
Eduardo Kortright // Grad student, University of Alabama at Tuscaloosa