- Newsgroups: comp.ai.neural-nets
- Path: sparky!uunet!utcsri!torn!csd.unb.ca!morgan.ucs.mun.ca!raghu
- From: raghu@morgan.ucs.mun.ca (Raghu B)
- Subject: BACK PROPAGATION - DOUBTS?
- Message-ID: <1992Jul26.201803.17242@morgan.ucs.mun.ca>
- Keywords: bP
- Organization: Memorial University of Newfoundland
- Date: Sun, 26 Jul 1992 20:18:03 GMT
- Lines: 28
-
- I am using the back-propagation algorithm to train my net. The
- weights are initialized to random values between -1.0 and 1.0. I am
- giving it a set of input data (say 20 patterns). After training, the
- network output almost matches the desired output whenever the
- target lies between -1 and 1, but where the target goes above 1
- the net fails to learn it and gives a large error.
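- (A sketch of the likely cause, assumed rather than taken from the
- post: a sigmoid/tanh output unit saturates in a bounded range, so a
- target above 1 is unreachable unless the targets are rescaled; all
- names and values below are illustrative.)

```python
import math

# A tanh output unit saturates in (-1, 1), so a target above 1
# can never be matched exactly, no matter how large the net input.
print(math.tanh(50.0))   # saturates at 1.0, never above

# Common workaround: linearly rescale the targets into [-1, 1]
# before training, and invert the scaling on the way back out.
targets = [0.5, 2.5, -3.0]
lo, hi = min(targets), max(targets)
scaled = [2.0 * (t - lo) / (hi - lo) - 1.0 for t in targets]

def unscale(s):
    return (s + 1.0) / 2.0 * (hi - lo) + lo
```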
-
- I have 3 input values and I want 3 output values, with 11 hidden
- layers. I am just using the basic back-propagation algorithm, and
- I am not giving any bias values; correct me if I am wrong. Does the
- bias input have any effect in shifting the results, or does it give
- faster convergence? And does the momentum parameter help in any
- way to reach convergence?
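- (An illustrative sketch of both ideas, under assumed names and
- values: the bias lets the activation shift left or right, and
- momentum reuses part of the previous weight change to speed up
- convergence. This is a single-unit toy, not the poster's net.)

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

w, b = 0.1, 0.0            # weight and bias (bias shifts the activation)
lr, mu = 0.5, 0.5          # learning rate and momentum (assumed values)
dw_prev, db_prev = 0.0, 0.0

x, target = 1.0, 0.8
for _ in range(500):
    y = sigmoid(w * x + b)
    err = target - y
    grad = err * y * (1.0 - y)          # delta rule with sigmoid derivative
    dw = lr * grad * x + mu * dw_prev   # momentum reuses the last step
    db = lr * grad + mu * db_prev       # bias is trained like a weight
    w, b = w + dw, b + db
    dw_prev, db_prev = dw, db
```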
-
- I hope you can understand my question; if not, do write back
- to me. I am not a hi-fi computer engineer, just a basic mechanical
- engineer, but I can understand most of the terms in the NN field.
-
- Expecting positive responses from dear netters.
-
- Raghu Balasubramanian
-
- raghu@engr.mun.ca