- Newsgroups: comp.ai.neural-nets
- Path: sparky!uunet!math.fu-berlin.de!news.uni-stuttgart.de!rz.uni-karlsruhe.de!stepsun.uni-kl.de!uklirb!koblenz!gummert
- From: gummert@koblenz.informatik.uni-kl.de (Jochem Gummert - Hiwi Ritter)
- Subject: Question: Force a constraint into a NN
- Message-ID: <gummert.715605545@koblenz>
- Sender: news@uklirb.informatik.uni-kl.de (Unix-News-System)
- Nntp-Posting-Host: koblenz.informatik.uni-kl.de
- Organization: University of Kaiserslautern, Germany
- Date: Fri, 4 Sep 1992 11:19:05 GMT
- Lines: 22
-
-
- I want to design a NN that satisfies the following constraint (for any input):
-
- The number of activated neurons in the input layer equals
- the number of activated neurons in the output layer.
-
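- For concreteness, the constraint can be checked like this (a minimal Python/NumPy
- sketch; counting a neuron as "activated" when its output exceeds 0.5 is an
- assumption, since no threshold is stated here):
-
-   import numpy as np
-
-   def count_activated(activations, threshold=0.5):
-       # Number of neurons whose output exceeds the (assumed) threshold.
-       return int(np.sum(np.asarray(activations) > threshold))
-
-   def satisfies_constraint(input_acts, output_acts, threshold=0.5):
-       # Constraint: as many activated output neurons as activated input neurons.
-       return (count_activated(input_acts, threshold)
-               == count_activated(output_acts, threshold))
-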
-
- I tried an ordinary backprop NN (15 input neurons, 15 output neurons, 15 hidden
- neurons). The net was trained with about 30 patterns (which, of course, satisfied
- the constraint).
- Unfortunately, but not surprisingly, not all of the test data behaved correctly
- with respect to the constraint: some inputs activated too many output neurons,
- some too few.
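-
- For reference, a minimal sketch of that setup (plain backprop on squared error
- in Python/NumPy; the initialization, learning rate, and identity-mapping targets
- are placeholder assumptions, since the actual training patterns are not given
- here):
-
-   import numpy as np
-
-   rng = np.random.default_rng(0)
-   n_in, n_hid, n_out = 15, 15, 15            # sizes as described above
-
-   # Small random weights; the actual initialization is an assumption.
-   W1 = rng.normal(0.0, 0.1, (n_hid, n_in)); b1 = np.zeros(n_hid)
-   W2 = rng.normal(0.0, 0.1, (n_out, n_hid)); b2 = np.zeros(n_out)
-
-   def sigmoid(z):
-       return 1.0 / (1.0 + np.exp(-z))
-
-   def forward(x):
-       h = sigmoid(W1 @ x + b1)               # hidden layer
-       y = sigmoid(W2 @ h + b2)               # output layer
-       return h, y
-
-   def train_step(x, t, lr=0.5):
-       # One ordinary backprop step on squared error for a single pattern.
-       global W1, b1, W2, b2
-       h, y = forward(x)
-       delta_out = (y - t) * y * (1.0 - y)    # output deltas (sigmoid derivative)
-       delta_hid = (W2.T @ delta_out) * h * (1.0 - h)
-       W2 -= lr * np.outer(delta_out, h); b2 -= lr * delta_out
-       W1 -= lr * np.outer(delta_hid, x); b1 -= lr * delta_hid
-
-   def make_pattern():
-       # Placeholder training pair that satisfies the constraint: the target
-       # is simply the input, so both layers activate the same number of units.
-       x = rng.integers(0, 2, n_in).astype(float)
-       return x, x.copy()
-
-   patterns = [make_pattern() for _ in range(30)]   # "about 30 patterns"
-
-   for epoch in range(1000):
-       for x, t in patterns:
-           train_step(x, t)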
-
- How, if at all, can I force the net to satisfy this constraint?
-
- Any ideas, hints, or references will be greatly appreciated.
- I'll post a summary of the e-mail replies I receive.
-
- Many thanks in advance.
-
- Jochen Gummert (gummert@informatik.uni-kl.de)
-