- Path: sparky!uunet!dtix!relay!afterlife!hcbarth
- From: hcbarth@afterlife.ncsc.mil (Bart Bartholomew)
- Newsgroups: comp.ai.neural-nets
- Subject: BEP question
- Message-ID: <1992Sep7.064241.10764@afterlife.ncsc.mil>
- Date: 7 Sep 92 06:42:41 GMT
- Organization: The Great Beyond
- Lines: 27
-
-
- I have been pondering a problem in NNs that I think should
- work, but I don't know how to implement it. While I will describe
- a specific problem, the technique (if there is one) should have
- a fairly wide use.
- Everyone is just born knowing that a sine wave has mean zero
- and a fixed variance (one-half the squared amplitude). If gaussian
- noise is added, the variance goes up. Some people use the variance
- as a measure of the signal-to-noise ratio.
- Seems like we should be able to do an epochal forward pass
- in which we accumulate the output points, compute the variance,
- develop an error term from the putative norm, and backpropagate that.
- One can, of course, see other such error terms. In effect,
- we have some black box in the forward pass to develop the error for
- the backward pass.
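- A minimal sketch of what such a black box might look like,
- assuming NumPy; the target variance and the collected outputs
- are hypothetical stand-ins for whatever the network produces
- over an epoch. The epochal error is the squared deviation of the
- output variance from the putative norm, and its gradient with
- respect to each output point (which is what would be fed to the
- backward pass) follows from the chain rule:

```python
import numpy as np

def variance_error(outputs, target_var=1.0):
    """Epochal error term: squared deviation of the output variance
    from a putative norm, plus its gradient with respect to each
    accumulated output point.

    outputs: 1-D array of network outputs collected over an epoch.
    Returns (error, grad); grad can be backpropagated in place of
    the usual per-pattern error signal.
    """
    y = np.asarray(outputs, dtype=float)
    n = y.size
    mean = y.mean()
    var = ((y - mean) ** 2).mean()     # biased (1/N) variance
    err = (var - target_var) ** 2      # scalar epochal error
    # d var / d y_i = 2 (y_i - mean) / n  (the mean terms cancel),
    # so d err / d y_i = 2 (var - target_var) * 2 (y_i - mean) / n
    grad = 2.0 * (var - target_var) * 2.0 * (y - mean) / n
    return err, grad
```

- Note that this particular error term *is* differentiable in the
- outputs, which bears on the objection raised below.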
- The mathematicians (I am most assuredly *not*) will probably
- object if the error term is not differentiable. My wetware feeling
- (notoriously unreliable) is that if a given function is not nicely
- differentiable, there is some reasonable transformation of it that is.
- At this point we come to the inevitable clash with the
- nuts-and-bolts of the problem: deciding how to actually
- implement something like this.
- Your comments please.
- Bart
- --
- If there's one thing I just can't stand, it's intolerance.
- *No One* is responsible for my views, I'm a committee. Please do not
- infer that which I do not imply. hcbarth@afterlife.ncsc.mil
-