Path: sparky!uunet!pipex!unipalm!uknet!mcsun!corton!enst!ulysse!erato!mosnier
From: mosnier@erato.enst.fr (Francois Mosnier)
Newsgroups: comp.ai.neural-nets
Subject: back-prop versus signal-processing
Keywords: neural-nets - trigonometry - back-propagation
Message-ID: <2535@ulysse.enst.fr>
Date: 15 Sep 92 18:42:36 GMT
Sender: news@ulysse.enst.fr
Organization: Telecom Paris, France
Lines: 36

A friend of mine is trying to teach a neural network, via back-propagation, a
very easy problem: how to convert w (rotational speed) and v (axial speed)
into Delta-X and Delta-Y, the mean Delta-t being given.

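For concreteness, here is one guess at the mapping being learned (the exact
kinematics are not spelled out above, so take the whole thing as an
assumption): exact integration of unicycle-style motion over one time step,
sketched in Python.

import math

# Hypothetical target mapping (an assumption, not necessarily his setup):
# exact one-step integration of unicycle kinematics.
def displacement(v, w, dt):
    """Return (Delta-X, Delta-Y) in the frame at the start of the step,
    given axial speed v, rotational speed w and time step dt."""
    if abs(w) < 1e-9:                      # (almost) straight-line motion
        return v * dt, 0.0
    r = v / w                              # turning radius
    dtheta = w * dt                        # heading change over the step
    return r * math.sin(dtheta), r * (1.0 - math.cos(dtheta))

print(displacement(v=1.0, w=0.5, dt=0.1))  # roughly (0.09996, 0.00250)

If the mapping really looks like this, its trigonometric structure is what
makes sin() and cos() look tempting in the first place.
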
In fact he's running into (I hope it won't last) a lot of problems, and he's
wondering about the best activation functions. cos() and sin() could be a
natural(1) idea, but they do not work any better than tanh() - i.e. very badly.

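To make the experiment concrete, here is a minimal back-prop sketch
(everything in it is an assumption about his setup: one hidden layer of 8
units, squared error, batch gradient descent, and an illustrative stand-in
for the target, since the real data is not described above). The only place
the choice of activation enters is the pair act/dact:

import numpy as np

rng = np.random.default_rng(0)

# Pick the hidden-layer activation and its derivative here.
act, dact = np.tanh, lambda a: 1.0 - np.tanh(a) ** 2    # tanh
# act, dact = np.sin, np.cos                             # or sin

# Training data: inputs (v, w), targets (Delta-X, Delta-Y) for a fixed dt.
# This target is only an illustrative stand-in for the real mapping.
dt = 0.1
X = rng.uniform(-1.0, 1.0, size=(200, 2))
v, w = X[:, 0], X[:, 1]
T = np.column_stack([v * dt * np.cos(w * dt), v * dt * np.sin(w * dt)])

# One hidden layer, linear output layer.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 2)); b2 = np.zeros(2)

lr = 0.1
for epoch in range(2000):
    A = X @ W1 + b1                  # hidden pre-activations
    H = act(A)                       # hidden activations
    Y = H @ W2 + b2                  # linear outputs
    E = Y - T                        # output error
    # Back-propagate the squared-error gradient.
    dW2 = H.T @ E / len(X); db2 = E.mean(axis=0)
    dA = (E @ W2.T) * dact(A)
    dW1 = X.T @ dA / len(X); db1 = dA.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final mean squared error:", (E ** 2).mean())

Swapping tanh() for sin() changes only those two lines; in both cases the
output layer stays linear, so the net can produce arbitrary (Delta-X, Delta-Y)
values.
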
- Does anyone have ideas on this subject?

- Is there a good reason why a neural net with a single neuron and a
  linear activation function CANNOT learn addition? (See the sketch after
  these questions.)

- Are neural nets only good at sweeping the noise out of a signal?

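On the single-neuron question, here is a minimal sketch (assuming squared
error and plain stochastic gradient descent, none of which is specified
above) in which a single neuron with a linear activation does learn addition,
since y = w1*x1 + w2*x2 + b represents the sum exactly with w1 = w2 = 1 and
b = 0:

import random

# One linear neuron trained by the delta rule to output x1 + x2.
w1, w2, b = random.random(), random.random(), 0.0
lr = 0.01
for _ in range(5000):
    x1, x2 = random.uniform(-1, 1), random.uniform(-1, 1)
    y = w1 * x1 + w2 * x2 + b        # linear activation: output = net input
    err = y - (x1 + x2)              # target is the sum
    # gradient of 0.5 * err**2 with respect to each parameter
    w1 -= lr * err * x1
    w2 -= lr * err * x2
    b -= lr * err
print(w1, w2, b)                     # ends up very close to 1, 1, 0

If something like this fails in practice, the learning rate, the input
scaling or the implementation is a more likely culprit than the
representation itself.
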
Thanks for all contributions.

 ____
|\  /|
| \/ |-
|    |   Francois Mosnier


(1) for a non-specialist, as I am.


+------------------------------------o-------------------------------------+
| P-mail:                            | E-mail:                             |
| 255 Av Daumesnil - Appt 211        | mosnier@inf.enst.fr                 |
| 75012 Paris - France               | mosnier%enst.fr@uunet.uu.net        |
| (1) 43 47 10 57                    |                                     |
+------------------------------------o-------------------------------------+