- Path: sparky!uunet!cs.utexas.edu!swrinde!zaphod.mps.ohio-state.edu!mips!darwin.sura.net!jvnc.net!yale.edu!ira.uka.de!chx400!ugun2b!ugun2a!osherson
- From: osherson@uni2a.unige.ch
- Newsgroups: comp.ai.neural-nets
- Subject: Re: What's in a layer ? (was: dumb question on layer enumeration)
- Keywords: layer counting, neuron, connection
- Message-ID: <1992Jul27.160527.1494@uni2a.unige.ch>
- Date: 27 Jul 92 14:05:27 GMT
- Reply-To: EFiesler@IDIAP.CH
- Followup-To: EFiesler@IDIAP.CH
- Organization: University of Geneva, Switzerland
- Lines: 87
-
- From: piccolbo@ghost.dsi.unimi.it (antonio piccolboni)
- >> osherson@uni2a.unige.ch writes:
-
- make that:
- >> EFiesler@IDIAP.CH writes:
- >>
- >>In general, there are three kinds of connections in layered neural networks:
- >>1. Interlayer connections: connecting neurons from adjacent layers,
- >>2. Intralayer connections: connecting neurons within the same layer
- >> (including self-connections), and
- >>3. Supralayer connections: connecting neurons from neither the same, nor
- >> adjacent layers; i.e. these connections "skip" at least one layer.
- >
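- (Purely as an illustration of this taxonomy, and not something from the quoted
- posting: a connection can be classified from the layer indices of its two
- endpoints alone. The helper name below is made up for the example.)
-
- def connection_kind(layer_a, layer_b):
-     # Classify a connection from a neuron in layer_a to one in layer_b,
-     # with layers numbered 0, 1, 2, ... from input to output.
-     gap = abs(layer_a - layer_b)
-     if gap == 0:
-         return "intralayer"   # same layer, including self-connections
-     if gap == 1:
-         return "interlayer"   # adjacent layers
-     return "supralayer"       # skips at least one layer
-
- assert connection_kind(1, 2) == "interlayer"
- assert connection_kind(1, 1) == "intralayer"
- assert connection_kind(0, 2) == "supralayer"
-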
- >I'd like to shift your attention to a more compelling problem (for me):
- >can we train a neural net with connections of the third type by means
- >of standard back-propagation?
-
- Yes. One can train supralayer connections in the same fashion as one trains
- interlayer connections: simply group together the connections that "end" in
- the same neuron, independent of the topology, as sketched below.
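-
- A minimal sketch of what this grouping means in practice (the topology,
- sigmoid activations, squared error, and learning rate below are assumptions
- made up for the example, not taken from any paper):
-
- import math
- import random
-
- # Hypothetical four-neuron example: neurons 0 and 1 are inputs, 2 is hidden,
- # 3 is the output.  The connection (0, 3) skips the hidden layer, i.e. it is
- # a supralayer connection; the others are ordinary interlayer connections.
- connections = [(0, 2), (1, 2), (2, 3), (0, 3)]
- weights = {c: random.uniform(-0.5, 0.5) for c in connections}
-
- def sigmoid(x):
-     return 1.0 / (1.0 + math.exp(-x))
-
- def forward(x0, x1):
-     # Visit non-input neurons in topological order; each neuron sums ALL of
-     # its incoming connections, no matter which layer they originate in.
-     a = {0: x0, 1: x1}
-     for j in (2, 3):
-         net = sum(w * a[i] for (i, k), w in weights.items() if k == j)
-         a[j] = sigmoid(net)
-     return a
-
- def gradients(a, target):
-     # Standard back-propagation with squared error: the gradient of any
-     # weight (i, j) is delta[j] * a[i], whether (i, j) is an interlayer or
-     # a supralayer connection.
-     delta = {3: (a[3] - target) * a[3] * (1.0 - a[3])}
-     delta[2] = weights[(2, 3)] * delta[3] * a[2] * (1.0 - a[2])
-     return {(i, j): delta[j] * a[i] for (i, j) in weights}
-
- # One gradient-descent step on a single (made-up) training pattern.
- a = forward(0.0, 1.0)
- for c, g in gradients(a, target=1.0).items():
-     weights[c] -= 0.5 * g
-
- As long as the connection graph is acyclic, visiting the neurons in
- topological order keeps both passes well-defined; the supralayer weight
- (0, 3) is updated by exactly the same rule as the interlayer weights.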
-
- >In our experience this works as long as the connection relation
- >((a, b) belongs to R iff the weight of the connection from a to b is
- >nonzero) is a directed acyclic graph.
- >Is there a formal proof of this? (I think it would consist in proving that
- >the partial derivatives of the error with respect to the connection weights
- >are spatially and temporally local.) Can we weaken the hypothesis, allowing
- >recurrent connections? I know there are particular results (see Frasconi,
- >Gori, Soda: Local Feedback Multilayered Networks, in Neural Computation,
- >vol. 4, 1992, pp. 120-130), but what I'm looking for are necessary and
- >sufficient conditions for the gradient descent algorithm to be spatially
- >and temporally local.
- >
- >>P.S. My paper entitled "Neural Network Formalization" in the neuroprose
- >> (anonymous ftp 128.146.8.52; pub/neuroprose/fiesler.formalization.ps.Z)
- >> relates to this.
- >
- >Does this book relate to my problem too? Where can I look?
-
- Although it is certainly a potential chapter for a neural network textbook,
- the current version is in the form of a paper / technical report. Since it
- deals with very fundamental matters, it is most likely relevant to almost
- everybody interested in neural networks.
- I apologize for the cryptic description of how to obtain the paper from
- the neuroprose archive. Try to follow these steps:
-
- unix> ftp archive.cis.ohio-state.edu (or: ftp 128.146.8.52)
- login: anonymous
- password: neuron
- ftp> cd pub/neuroprose
- ftp> binary
- ftp> get fiesler.formalization.ps.Z
- ftp> bye
- unix> zcat fiesler.formalization.ps.Z | lpr
-          (or however you uncompress and print PostScript)
-
- (Unfortunately, I will not be able to provide hard copies.)
-
- E. Fiesler
- IDIAP
- Case postale 609
- CH-1920 Martigny
- Switzerland / Suisse
- Tel.: +41-26-22-76-64
- Fax.: +41-26-22-78-18
- E-mail: EFiesler@IDIAP.CH (INTERNET)
-
-
-
- P.S. Because of the many requests, here are the references as well:
-
- The full, updated paper has been published as a technical report:
-
- E. Fiesler
- Neural Network Formalization
- IDIAP Technical Report 92-01
- IDIAP, Martigny, Switzerland
- July 1992.
-
- A short version of the paper will be published next month:
-
- E. Fiesler and H. J. Caulfield
- Layer Based Neural Network Formalization
- Artificial Neural Networks II
- Editors: Igor Aleksander and John G. Taylor
- Elsevier Science Publishers (North-Holland), Amsterdam, 1992.
-
- I should know the page numbers and other details by then.
-