Newsgroups: comp.ai.neural-nets
Path: sparky!uunet!usc!rpi!ghost.dsi.unimi.it!piccolbo
From: piccolbo@ghost.dsi.unimi.it (antonio piccolboni)
Subject: Re: What's in a layer ? (was: dumb question on layer enumeration)
Keywords: layer counting, neuron, connection
References: <1992Jul16.161325.1477@uni2a.unige.ch>
Organization: Computer Science Dep. - Milan University
Date: Fri, 24 Jul 1992 13:10:35 GMT
Message-ID: <1992Jul24.131035.21406@ghost.dsi.unimi.it>
Lines: 36

osherson@uni2a.unige.ch writes:

>In general, there are three kinds of connections in layered neural networks:

>1. Interlayer connections: connecting neurons from adjacent layers,
>2. Intralayer connections: connecting neurons within the same layer
>   (including self-connections), and
>3. Supralayer connections: connecting neurons from neither the same, nor
>   adjacent layers; i.e. these connections "skip" at least one layer.

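(Restating the quoted taxonomy in code, purely as an illustration of my
own: the three kinds reduce to a test on the layer indices of the two
endpoints.)

def classify(layer_a, layer_b):
    # Classify the connection a -> b by its endpoints' layer indices.
    gap = abs(layer_a - layer_b)
    if gap == 0:
        return "intralayer"   # same layer, including self-connections
    elif gap == 1:
        return "interlayer"   # adjacent layers
    else:
        return "supralayer"   # skips at least one layer
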
I'd like to shift your attention to a problem that is (for me) more
compelling: can we train a neural net with connections of the third type
by means of standard back-propagation? In our experience it works as long
as the connection relation ((a,b) belongs to R iff the weight of the
connection from a to b is nonzero) is a directed acyclic graph.
Is there a formal proof of this? (I think it would consist in proving
that the partial derivatives of the error with respect to the connection
weights are spatially and temporally local.) Can we weaken the hypothesis
to allow recurrent connections? I know there are results for particular
cases (see Frasconi, Gori, Soda: Local feedback multilayered networks,
Neural Computation, vol. 4, 1992, pp. 120-130), but what I'm looking for
are necessary and sufficient conditions for the gradient descent algorithm
to be spatially and temporally local.

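To make the locality claim concrete, here is a toy sketch (entirely my
own invention; the topology, the sigmoid units and the squared-error loss
are chosen only for illustration): back-propagation over a small DAG that
includes a supralayer connection, where each weight update reads nothing
but the activation of its source and the delta of its target.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Connection relation R as a weight table: (a, b) -> w means a feeds b.
# "nodes" is a topological order of R, so every predecessor of a unit
# is computed before the unit itself.
nodes = ["x", "h", "y"]                  # input, hidden, output
weights = {("x", "h"): 0.5,              # interlayer connection
           ("h", "y"): -0.3,             # interlayer connection
           ("x", "y"): 0.8}              # supralayer (skip) connection

def forward(x_in):
    act = {"x": x_in}
    for b in nodes[1:]:                  # visit in topological order
        net = sum(w * act[a] for (a, bb), w in weights.items() if bb == b)
        act[b] = sigmoid(net)
    return act

def backward(act, target, lr=0.5):
    # delta[b] = dE/dnet_b, computed in reverse topological order.
    delta = {}
    for b in reversed(nodes[1:]):
        out = act[b]
        if b == "y":                     # output unit, squared error
            err = out - target
        else:                            # gather deltas from successors
            err = sum(delta[bb] * w for (a, bb), w in weights.items() if a == b)
        delta[b] = err * out * (1.0 - out)
    # Each update touches only act[a] and delta[b]: the locality property.
    for (a, b), w in weights.items():
        weights[(a, b)] = w - lr * delta[b] * act[a]

for step in range(2000):                 # learn to map input 1.0 to 0.2
    backward(forward(1.0), 0.2)
print(forward(1.0)["y"])                 # ends up close to 0.2

The skip connection (x, y) is trained by exactly the same update rule as
the ordinary ones; acyclicity is what guarantees the two sweeps above are
well defined.
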
>P.S. My paper entitled "Neural Network Formalization" in the neuroprose
>     (anonymous ftp 128.146.8.52; pub/neuroprose/fiesler.formalization.ps.Z)
>     relates to this.

Does this paper relate to my problem too? Where can I look?
Thank you in advance (a promise of a summary and apologies for my English
go without saying).

Antonio Piccolboni