Path: sparky!uunet!cis.ohio-state.edu!pacific.mps.ohio-state.edu!linac!uwm.edu!caen!sol.ctr.columbia.edu!ira.uka.de!uka!prechelt
From: prechelt@i41s14.ira.uka.de (Lutz Prechelt)
Newsgroups: comp.ai.neural-nets
Subject: changes to "FAQ in comp.ai.neural-nets" -- monthly posting
Supersedes: <nn.changes.posting_704427486@i41s14.ira.uka.de>
Followup-To: comp.ai.neural-nets
Date: 28 Aug 1992 02:20:32 GMT
Organization: University of Karlsruhe, Germany
Lines: 182
Expires: 2 Oct 1992 02:18:13 GMT
Message-ID: <nn.changes.posting_714968293@i41s14.ira.uka.de>
Reply-To: prechelt@ira.uka.de (Lutz Prechelt)
NNTP-Posting-Host: i41s14.ira.uka.de
Keywords: modifications, new, additions, deletions
Originator: prechelt@i41s14

*** nn.oldbody Tue Jul 28 04:18:07 1992
--- nn.body Fri Aug 7 16:37:07 1992
***************
*** 1,5 ****

  Archive-name: neural-net-faq
! Last-modified: 92/07/13

  (FAQ means "Frequently Asked Questions")
--- 1,5 ----

  Archive-name: neural-net-faq
! Last-modified: 92/08/07

  (FAQ means "Frequently Asked Questions")
***************
*** 128,132 ****
  Such a summary should be announced in the original posting of the question
  or request with a phrase like
! "Please email, I'll summarize"

  In such a case answers should NOT be posted to the newsgroup but instead
--- 128,132 ----
  Such a summary should be announced in the original posting of the question
  or request with a phrase like
! "Please answer by email, I'll summarize"

  In such a case answers should NOT be posted to the newsgroup but instead
***************
*** 201,212 ****
  and weights; only local information; highly parallel operation ]

  A vague description is as follows:
- A NN is a network of many very simple processors ("units"), each with a
- (small amount of) local memory, who are connected by unidirectional
- communication channels ("connections"), which carry numeric (as
- opposed to symbolic) data.
- The units operate only on their local data and on the inputs
- they receive via the connections.

  The design motivation is what distinguishes neural networks from other
  mathematical techniques:
--- 201,219 ----
  and weights; only local information; highly parallel operation ]

+ First of all, when we are talking about a neural network, we should
+ usually rather say "artificial neural network" (ANN), because that is
+ what we mean most of the time. Biological neural networks are much
+ more complicated in their elementary structures than the mathematical
+ models we use for ANNs.
+
  A vague description is as follows:

+ An ANN is a network of many very simple processors ("units"), each
+ possibly having a (small amount of) local memory. The units are
+ connected by unidirectional communication channels ("connections"),
+ which carry numeric (as opposed to symbolic) data. The units operate
+ only on their local data and on the inputs they receive via the
+ connections.
+
  The design motivation is what distinguishes neural networks from other
  mathematical techniques:
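
To make the description above concrete, here is a toy rendering in C.
It is only a sketch of the FAQ's definition; the names (Unit, update,
MAX_IN) are illustrative and come from no particular simulator:

    /* One "unit": a little local memory plus incoming unidirectional
       connections that carry plain numbers. */
    #include <stdio.h>

    #define MAX_IN 8

    typedef struct Unit Unit;
    struct Unit {
        double activation;      /* the unit's local memory */
        double weight[MAX_IN];  /* local data: one weight per connection */
        Unit  *in[MAX_IN];      /* unidirectional incoming connections */
        int    n_in;
    };

    /* A unit computes only from its local data and from the numbers
       arriving on its connections -- nothing else. */
    static void update(Unit *u)
    {
        double sum = 0.0;
        int i;
        for (i = 0; i < u->n_in; i++)
            sum += u->weight[i] * u->in[i]->activation;
        u->activation = sum;    /* a real unit would also squash this */
    }

    int main(void)
    {
        Unit a = {1.0}, b = {0.5}, c = {0.0};
        c.in[0] = &a;  c.weight[0] =  0.3;
        c.in[1] = &b;  c.weight[1] = -0.2;
        c.n_in  = 2;
        update(&c);
        printf("c = %g\n", c.activation);  /* 0.3*1.0 + (-0.2)*0.5 = 0.2 */
        return 0;
    }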
***************
*** 286,295 ****
  in layers) feedforward (i.e., the arcs joining nodes are
  unidirectional, and there are no cycles) nets.
- Back-propagation assumes knowledge of the right answer from a teacher and
- uses gradient descent on the error (provided by the teacher) to train the
- weights.
- The activation function is (usually) a sigmoidal (i.e., bounded above and
- below, but differentiable) function of a weighted sum of the nodes inputs.

  The use of a gradient descent algorithm to train its weights makes it
  slow to train; but being a feedforward algorithm, it is quite rapid during
--- 293,303 ----
  in layers) feedforward (i.e., the arcs joining nodes are
  unidirectional, and there are no cycles) nets.

+ Back-propagation needs a teacher that knows the correct output for any
+ input ("supervised learning") and uses gradient descent on the error
+ (as provided by the teacher) to train the weights. The activation
+ function is (usually) a sigmoidal (i.e., bounded above and below, but
+ differentiable) function of a weighted sum of the node's inputs.
+
  The use of a gradient descent algorithm to train its weights makes it
  slow to train; but being a feedforward algorithm, it is quite rapid during
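
As an illustration of the scheme just described, here is a minimal
back-propagation program in C: a 2-2-1 feedforward net learns XOR from
a teacher. Everything here (names, start weights, learning rate, epoch
count) is an arbitrary toy choice, not a recommendation from the FAQ;
compile with something like "cc bp_toy.c -lm":

    #include <stdio.h>
    #include <math.h>

    /* sigmoidal activation: bounded above and below, differentiable */
    static double sig(double x) { return 1.0 / (1.0 + exp(-x)); }

    int main(void)
    {
        double x[4][2]  = {{0,0},{0,1},{1,0},{1,1}};
        double t[4]     = { 0,    1,    1,    0  };   /* the teacher */
        double w1[2][3] = {{.5,-.4,.1},{-.3,.6,-.2}}; /* hidden w + bias */
        double w2[3]    = {.3,-.5,.2};                /* output w + bias */
        double eta = 0.5;                             /* learning rate */
        int e, p, i, j;

        for (e = 0; e < 20000; e++)
            for (p = 0; p < 4; p++) {
                double h[2], o, d_o, d_h[2];
                /* forward pass: sigmoid of weighted sums */
                for (j = 0; j < 2; j++)
                    h[j] = sig(w1[j][0]*x[p][0] + w1[j][1]*x[p][1]
                               + w1[j][2]);
                o = sig(w2[0]*h[0] + w2[1]*h[1] + w2[2]);
                /* backward pass: gradient of the squared error */
                d_o = (t[p] - o) * o * (1 - o);
                for (j = 0; j < 2; j++)
                    d_h[j] = d_o * w2[j] * h[j] * (1 - h[j]);
                /* gradient descent on the weights */
                for (j = 0; j < 2; j++) w2[j] += eta * d_o * h[j];
                w2[2] += eta * d_o;
                for (j = 0; j < 2; j++) {
                    for (i = 0; i < 2; i++)
                        w1[j][i] += eta * d_h[j] * x[p][i];
                    w1[j][2] += eta * d_h[j];
                }
            }
        for (p = 0; p < 4; p++) {        /* recall is a fast fwd pass */
            double h0 = sig(w1[0][0]*x[p][0] + w1[0][1]*x[p][1] + w1[0][2]);
            double h1 = sig(w1[1][0]*x[p][0] + w1[1][1]*x[p][1] + w1[1][2]);
            printf("%g XOR %g -> %.3f\n", x[p][0], x[p][1],
                   sig(w2[0]*h0 + w2[1]*h1 + w2[2]));
        }
        return 0;    /* toy settings; convergence is typical here,
                        but not guaranteed for every starting point */
    }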
***************
*** 301,305 ****
  Microstructure of Cognition (volume 1, pp 318-362).
  The MIT Press.
! (this is the classic one) or one of the some thousand other books
  or articles on backpropagation :->

--- 309,313 ----
  Microstructure of Cognition (volume 1, pp 318-362).
  The MIT Press.
! (this is the classic one) or one of the dozens of other books
  or articles on backpropagation :->

***************
*** 458,461 ****
--- 466,479 ----
  understand".

+ McClelland, J. L. and Rumelhart, D. E. (1988).
+ Explorations in Parallel Distributed Processing: Computational Models of
+ Cognition and Perception (software manual). The MIT Press.
+ Comments: "Written in a tutorial style, and includes 2 diskettes of NN
+ simulation programs that can be compiled on MS-DOS or Unix (and they do
+ too !)"; "The programs are pretty reasonable as an introduction to some
+ of the things that NNs can do."; "There are *two* editions of this book.
+ One comes with disks for the IBM PC, the other comes with disks for the
+ Macintosh".
+
  McCord Nelson, M. and Illingworth, W.T. (1990). A Practical Guide to Neural
  Nets. Addison-Wesley Publishing Company, Inc. (ISBN 0-201-52376-0).
***************
*** 560,573 ****
  applications implementation".

- McClelland, J. L. and Rumelhart, D. E. (1988).
- Explorations in Parallel Distributed Processing: Computational Models of
- Cognition and Perception (software manual). The MIT Press.
- Comments: "Written in a tutorial style, and includes 2 diskettes of NN
- simulation programs that can be compiled on MS-DOS or Unix (and they do
- too !)"; "The programs are pretty reasonable as an introduction to some
- of the things that nns can do."; "There are *two* editions of this book.
- One comes with disks for the IBM PC, the other comes with disks for the
- Macintosh".
-
  Pao, Y. H. (1989). Adaptive Pattern Recognition and Neural Networks
  Addison-Wesley Publishing Company, Inc. (ISBN 0-201-12584-6)
--- 578,581 ----
***************
*** 973,976 ****
--- 981,987 ----
  2. Usenet groups comp.ai.neural-nets (Oha ! :-> )
     and comp.theory.self-org-sys
+    There is a periodic posting on comp.ai.neural-nets sent by
+    srctran@world.std.com (Gregory Aharonian) about Neural Network
+    patents.

  3. Central Neural System Electronic Bulletin Board
***************
*** 1016,1020 ****
  (source code:) rcs_v4.2.justsrc.tar.Z (1.4 MB)

-
  2. UCLA-SFINX
  ftp 131.179.16.6 (retina.cs.ucla.edu)
--- 1027,1030 ----
***************
*** 1158,1162 ****
  Models, Programs, and Exercises" by McClelland and Rumelhart.
  MIT Press, 1988.
! This book is often referred to as PDP vol III which is a very
  misleading practice! The book comes with software on an IBM disk but
  includes a makefile for compiling on UNIX systems. The version of
--- 1168,1172 ----
  Models, Programs, and Exercises" by McClelland and Rumelhart.
  MIT Press, 1988.
! Comment: "This book is often referred to as PDP vol III which is a very
  misleading practice! The book comes with software on an IBM disk but
  includes a makefile for compiling on UNIX systems. The version of
***************
*** 1163,1167 ****
  PDP available at nic.funet.fi seems identical to the one with the book
  except for a bug in bp.c which occurs when you try to run a script of
! PDP commands using the DO command. This can be found and fixed easily.

  16. Xerion
--- 1173,1177 ----
  PDP available at nic.funet.fi seems identical to the one with the book
  except for a bug in bp.c which occurs when you try to run a script of
! PDP commands using the DO command. This can be found and fixed easily."

  16. Xerion
--
Lutz Prechelt (email: prechelt@ira.uka.de)               | Whenever you
Institut fuer Programmstrukturen und Datenorganisation   | complicate things,
Universitaet Karlsruhe; D-7500 Karlsruhe 1; Germany      | they get
(Voice: ++49/721/608-4317, FAX: ++49/721/694092)         | less simple.