Newsgroups: comp.ai.neural-nets
Path: sparky!uunet!destroyer!ubc-cs!alberta!arms
From: arms@cs.UAlberta.CA (Bill Armstrong)
Subject: Re: Perceptrons
Message-ID: <arms.713731475@spedden>
Sender: news@cs.UAlberta.CA (News Administrator)
Nntp-Posting-Host: spedden.cs.ualberta.ca
Organization: University of Alberta, Edmonton, Canada
References: <10157@baird.cs.strath.ac.uk>
Date: Thu, 13 Aug 1992 18:44:35 GMT
Lines: 28

robert@cs.strath.ac.uk (Robert B Lambert) writes:

>To the NN community in general.
>To what extent do the objections raised in Perceptrons carry over to ANNs
>other than BP? (In particular self-organizing NNs, recurrent NNs, ALNs.)
> ... I believe ANNs only have a future if all the principal objections
>raised in Perceptrons are addressed.

Since ALNs (adaptive logic networks) are special cases of multilayer
perceptrons, they of course can't have any greater theoretical
computing power than the usual multilayer perceptrons, whether used
recurrently or not. On the other hand, with appropriate encodings of
real numbers, ALNs can approximate any continuous function (including,
in particular, those computed by an ordinary multilayer perceptron).
The two thus have equivalent power to realize mappings in the finite
digital domain.
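
To make the "special case" point concrete, here is a minimal sketch
in Python (illustrative only; the function names and weights are
mine, not from any particular ALN implementation): each AND/OR node
in an ALN tree is itself a linear threshold unit, so the whole tree
is just a feedforward net of perceptron units with fixed weights.

def threshold_unit(inputs, weights, bias):
    # Classic perceptron unit: fires iff the weighted sum of the
    # inputs plus the bias is positive.
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s + bias > 0 else 0

def AND(x, y):
    # AND(x, y) = [x + y - 1.5 > 0] for x, y in {0, 1}
    return threshold_unit((x, y), (1, 1), -1.5)

def OR(x, y):
    # OR(x, y) = [x + y - 0.5 > 0]
    return threshold_unit((x, y), (1, 1), -0.5)

def aln_fragment(x1, x2, x3, x4):
    # A two-level ALN fragment, OR of two ANDs -- equivalently, a
    # two-layer perceptron with fixed weights.
    return OR(AND(x1, x2), AND(x3, x4))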

In "Perceptrons", Minsky and Papert place no emphasis on speed of
learning or execution, an attribute that may favor one type of net or
the other in a given practical situation. Their concern was with the
limits of what can be computed at all. The backpropagation and ALN
training procedures, developed since the book was written, have made
it possible to train multilayer nets, but that doesn't give the
resulting nets any more computing power.
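
Incidentally, the classic counterexample from "Perceptrons" makes the
same point: XOR is beyond any single-layer perceptron, yet a
hand-wired two-layer threshold net computes it with no training
procedure at all. A minimal sketch in Python (the weights are the
standard textbook choice, nothing specific to BP or ALNs):

def step(s):
    return 1 if s > 0 else 0

def xor_net(x, y):
    h1 = step(x + y - 0.5)      # hidden unit computing OR(x, y)
    h2 = step(-x - y + 1.5)     # hidden unit computing NAND(x, y)
    return step(h1 + h2 - 1.5)  # output unit computing AND(h1, h2)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "->", xor_net(x, y))  # prints the XOR truth table

The weights above were written down by hand; whether a net of this
architecture is trained by BP, by an ALN procedure, or not at all,
the set of functions it can compute is the same.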

--
***************************************************
Prof. William W. Armstrong, Computing Science Dept.
University of Alberta; Edmonton, Alberta, Canada T6G 2H1
arms@cs.ualberta.ca Tel(403)492 2374 FAX 492 1071