Path: sparky!uunet!mcsun!uknet!strath-cs!robert@cs.strath.ac.uk
From: robert@cs.strath.ac.uk (Robert B Lambert)
Newsgroups: comp.ai.neural-nets
Subject: Perceptrons
Message-ID: <10157@baird.cs.strath.ac.uk>
Date: 13 Aug 92 09:09:48 GMT
Sender: robert@cs.strath.ac.uk
Organization: Comp. Sci. Dept., Strathclyde Univ., Glasgow, Scotland.
Lines: 37

To the NN community in general.

I finally got around to reading "Perceptrons" (by Minsky and Papert, 3rd
edition) and was dismayed at how few of their principal objections have even
been addressed, never mind solved.

I was under the mistaken impression that BP overcame the limitations of the
single-layer perceptron. Reading the book, I realized that BP is actually
more limiting and harder to use than its Nth-order single-layer cousin.

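For anyone without the book to hand, the core separability objection can be
seen in a few lines. This is an illustrative sketch (my own names and code,
not anything from Minsky and Papert): a single threshold unit trained with
the classic perceptron learning rule masters AND, which is linearly
separable, but can never get XOR right, no matter how long it trains.

```python
# Toy demonstration of the linear-separability limitation of a
# single-layer perceptron (one threshold unit, two inputs).

def train_perceptron(samples, epochs=100, lr=1.0):
    """Classic perceptron learning rule; returns (weights, bias)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # +1, 0, or -1
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def accuracy(w, b, samples):
    """Fraction of samples the threshold unit classifies correctly."""
    hits = sum(
        (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == target
        for (x1, x2), target in samples
    )
    return hits / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)
print("AND accuracy:", accuracy(w, b, AND))   # separable: reaches 1.0
w, b = train_perceptron(XOR)
print("XOR accuracy:", accuracy(w, b, XOR))   # not separable: stays below 1.0
```

The weights simply cycle on XOR, since no line through the unit square puts
(0,1) and (1,0) on one side and (0,0) and (1,1) on the other.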
As far as I can see, the objections related to computational complexity, such
as scaling vs. training time (if it learns at all), have not even been tackled
by the NN community.

Is this right? If you know of work or are involved in research which
demonstrates the applicability of NNs to "real-world" and not toy problems,
please e-mail me (references would be appreciated).

To what extent do the objections raised in Perceptrons carry over to ANNs
other than BP? (In particular, self-organizing NNs, recurrent NNs, and ALNs.)
Again, if you know of or are working on ANNs which you believe are practical
for real-world problems (e.g. scalable without an exponential rise in training
time), please e-mail me.

I will post a summary of responses, as this is a very important issue in NN
research. I believe ANNs only have a future if all the principal objections
raised in Perceptrons are addressed.

Thanks in advance,

Robert.

------------------------------------------------------------------------------
Robert B Lambert                            E-Mail robert@cs.strath.ac.uk
Dept. Computer Science,
University of Strathclyde,
Glasgow, Scotland.