- Path: sparky!uunet!know!hri.com!ukma!darwin.sura.net!paladin.american.edu!auvm!VAXF.COLORADO.EDU!POWERS_W
- From: POWERS_W%FLC@VAXF.COLORADO.EDU (William T. Powers)
- Newsgroups: bit.listserv.csg-l
- Subject: CONNECTIONISM AND HPCT
- Message-ID: <01GQSJN2CGN600O5V1@VAXF.COLORADO.EDU>
- Date: 5 Nov 92 19:31:27 GMT
- Sender: "Control Systems Group Network (CSGnet)" <CSG-L@UIUCVMD.BITNET>
- Lines: 68
- Comments: Gated by NETNEWS@AUVM.AMERICAN.EDU
- X-Envelope-to: CSG-L@vmd.cso.uiuc.edu
- X-VMS-To: @CSG
- MIME-version: 1.0
- Content-transfer-encoding: 7BIT
-
-
- [From Bill Powers (921104.2100)]
-
- Martin Taylor (921104.1340) --
-
- >Connectionism depends at root on the distribution of
- >representation, not on whether some elements have a continuous
- >rather than a discrete range of variation. Your version of HPCT
- >isolates the responsibility for the control of particular percepts
- >to particular ECSs. We would not do that, but would share the
- >responsibility through overlapping quasi-modular groups of ECSs.
- >Your version is to connectionist HPCT as classical AI is to
- >connectionist classifiers. And it leads to related kinds of
- >problem.
-
- The difference you see depends on the dimension you're attending to.
- For me, the difference between AI and connectionism is that
- connectionist models let the signals representing variables be the
- important thing, whereas in AI all variables had to be converted
- first into symbols (words, mainly) before they could be operated
- upon; the operations were then carried out by rule-driven algorithms
- for symbol manipulation instead of by computing devices that handle
- signals directly. To me, that's the difference between analogue and
- digital computing.
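-
- To make the contrast concrete, here is a minimal sketch (Python;
- every name, weight, and rule in it is invented for illustration, not
- taken from any particular AI or connectionist system):
-
- import math
-
- # "Analogue"/connectionist style: continuous signals flow through a
- # computing function; the magnitudes of the signals are what matter.
- def analog_unit(inputs, weights):
-     s = sum(w * x for w, x in zip(weights, inputs))
-     return math.tanh(s)  # a graded, continuous output signal
-
- # "Digital"/AI style: variables are first converted into symbols,
- # which are then manipulated by explicit rules.
- RULES = {("hot", "dry"): "desert", ("hot", "wet"): "jungle"}
-
- def symbolic_unit(temperature, humidity):
-     temp_sym = "hot" if temperature > 30.0 else "cold"
-     hum_sym = "wet" if humidity > 0.5 else "dry"
-     return RULES.get((temp_sym, hum_sym), "unknown")
-
- print(analog_unit([0.7, 0.2], [1.5, -0.8]))  # graded signal
- print(symbolic_unit(35.0, 0.1))              # discrete symbol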
-
- To represent a system with fixed properties, it doesn't matter
- whether you use a distributed network or an equivalent set of
- individual functions. BCP, p. 39:
-
- "It is convenient to think of the brain as a collection of localized
- functions, and of neural signals as occurring in definite pathways
- linking functions together. The model, however, will not be
- invalidated if these elements prove some day to be distributed over
- large volumes of the brain. The organizational properties of this
- model do not depend on its geometrical properties."
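-
- The fixed-properties point can be shown directly: for a network
- whose weights no longer change, the distributed version and a single
- collapsed input-output function are indistinguishable from outside.
- A small sketch (linear units for simplicity, so the collapsed
- function is one matrix; all the numbers are arbitrary):
-
- # A fixed two-layer linear network, written with distributed weights.
- W1 = [[0.5, -0.2], [0.3, 0.8]]   # first layer
- W2 = [[1.0, 0.4]]                # second layer
-
- def matmul(A, B):
-     return [[sum(a * b for a, b in zip(row, col))
-              for col in zip(*B)] for row in A]
-
- def matvec(A, v):
-     return [sum(a * x for a, x in zip(row, v)) for row in A]
-
- M = matmul(W2, W1)  # the equivalent localized function
-
- x = [0.6, -1.2]
- print(matvec(W2, matvec(W1, x)))  # distributed computation
- print(matvec(M, x))               # one collapsed function, same output
-
- With nonlinear units the collapse is no longer a single matrix, but
- the composed mapping is still one fixed function, so the
- organizational claim is unchanged.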
-
- If we want to account for the way these functions come into being,
- then the network representations will probably be necessary. During
- maturation, axons grow in ways that depend on what the system is
- doing, and even after maturation is complete, synapses appear, change
- their properties, and disappear.
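-
- If what must be modeled is the change itself, a fixed function won't
- serve. A toy illustration of the kind of process meant here, using a
- simple Hebbian-style rule (the rule and its constants are invented
- for illustration, not a claim about real synapses):
-
- # Weights strengthen when pre- and post-synaptic activity coincide,
- # and decay slowly otherwise.
- LEARNING_RATE = 0.1
- DECAY = 0.01
-
- def step(weights, inputs):
-     output = sum(w * x for w, x in zip(weights, inputs))
-     new_w = [w + LEARNING_RATE * output * x - DECAY * w
-              for w, x in zip(weights, inputs)]
-     return new_w, output
-
- weights = [0.1, 0.1]
- for _ in range(20):
-     weights, out = step(weights, [1.0, 0.0])  # repeated co-activation
-
- print(weights)  # the active connection grows; the idle one decays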
-
- I think that such networks have limited scope, perhaps the size of
- a sensory or motor nucleus. On a larger scale there is clearly an
- architecture composed of separate modules and separate layers. The types of
- neurones are different in different modules of the brain; the brain
- is not just one huge network composed of identical elements operating
- by identical principles.
-
- The connectionist models I have seen do not impress me as much as
- they impress their inventors. A great deal of subjective
- interpretation is involved in saying that a network "classifies" its
- inputs, or even that it "recognizes" a form. What these networks
- actually do is a lot simpler than that: given a set of inputs, they
- will produce certain outputs over a range of the inputs. To label
- this process "classification" implies first that one already knows what
- classification is, and second that there is no other kind of
- perceptual operation of any importance. I reject both implications as
- unwarranted, the first because there has been no careful
- investigation of the elements of perception (of the kind I have tried
- to develop) and the second because it is obvious, at least to me,
- that a great deal more than classification goes on in perception. I
- think that connectionists are trying to accomplish in one jump what
- the real perceptual system does stage by stage. This may be possible,
- to some extent, but this sort of modeling will necessarily fail to
- account for perceptions of both lower and higher levels than
- classifications: perception of motion, for example, or perception of
- principles.
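-
- To put that weaker claim in concrete terms, here is literally what
- one such unit does (a sketch with arbitrary weights): it emits a
- graded output over a range of inputs, and nothing in the computation
- itself says "class."
-
- import math
-
- def unit(x, w=(2.0, -1.0), bias=-0.5):
-     # weighted sum through a squashing function; the output varies
-     # smoothly over the whole input range
-     return 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + bias)))
-
- for x in [(0.0, 0.0), (0.5, 0.2), (1.0, 0.4), (1.5, 0.6)]:
-     print(x, round(unit(x), 3))
-
- # Whether an output above 0.5 "means" membership in a class is the
- # modeler's interpretation, not something the unit itself computes.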
-