From ml-connectionists-request@q.cs.cmu.edu Thu May 20 04:54:14 1993
Received: by cse.uta.edu (5.57/Ultrix2.4-C)
id AA08478; Thu, 20 May 93 06:54:06 -0500
Received: from Q.CS.CMU.EDU by q.cs.CMU.EDU id aa28213; 19 May 93 22:50:50 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by Q.CS.CMU.EDU id aa28200;
19 May 93 22:20:52 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa21633;
19 May 93 22:20:18 EDT
Received: from CS.CMU.EDU by B.GP.CS.CMU.EDU id aa02126; 19 May 93 16:42:27 EDT
Received: from max.ee.lsu.edu by CS.CMU.EDU id aa11084; 19 May 93 16:41:58 EDT
Received: by max.ee.lsu.edu (4.0/1.34)
id AA28051; Wed, 19 May 93 15:41:28 CDT
Date: Wed, 19 May 93 15:41:28 CDT
From: John Pastor <pastor@max.ee.lsu.edu>
Message-Id: <9305192041.AA28051@max.ee.lsu.edu>
To: connectionists@cs.cmu.edu
Status: R
The following technical report is now available. If you would
like to have a copy, please let me know.
------------------------------------------------------------------
Technical Report ECE/LSU 93-04
Another Alternative to Backpropagation:
A One Pass Classification Scheme for Use
with the Kak algorithm
John F. Pastor
Department of Electrical and Computer Engineering
Louisiana State University
Baton Rouge, La. 70803
April 26, 1993
email: pastor@max.ee.lsu.edu
ABSTRACT
Kak[1] provides a new technique for designing and training a
feedforward neural network. Training with the Kak algorithm is much
faster, and much easier to implement, than training with the
backpropagation algorithm[2]. The Kak algorithm calls for the
construction of a network with one hidden layer. Each hidden neuron
classifies an input vector in the training set that maps to a nonzero
output vector. Kak[1] also presents two classification algorithms.
The first, CC1, provides generalization comparable to
backpropagation[2] but may require numerous passes through the
training set to classify one input vector. The second, CC2, requires
only inspection of the vector to be classified but does not provide
generalization. An extension of CC2 is suggested as a new
classification scheme that classifies an input vector in only one
pass through the training set yet provides generalization. Simulation
results are presented demonstrating that the new classification
scheme not only significantly reduces training time but also provides
better generalization than classifying with CC1. The Kak algorithm,
used with this new classification scheme, is thus an even better
alternative to backpropagation.
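As a rough illustration of the corner-classification idea behind CC2,
here is a minimal sketch in Python. The weight construction follows
Kak's published CC2 rule (isolate each training vector as a corner of
the hypercube); the `radius` knob that buys generalization is only an
assumed stand-in for the extension the report proposes, whose details
the abstract does not spell out.

    import numpy as np

    def cc2_hidden_unit(x):
        # One hidden unit isolating the binary training vector x:
        # weight +1 where x is 1, -1 where x is 0, so the weighted
        # sum equals sum(x) only at x itself; the threshold
        # sum(x) - 1 then separates x from every other binary vector.
        w = np.where(x == 1, 1, -1)
        theta = x.sum() - 1
        return w, theta

    def classify(y, units, radius=0):
        # Lowering each threshold by `radius` lets any vector within
        # Hamming distance `radius` of a stored corner fire the unit,
        # which is the crude generalization knob assumed here.
        return [int(w @ y > theta - radius) for w, theta in units]

    # Store two corners of the 4-cube, then probe near the first one.
    X = np.array([[1, 0, 1, 1],
                  [0, 1, 0, 0]])
    units = [cc2_hidden_unit(x) for x in X]
    probe = np.array([1, 0, 1, 0])           # Hamming distance 1 from X[0]
    print(classify(probe, units, radius=0))  # [0, 0]: exact corner only
    print(classify(probe, units, radius=1))  # [1, 0]: generalizes to X[0]

Note that each unit is built from a single look at one training
vector, which is what makes the one-pass claim plausible.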
From ml-connectionists-request@q.cs.cmu.edu Wed May 19 21:49:51 1993
Received: by cse.uta.edu (5.57/Ultrix2.4-C)
id AA06173; Wed, 19 May 93 23:49:44 -0500
Received: from Q.CS.CMU.EDU by q.cs.CMU.EDU id ab28148; 19 May 93 22:29:10 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by Q.CS.CMU.EDU id ab28145;
19 May 93 22:03:55 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa21606;
19 May 93 22:03:14 EDT
Received: from CS.CMU.EDU by B.GP.CS.CMU.EDU id aa25789; 19 May 93 3:13:51 EDT
Received: from cssu28.ust.hk by CS.CMU.EDU id aa05799; 19 May 93 2:15:50 EDT
Received: by cssu28.cs.ust.hk (4.1/SMI-4.1)
id AA21563; Wed, 19 May 93 14:14:10 HKT
Date: Wed, 19 May 93 14:14:10 HKT
From: "Dr. Michael D. Stiber" <stiber@cs.ust.hk>
Message-Id: <9305190614.AA21563@cs.ust.hk>
To: Connectionists@cs.cmu.edu
Subject: Paper in neuroprose: Learning In Neural Models With Complex Dynamics
Ftp-Host: archive.cis.ohio-state.edu
Ftp-Filename: /pub/neuroprose/stiber.dynlearn.ps.Z
Status: R
The following preprint has been placed in the Neuroprose archives at
Ohio State (filename: stiber.dynlearn.ps.Z). If you cannot use FTP, I
can email the file to you.
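For anonymous FTP, the usual Neuroprose recipe should work (host and
path are from the headers above; exact prompts vary by client):

    ftp archive.cis.ohio-state.edu
    Name: anonymous
    Password: <your e-mail address>
    ftp> cd pub/neuroprose
    ftp> binary
    ftp> get stiber.dynlearn.ps.Z
    ftp> quit
    uncompress stiber.dynlearn.ps.Z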
"Learning In Neural Models With Complex Dynamics" (4 pages)
Michael Stiber
Department of Computer Science
The Hong Kong University of Science and Technology
Clear Water Bay, Kowloon, Hong Kong
stiber@cs.ust.hk
Jose P. Segundo
Department of Anatomy and Cell Biology
and Brain Research Institute
University of California
Los Angeles, California 90024, USA
iaqfjps@mvs.oac.ucla.edu
Abstract
Interest in the ANN field has recently focused on dynamical neural
{\em networks}: for performing temporal operations, as more realistic
models of biological information processing, and as a way to extend
ANN learning techniques. While this represents a step towards
realism, it is important to note that {\em individual} neurons are
complex dynamical systems, interacting through nonlinear,
nonmonotonic connections. The result is that the ANN concept of
{\em learning}, even when applied to a single synaptic connection, is
a nontrivial subject.
Based on recent results from living and simulated neurons, a first
pass is made at clarifying this problem. We summarize how synaptic
changes in a two-neuron, single-synapse neural network can change system
behavior and how this constrains the type of modification scheme that
one might want to use for realistic neuron-like processors.
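To make the point concrete, here is a minimal sketch (Python; all
parameters assumed, and emphatically not the model from the paper) of
how a single synaptic parameter can reorganize the discharge of a
pacemaker neuron, a setting close in spirit to the two-neuron,
single-synapse case above:

    import numpy as np

    def pacemaker_isis(g_syn, t_end=2000.0, dt=0.05):
        # Leaky integrate-and-fire pacemaker (fires on its own) driven
        # by a periodic 1 ms inhibitory pulse of strength g_syn.
        # Returns the interspike intervals (ISIs) it produces.
        v, tau, v_thresh, i_drive = 0.0, 10.0, 1.0, 0.15
        syn_period = 11.0                  # input period in ms (assumed)
        spikes, t = [], 0.0
        while t < t_end:
            inhibition = g_syn if (t % syn_period) < 1.0 else 0.0
            v += dt * (-v / tau + i_drive - inhibition)
            if v >= v_thresh:
                spikes.append(t)
                v = 0.0                    # reset after each spike
            t += dt
        return np.diff(spikes)

    # The same synapse at different strengths yields different
    # discharge statistics (regular, locked, or irregular firing).
    for g in (0.0, 0.05, 0.2):
        isis = pacemaker_isis(g)
        print(f"g_syn={g:.2f}: mean ISI {isis.mean():.1f} ms, "
              f"CV {isis.std() / isis.mean():.2f}")

Because the cell's behavior depends nonmonotonically on the synaptic
strength, a simple "strengthen to reinforce" learning rule has no
guaranteed effect on the output pattern, which is the difficulty the
abstract raises.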
Dr. Michael Stiber stiber@cs.ust.hk
Department of Computer Science tel: (852) 358 6981
The Hong Kong University of Science & Technology fax: (852) 358 1477
Clear Water Bay, Kowloon, Hong Kong