From ml-connectionists-request@q.cs.cmu.edu Sat May 1 03:33:43 1993
Received: by cse.uta.edu (5.57/Ultrix2.4-C)
    id AA29693; Sat, 1 May 93 05:33:31 -0500
Received: by Q.CS.CMU.EDU id ab21628; 29 Apr 93 18:33:04 EDT
Received: from Q.CS.CMU.EDU by Q.CS.CMU.EDU id ab21078; 29 Apr 93 15:51:23 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by Q.CS.CMU.EDU id ab20917;
    29 Apr 93 14:52:30 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa07279;
    29 Apr 93 14:50:36 EDT
Received: from EDRC.CMU.EDU by B.GP.CS.CMU.EDU id aa09324;
    29 Apr 93 14:37:24 EDT
Received: from zingo.nj.nec.com by EDRC.CMU.EDU id aa00487;
    29 Apr 93 14:36:54 EDT
Received: by zingo.nj.nec.com (920330.SGI/YDL1.4-910307.16)
    id AA23123(zingo.nj.nec.com); Thu, 29 Apr 93 14:22:00 -0400
Received: by fuzzy (5.52/cliff's joyful mailer #2)
    id AA11990(fuzzy); Thu, 29 Apr 93 14:21:59 EDT
Date: Thu, 29 Apr 93 14:21:59 EDT
From: Lee Giles <giles@research.nj.nec.com>
Message-Id: <9304291821.AA11990@fuzzy>
To: connectionists@cs.cmu.edu
Subject: Reprint: Pruning Recurrent Neural Networks for Improved Generalization
    Performance
Status: R

The following reprint is available from the NEC Research
Institute FTP archive, external.nj.nec.com. Instructions for
retrieving it from the archive follow the abstract. Comments
and remarks are always appreciated.
----------------------------------------------------------------------------------
"Pruning Recurrent Neural Networks
for Improved Generalization Performance"
Christian W. Omlin(a,c) and C. Lee Giles(a,b)
(a) NEC Research Institute, 4 Independence Way, Princeton, NJ 08540
(b) Institute for Advanced Computer Studies, U. of Maryland, College Park, MD 20742
(c) Computer Science Department, Rensselaer Polytechnic Institute, Troy, NY 12180
Determining the architecture of neural networks is an important issue for any
learning task. No general methods exist that allow us to estimate, for recurrent
neural networks, the number of hidden layers, the size of those layers, or the
number of weights. We present a simple heuristic which significantly improves
the generalization performance of recurrent networks. In addition, if rules are
extracted from networks trained to recognize strings of regular languages, this
pruning method permits the extraction of rules that are more consistent with the
rules to be learned. The performance improvement is achieved by pruning and
retraining the networks. Our simulation results for non-trivial grammars show
that our simple method is effective and that its improvement is superior to
that obtained by training with weight decay.
Revised Technical Report No. 93-6, April 1993, Computer Science Department,
Rensselaer Polytechnic Institute, Troy, N.Y.
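
The report itself gives the authors' exact heuristic; purely as a generic
illustration of the prune-and-retrain idea sketched in the abstract, the
Python fragment below masks out the smallest-magnitude recurrent weights of
a toy Elman-style network. The network, its dimensions, and the pruning
fraction are illustrative assumptions, not taken from the paper.

import numpy as np

# Toy Elman-style recurrent cell; all sizes are illustrative.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 8
W_in  = rng.normal(0.0, 0.5, (n_hid, n_in))   # input weights
W_rec = rng.normal(0.0, 0.5, (n_hid, n_hid))  # recurrent weights
mask  = np.ones_like(W_rec)                   # 1 = weight kept, 0 = pruned

def run(xs):
    # Run the recurrent cell over a sequence, with pruned weights masked out.
    h = np.zeros(n_hid)
    for x in xs:
        h = np.tanh(W_in @ x + (W_rec * mask) @ h)
    return h

def prune_smallest(fraction):
    # Zero out the given fraction of the smallest-magnitude surviving
    # recurrent weights.
    alive = np.abs(W_rec[mask == 1])
    cutoff = np.quantile(alive, fraction)
    mask[np.abs(W_rec) < cutoff] = 0

xs = rng.normal(size=(5, n_in))   # a random 5-step input sequence
print(run(xs))
prune_smallest(0.2)
print(run(xs))
print("active recurrent weights:", int(mask.sum()), "of", mask.size)

In a prune-and-retrain scheme, the surviving weights would be retrained on
the original task after each pruning step, stopping once generalization on
held-out strings no longer improves.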
-------------------------------------------------------------------------------------
FTP INSTRUCTIONS
unix> ftp external.nj.nec.com (138.15.10.100)
Name: anonymous
Password: (your_userid@your_site)
ftp> cd pub/giles/papers
ftp> binary
ftp> get prune.ps.Z
ftp> quit
unix> uncompress prune.ps.Z
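
For anyone scripting the retrieval rather than typing it, here is a minimal
equivalent sketch using Python's standard ftplib, assuming the host and path
given above (whether the 1993 archive still answers is another matter):

from ftplib import FTP

# Anonymous FTP fetch of the compressed PostScript reprint; retrbinary
# switches the connection to binary mode, matching the "binary" step above.
ftp = FTP("external.nj.nec.com")   # 138.15.10.100 in the instructions above
ftp.login("anonymous", "your_userid@your_site")
ftp.cwd("pub/giles/papers")
with open("prune.ps.Z", "wb") as f:
    ftp.retrbinary("RETR prune.ps.Z", f.write)
ftp.quit()

Decompress the result as above with uncompress (or gunzip, which also reads
.Z files).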
----------------------------------------------------------------------------------------
--
C. Lee Giles / NEC Research Institute / 4 Independence Way
Princeton, NJ 08540 / 609-951-2642 / Fax 2482