- Newsgroups: comp.ai.neural-nets
- Path: sparky!uunet!zaphod.mps.ohio-state.edu!cs.utexas.edu!news.uta.edu!news.uta.edu!lindahl
- From: lindahl@cse.arl.utexas.edu (Charlie Lindahl)
- Subject: Demos for training & analysis of classification NNs (UTexas@Arlington)
- Message-ID: <LINDAHL.92Nov15225747@cse.arl.utexas.edu>
- Sender: news@utagraph.uta.edu (USENET News System)
- Nntp-Posting-Host: cse.uta.edu
- Organization: Computer Science Engineering Univ. of Texas at Arlington
- Distribution: comp
- Date: Mon, 16 Nov 1992 04:57:47 GMT
- Lines: 82
-
- All:
-
- I'm posting demos for an Electrical Engineering professor here
- at UTexas Arlington (Dr. Mike Manry). Below is the description for the
- demos related to CLASSIFICATION problems. The ABSTRACT file appears below.
-
- I've posted them on me.uta.edu (anon ftp) under /pub/neural/uta-nn-ee.tar
- (this is a TAR file containing ZIP files, and hence should be transferred
- IN BINARY MODE). Interested parties can EMAIL me as to why this file
- wouldn't COMPRESS.
-
- Download procedure:
-
- 1) Use ANONYMOUS FTP in BINARY mode.
- 2) Use the UNIX TAR program to unpack into two ZIP files.
- 3) Use the PKUNZIP PC program to unpack onto the PC.
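- Steps 2 and 3 above can be sketched in Python's standard library
- (tarfile and zipfile stand in for UNIX TAR and PKUNZIP). Since the
- sketch cannot reach the FTP site, it first builds a synthetic
- stand-in for the downloaded TAR-of-ZIPs; only the file names
- (uta-nn-ee.tar, Neucls.zip) come from this post.

```python
import io
import os
import tarfile
import tempfile
import zipfile

workdir = tempfile.mkdtemp()

# --- Build a stand-in for the downloaded TAR-of-ZIPs (illustrative) ---
zip_bytes = io.BytesIO()
with zipfile.ZipFile(zip_bytes, "w") as zf:
    zf.writestr("BP.EXE", b"demo program placeholder")
tar_path = os.path.join(workdir, "uta-nn-ee.tar")
with tarfile.open(tar_path, "w") as tf:
    data = zip_bytes.getvalue()
    info = tarfile.TarInfo("Neucls.zip")
    info.size = len(data)
    tf.addfile(info, io.BytesIO(data))

# --- Step 2: unpack the TAR into its ZIP files ------------------------
with tarfile.open(tar_path) as tf:
    tf.extractall(workdir)

# --- Step 3: unpack each ZIP (PKUNZIP equivalent) ---------------------
with zipfile.ZipFile(os.path.join(workdir, "Neucls.zip")) as zf:
    zf.extractall(workdir)

print(sorted(os.listdir(workdir)))
# → ['BP.EXE', 'Neucls.zip', 'uta-nn-ee.tar']
```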
-
- PLEASE don't send EMAIL directly to me, or post on this bboard (as
- I don't frequent this regularly); rather, send EMAIL directly to
- Dr. Manry at B496MTM@UTARLG.UTA.EDU.
-
- Charlie Lindahl
- Electrical Engineering Dept
- University of Texas at Arlington
- EMAIL : lindahl@cse.uta.edu
- ------------------------------------------------------------------------
- Neucls.zip 123,824 bytes
-
- Four demo programs are included for the training and analysis of
- multilayer perceptron (MLP) neural networks for classification. Such
- networks have sigmoidal hidden units and sigmoidal output units. The
- enclosed programs are demos in that the training data file must have
- 4 or fewer inputs or 16 inputs, 800 or 16 or fewer training
- patterns, and 4 or fewer outputs.
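- A minimal Python sketch of the architecture just described: an MLP
- with sigmoidal hidden AND sigmoidal output units, sized within the
- demo limits (4 inputs, 4 outputs; the hidden-layer size of 8 is an
- assumption, not from this post).

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_forward(x, W_hid, W_out):
    """One forward pass; each weight row carries a trailing bias term."""
    hidden = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0])))
              for row in W_hid]
    return [sigmoid(sum(w * v for w, v in zip(row, hidden + [1.0])))
            for row in W_out]

random.seed(0)
n_in, n_hid, n_out = 4, 8, 4
W_hid = [[random.uniform(-1, 1) for _ in range(n_in + 1)]
         for _ in range(n_hid)]
W_out = [[random.uniform(-1, 1) for _ in range(n_hid + 1)]
         for _ in range(n_out)]

y = mlp_forward([0.2, -0.5, 0.1, 0.9], W_hid, W_out)
# Every output is a sigmoid, so it lies in (0, 1); the predicted class
# is the index of the largest output.
print(y, y.index(max(y)))
```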
-
- Program BP designs MLP classification networks using backpropagation
- (BP). Nonstandard features in this program are (1) weight changes
- can be batched, (2) the learning factor is adaptive, and (3) the
- importance of each input feature is calculated. The mean-squared
- training error (MSE) and classification error percentage are
- calculated and printed for each iteration.
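- A hedged sketch of a BP-style trainer with two of the features
- listed above: batched weight updates and an adaptive learning
- factor. The adaptation rule used here (grow the rate after an
- improving epoch, shrink it otherwise) is a common heuristic, NOT
- necessarily the one Dr. Manry's BP program uses; the toy data,
- network sizes, and omission of the feature-importance calculation
- are likewise illustrative.

```python
import math
import random

random.seed(1)
sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

# Toy 2-input, 2-class problem (within the demo's input limit).
data = [([random.gauss(m, 0.3) for m in mus], cls)
        for cls, mus in [(0, (0.0, 0.0)), (1, (1.0, 1.0))]
        for _ in range(20)]

n_in, n_hid, n_out = 2, 4, 2
W1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
      for _ in range(n_hid)]
W2 = [[random.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]
      for _ in range(n_out)]

lr, prev_mse = 0.5, float("inf")
for it in range(30):
    g1 = [[0.0] * (n_in + 1) for _ in range(n_hid)]
    g2 = [[0.0] * (n_hid + 1) for _ in range(n_out)]
    mse, wrong = 0.0, 0
    for x, cls in data:
        target = [1.0 if k == cls else 0.0 for k in range(n_out)]
        h = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0]))) for row in W1]
        y = [sigmoid(sum(w * v for w, v in zip(row, h + [1.0]))) for row in W2]
        mse += sum((t - o) ** 2 for t, o in zip(target, y))
        wrong += y.index(max(y)) != cls
        # Output-layer deltas, then backpropagate to the hidden layer.
        d_out = [(o - t) * o * (1 - o) for o, t in zip(y, target)]
        d_hid = [h[j] * (1 - h[j])
                 * sum(d_out[k] * W2[k][j] for k in range(n_out))
                 for j in range(n_hid)]
        for k in range(n_out):
            for j, v in enumerate(h + [1.0]):
                g2[k][j] += d_out[k] * v
        for j in range(n_hid):
            for i, v in enumerate(x + [1.0]):
                g1[j][i] += d_hid[j] * v
    # Batched update: apply the accumulated gradients once per epoch.
    for W, g in ((W1, g1), (W2, g2)):
        for row, grow in zip(W, g):
            for i in range(len(row)):
                row[i] -= lr * grow[i] / len(data)
    mse /= len(data)
    # Adaptive learning factor (illustrative rule, see lead-in).
    lr = lr * 1.05 if mse < prev_mse else lr * 0.7
    prev_mse = mse
    print(f"iter {it:2d}  MSE {mse:.4f}  "
          f"class err {100 * wrong / len(data):.1f}%")
```

- Like the BP program described above, each iteration prints both the
- MSE and the classification error percentage.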
-
- Program In2 designs MLPs for classification using a fast technique.
- The resulting networks are often better than those from BP and can
- be designed one to two orders of magnitude faster. The classification
- error percentage and MSE are calculated and printed for each iteration.
-
- Program Mod analyzes MLP networks trained via BP or In2. Less
- useful units can be pruned under user control.
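- The post does not say how Mod ranks units; as an illustration, one
- common proxy is to score each hidden unit by the magnitude of its
- outgoing weights and drop the weakest under user control. The
- function and criterion below are assumptions, not Mod's actual
- method.

```python
def prune_hidden_units(W_hid, W_out, keep):
    """Keep the `keep` hidden units with the largest total outgoing
    weight magnitude; drop the rest from both weight matrices.
    W_hid: one row per hidden unit.
    W_out: one row per output, one column per hidden unit plus a
    trailing bias column."""
    n_hid = len(W_hid)
    score = [sum(abs(row[j]) for row in W_out) for j in range(n_hid)]
    kept = sorted(sorted(range(n_hid), key=lambda j: -score[j])[:keep])
    new_hid = [W_hid[j] for j in kept]
    new_out = [[row[j] for j in kept] + [row[-1]] for row in W_out]
    return new_hid, new_out

# Tiny example: 3 hidden units, the middle one nearly dead.
W_hid = [[0.5, -0.2], [0.01, 0.02], [-0.7, 0.3]]
W_out = [[1.0, 0.001, -0.8, 0.1]]   # 3 hidden columns + bias
h, o = prune_hidden_units(W_hid, W_out, keep=2)
print(len(h), o)   # the near-dead unit (score 0.001) is pruned
```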
-
- Program Watecon converts the weight file from Inmap or BPmap to
- an easily understood formatted form, and vice versa. This should
- allow users to exchange networks with other neural net design
- packages.
- --
- ============================================================================
- "If Unix is the answer, surely | Charlie S. Lindahl
- we've forgotten the question." - | lindahl@cse.uta.edu
- Anonymous | Electrical Engineering Dept
- | University of Texas at Arlington
- ----------------------------------------------------------------------------
- ____/ _/ _/ ___/ / _/____/ _/ ____/ _/_____/
- _/ _/ _/ _/ _/ _/ _/ _/ _/ _/
- _/ _/___/_/ _/___/_/ _/____/ _/ _/ _/_____/
- _/ _/ _/ _/ _/ _/ _/ _/ _/ _/
- _____/ / _/ / _/ / _/ _/____/ ____/ _/______/
- ============================================================================
- Disclaimer: If my employer shares these views, I'd be most surprised.