- Newsgroups: comp.ai.neural-nets
- Path: sparky!uunet!zaphod.mps.ohio-state.edu!cs.utexas.edu!news.uta.edu!news.uta.edu!lindahl
- From: lindahl@cse.arl.utexas.edu (Charlie Lindahl)
- Subject: Demos for training & analysis of mapping NNs from UTexas@Arlington
- Message-ID: <LINDAHL.92Nov15225532@cse.arl.utexas.edu>
- Sender: news@utagraph.uta.edu (USENET News System)
- Nntp-Posting-Host: cse.uta.edu
- Organization: Computer Science Engineering Univ. of Texas at Arlington
- Distribution: comp
- Date: Mon, 16 Nov 1992 04:55:32 GMT
- Lines: 92
-
- All:
-
- I'm posting demos on behalf of an Electrical Engineering professor here
- at UTexas Arlington (Dr. Mike Manry). The ABSTRACT file describing the
- demos related to MAPPING appears below.
-
- I've posted the file on me.uta.edu (anonymous FTP) under
- /pub/neural/uta-nn-ee.tar. This is a TAR file containing ZIP files, and
- hence it should be transferred IN BINARY MODE. (Interested parties can
- EMAIL me if they know why this file wouldn't COMPRESS.)
-
- Download procedure (a scripted sketch follows this list):
-
- 1) Use ANONYMOUS FTP in BINARY mode.
- 2) Use the UNIX TAR program to unpack the archive into two ZIP files.
- 3) Use PKUNZIP on the PC to unpack the ZIP files.
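-
- For those who prefer to script the transfer, here is a minimal Python
- sketch of these three steps (assumptions: the host me.uta.edu and the
- path /pub/neural/uta-nn-ee.tar given above; the local file and
- directory names are hypothetical):
-
-   from ftplib import FTP
-   import os
-   import tarfile
-   import zipfile
-
-   # 1) Anonymous FTP in binary mode (retrbinary transfers raw bytes).
-   with FTP("me.uta.edu") as ftp:
-       ftp.login()                                   # anonymous login
-       with open("uta-nn-ee.tar", "wb") as f:
-           ftp.retrbinary("RETR /pub/neural/uta-nn-ee.tar", f.write)
-
-   # 2) Unpack the TAR file into its ZIP files.
-   with tarfile.open("uta-nn-ee.tar") as tar:
-       tar.extractall("uta-nn-ee")
-       zips = [m.name for m in tar.getmembers()
-               if m.name.lower().endswith(".zip")]
-
-   # 3) Unpack each ZIP file (the post uses PKUNZIP on the PC for this).
-   for name in zips:
-       with zipfile.ZipFile(os.path.join("uta-nn-ee", name)) as zf:
-           zf.extractall("uta-nn-ee")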
-
- PLEASE don't send EMAIL about the demos directly to me, or post on
- this bboard (as I don't read it regularly); rather, send EMAIL directly
- to Dr. Manry at B496MTM@UTARLG.UTA.EDU.
-
- Charlie Lindahl
- Electrical Engineering Dept
- University of Texas at Arlington
- EMAIL : lindahl@cse.uta.edu
- ------------------------------------------------------------------------
- File: Neumap.zip    Size: 206,746 bytes
-
- Five demo programs are included, for the training and analysis of
- multilayer perceptron (MLP) neural networks for mapping. Such
- networks have sigmoidal hidden units and linear activations in
- the output units. The enclosed programs are demos in that the
- training data file must contain exactly 300 training patterns, and no
- more than 4 inputs are allowed.
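-
- To make the network form concrete, here is a minimal forward-pass
- sketch in Python/NumPy (the layer sizes and weight names are
- illustrative, not taken from the package):
-
-   import numpy as np
-
-   def mlp_forward(x, W1, b1, W2, b2):
-       """One hidden layer: sigmoidal hidden units, linear output units."""
-       h = 1.0 / (1.0 + np.exp(-(x @ W1 + b1)))   # sigmoidal hidden layer
-       return h @ W2 + b2                         # linear output layer
-
-   # 4 inputs (the demo limit), 8 hidden units, 2 outputs (all illustrative).
-   rng = np.random.default_rng(0)
-   W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
-   W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)
-   y = mlp_forward(rng.normal(size=4), W1, b1, W2, b2)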
-
- Program BPmap designs MLP mapping networks using backpropagation
- (BP). Nonstandard features of this program are that (1) batching of
- weight changes is allowed, (2) the learning factor is adaptive, and
- (3) the importance of each input feature is calculated. The mean-squared
- error (MSE) is calculated and printed for each iteration. The relative
- root mean-squared error (RRMSE) is calculated and printed after the
- data has been processed.
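-
- As an illustration of features (1) and (2) (not BPmap's actual
- algorithm), here is a sketch of batch backpropagation with a simple
- adaptive learning factor for the network form above; the rule of
- growing the step while the MSE falls and shrinking it otherwise, and
- the omission of feature (3), are my own simplifications:
-
-   import numpy as np
-
-   def train_bp(X, T, n_hidden=8, n_iter=100, lr=0.1):
-       rng = np.random.default_rng(0)
-       n_in, n_out = X.shape[1], T.shape[1]
-       W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
-       b1 = np.zeros(n_hidden)
-       W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))
-       b2 = np.zeros(n_out)
-       prev_mse = np.inf
-       for it in range(n_iter):
-           H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))  # sigmoidal hidden layer
-           Y = H @ W2 + b2                           # linear output layer
-           E = Y - T
-           mse = np.mean(E ** 2)
-           print(f"iteration {it + 1}: MSE = {mse:.6f}")
-           # Adaptive learning factor: speed up while the error keeps falling.
-           lr = lr * 1.05 if mse < prev_mse else lr * 0.5
-           prev_mse = mse
-           # Batched weight changes: gradients accumulated over all patterns.
-           dW2 = H.T @ E / len(X)
-           db2 = E.mean(axis=0)
-           dH = (E @ W2.T) * H * (1.0 - H)           # through the sigmoids
-           dW1 = X.T @ dH / len(X)
-           db1 = dH.mean(axis=0)
-           W1 -= lr * dW1; b1 -= lr * db1
-           W2 -= lr * dW2; b2 -= lr * db2
-       return W1, b1, W2, b2
-
-   # Demo-sized example: 300 training patterns, 4 inputs, 1 output.
-   rng = np.random.default_rng(1)
-   X = rng.normal(size=(300, 4))
-   train_bp(X, np.sin(X[:, :1]), n_iter=50)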
-
- Program Inmap designs MLPs using a fast training technique. The
- resulting networks are often better than those from BPmap and can be
- designed one to two
- orders of magnitude faster. The RRMSE is calculated and printed
- after the data has been processed.
-
- Program Modc analyzes MLP networks trained via BPmap or Inmap.
- Less useful units can be pruned under user control.
-
- Program Test3 applies a trained MLP to a data file, and writes the
- output layer signals and the desired outputs to a user-chosen disk
- file. The RRMSE is calculated and printed after the data has been
- processed.
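-
- The post does not give the RRMSE formula; the sketch below assumes one
- common definition, the RMS output error normalized by the RMS deviation
- of the desired outputs from their mean. A Test3-like run might then
- look like this (file and variable names hypothetical):
-
-   import numpy as np
-
-   def rrmse(Y, T):
-       """Root MSE of the outputs, relative to the spread of the targets."""
-       return np.sqrt(np.mean((Y - T) ** 2) /
-                      np.mean((T - T.mean(axis=0)) ** 2))
-
-   rng = np.random.default_rng(2)
-   T = rng.normal(size=(300, 1))                    # desired outputs
-   Y = T + 0.1 * rng.normal(size=T.shape)           # stand-in network outputs
-   np.savetxt("test3_out.txt", np.hstack([Y, T]))   # outputs beside targets
-   print("RRMSE =", rrmse(Y, T))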
-
- Program Watecon converts the weight file from Inmap or BPmap to an
- easily understood formatted form, and vice versa. This should allow
- users to exchange networks with other neural net design packages.
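-
- The weight-file layouts themselves are not described in this post, so
- the following only sketches the general idea of such a conversion:
- write each weight array under a label with its shape so that another
- program (or a person) can read it back. The format is hypothetical,
- not Watecon's actual one.
-
-   import numpy as np
-
-   def write_readable(path, arrays):
-       """Write named weight arrays in a labelled, human-readable text form."""
-       with open(path, "w") as f:
-           for name, arr in arrays.items():
-               arr = np.atleast_2d(arr)
-               f.write(f"{name} {arr.shape[0]} {arr.shape[1]}\n")
-               for row in arr:
-                   f.write(" ".join(f"{v:.6f}" for v in row) + "\n")
-
-   write_readable("weights.txt", {"W1": np.zeros((4, 8)), "b1": np.zeros(8)})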
-
- --
- ============================================================================
- "If Unix is the answer, surely | Charlie S. Lindahl
- we've forgotten the question." - | lindahl@cse.uta.edu
- Anonymous | Electrical Engineering Dept
- | University of Texas at Arlington
- ----------------------------------------------------------------------------
- ____/ _/ _/ ___/ / _/____/ _/ ____/ _/_____/
- _/ _/ _/ _/ _/ _/ _/ _/ _/ _/
- _/ _/___/_/ _/___/_/ _/____/ _/ _/ _/_____/
- _/ _/ _/ _/ _/ _/ _/ _/ _/ _/
- _____/ / _/ / _/ / _/ _/____/ ____/ _/______/
- ============================================================================
- Disclaimer: If my employer shares these views, I'd be most surprised.