Neural Net Lab 1.2


Welcome to the Neural Net Lab 1.2. This program was designed to give those without programming experience the ability to experiment with neural networks. It is primarily intended for use in educational settings, perhaps in conjunction with a course on neural nets, though anyone with an interest in nets and a good text on the subject should be able to use the program to explore different types of networks. To get help, click on one of the topics below.

Navigating Neural Net Lab

Net Types

Activation Functions

Neural Net Lab Files

Running an Example

Troubleshooting

References

Acknowledgements

Registration

Licensing Information

Contacting Inner Star


Navigating Neural Net Lab

Operation of the Neural Net Lab is simple, but there are some things you should know to get started. Just type net and press ENTER to start the program. If you want to run in graphics mode, type net /g and then press ENTER.

When the program first loads, you will be presented with a menu of options. To select an option, press the ALT key and the highlighted (i.e., red) letter, or click on the desired option with the mouse. Once you've selected an option, you will be presented with a drop-down menu of choices. Use the up and down arrow keys to highlight your choice and then press the ENTER key. Alternatively, you can press the highlighted letter in your choice or click it with the mouse.

In dialog boxes, use the TAB key to move forward through the items or SHIFT + TAB to move backward. The spacebar or ENTER selects dialog box buttons. In general, ALT + a highlighted letter selects that item. F1 brings up context-sensitive help in most situations.

If you elect to train a new network (the Train option), you will first be asked to select a net type. The program will then prompt you to supply the parameters for that net. Then you will be asked to provide the names of the files that contain the training data and names for the files in which the results will be stored. You can type in a filename or select one from the drop-down list; click the arrow next to a box to view the file list. The program will then train the network. When training is complete, you will be offered the option to test the net on a different data set (a good idea if possible, to see how well the network can generalize its knowledge). Then you will be given the opportunity to view and print the results file.

That's all there is to it! Well, almost. Further details can be found online in the rest of this help file or in the readme.doc file. If you've registered your copy of Neural Net Lab, loads of details can be found in the printed documentation.

Enjoy your exploring!


Return to beginning


Activation Functions

There are three activation functions implemented in Neural Net Lab: bipolar continuous, bipolar discrete, and unipolar continuous. For BPN, you can choose between bipolar and unipolar continuous functions. The bipolar function has the following form:

f(net) = 2/(1 + e^(-lambda*net)) - 1

The unipolar activation function has this form:

f(net) = 1/(1 + e^(-lambda*net))

where net is the sum of the weighted inputs to the neuron and lambda is a parameter that determines the steepness of the function.
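
For example, with net = 1 the unipolar function gives 1/(1 + e^-1), or about 0.73, when lambda = 1, but about 0.98 when lambda = 4; larger values of lambda produce a steeper, more switch-like curve.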

The third activation function, the bipolar discrete function, is used in BAM and ADALINE nets and takes the following form:

f(net) = sgn(net)

This function returns +1 if the sum of the weighted inputs is positive, -1 if it is negative, and 0 if the input is 0.

There is one other "function" implemented in Neural Net Lab: in the ADALINE simulation you can choose to have the net function as a simple linear combiner - the output is simply net, the sum of the weighted inputs.
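
If you would like to experiment with these functions outside the program, here is a minimal Python sketch of all four (the names are my own; this is not Neural Net Lab's internal code):

    import math

    def bipolar_continuous(net, lam=1.0):
        # Smooth function ranging from -1 to +1; lam sets the steepness.
        return 2.0 / (1.0 + math.exp(-lam * net)) - 1.0

    def unipolar_continuous(net, lam=1.0):
        # Smooth function ranging from 0 to +1; lam sets the steepness.
        return 1.0 / (1.0 + math.exp(-lam * net))

    def bipolar_discrete(net):
        # Sign function used by BAM and ADALINE: +1, -1, or 0.
        return 1 if net > 0 else (-1 if net < 0 else 0)

    def linear_combiner(net):
        # ADALINE's optional linear mode: the output is just net itself.
        return net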

Return to beginning


Neural Net Lab Files

The files written and read by Neural Net Lab are simple text files that can be manipulated in text editors such as MS-DOS's Edit or MS-Windows' Notepad. In general, there will be two input files and two output files for each net.

The input files contain the input patterns and the desired classifications for those patterns. Each line in the file represents one pattern, and there is one element on each line for each neuron in the corresponding layer. The patterns within the input and classification files must be in the same order.

Example input and classification files have been provided with the Neural Net Lab. Input files have a .pat extension and classification files have a .des (for desired) extension. For example, the fourth pattern in the xor.pat file is 1 1. Note that the elements of the pattern are separated by a space. In the xor.des file, you can see that the desired classification for this pattern is 0.
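
Assuming the patterns are stored in the usual truth-table order (only the fourth line is confirmed above), the XOR files would look something like this:

    xor.pat      xor.des
    0 0          0
    0 1          1
    1 0          1
    1 1          0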

Each net will generate two output files. The first, or results, file will list the parameters used in training the net, the files it was trained and tested on, and the results for each pattern in the data sets (actual and desired). The second file contains the weight matrices and can be used with the load menu option to reuse the net at a later time.

The input and output files can have any legal MS-DOS file names. If the files are not in the same directory as the Neural Net Lab you must supply the complete pathname. In the case of output files, if you specify the name of an existing file, you will be asked if you want to overwrite it. A single line in an input file cannot exceed 300 characters (including spaces) in length.
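
As a concrete illustration of the format, the short Python sketch below reads a pattern file and its matching classification file into lists of numbers. The function name is my own invention, not something supplied with Neural Net Lab:

    def read_patterns(filename):
        # Each non-blank line is one pattern; elements are separated by spaces.
        patterns = []
        with open(filename) as f:
            for line in f:
                if line.strip():
                    patterns.append([float(x) for x in line.split()])
        return patterns

    # The pattern and classification files must list patterns in the same order.
    inputs = read_patterns("xor.pat")    # four patterns of two elements each
    desired = read_patterns("xor.des")   # four patterns of one element each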

The following example files are provided with Neural Net Lab:

ADALINE Example
The files xor2.pat and or.des allow you to train ADALINE to do a Boolean OR classification using a bipolar discrete activation function. There are two elements to each input pattern and there are four patterns. There is one element per output pattern.
BPN Example
The files xor.pat and xor.des allow you to train a BPN net to do the XOR classification using a unipolar continuous activation function. The files xor2.pat and xor2.des are for the bipolar activation function. There are two elements to each input pattern and there are four patterns. There is one element per output pattern.
CPN and KOH Example
The files letters.pat and letters.des contain the information required to train a CPN or KOH net to recognize three letters (A, T, and E). The letters were constructed in a 5 x 5 matrix, so there are 25 elements per input pattern and 30 such patterns. There are three elements per output pattern. Please note that the letters files can also be used to train a BPN net.
BAM Example
The files bam.pat and bam.des provide a small data set for training a BAM network. There are 16 elements for each of the four input patterns and 7 elements for each output pattern.

Return to beginning


Running an Example

To give you an idea of how to run the Neural Net Lab, I'm going to walk you through one of the example nets. Specifically, we're going to train a BPN net to do the XOR classification.

First, click on Train on the menu bar (or press ALT + T). From the drop-down menu, select BPN. You'll be asked to provide some parameters. For now, use these values: max error = .001, eta = .15, lambda = 1.0, alpha = .5, bias = -1.0, inputs = 2, hidden = 2, outputs = 1, and patterns = 4. Select a unipolar activation function. Check that you have entered the values correctly, then select OK.

In the next dialog box, enter these filenames: Patterns = xor.pat, Desired = xor.des, Results = xor.res, and Weights = xor.net. Alternatively, you could click on the arrow next to a field and select the appropriate file from the drop-down list. Use the TAB key to move to the next field. Select OK to train the net.

While the net is training you'll be treated to an entertaining display of the current cycle number and error. If all goes well, the error value should drop each cycle. When the net is trained, you'll be asked if you want to test on another data set. For this example, answer "no." View or print the results at your discretion. That's all there is to it.
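
Neural Net Lab does all of this work for you, but if you are curious about what happens during those training cycles, here is a rough Python sketch of back-propagation with momentum on the XOR problem, using roughly the same parameters as the walkthrough (eta = .15, lambda = 1.0, alpha = .5, bias input = -1.0, a 2-2-1 layout, and the unipolar activation). It is a generic illustration of the technique, not the program's actual code, and the error measure and stopping rule are my own assumptions:

    import math, random

    def f(net, lam=1.0):
        # Unipolar continuous activation, as chosen in the walkthrough.
        return 1.0 / (1.0 + math.exp(-lam * net))

    random.seed(1)
    eta, lam, alpha, bias = 0.15, 1.0, 0.5, -1.0
    # 2-2-1 net; the third weight in each row multiplies the bias input.
    w_hid = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
    w_out = [random.uniform(-0.5, 0.5) for _ in range(3)]
    dw_hid = [[0.0] * 3 for _ in range(2)]
    dw_out = [0.0] * 3

    patterns = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

    for cycle in range(1, 50001):
        error = 0.0
        for x, d in patterns:
            xi = list(x) + [bias]
            h = [f(sum(w * v for w, v in zip(row, xi)), lam) for row in w_hid]
            hi = h + [bias]
            y = f(sum(w * v for w, v in zip(w_out, hi)), lam)
            error += 0.5 * (d - y) ** 2

            # Error signals for the output and hidden neurons.
            delta_o = (d - y) * lam * y * (1 - y)
            delta_h = [delta_o * w_out[j] * lam * h[j] * (1 - h[j]) for j in range(2)]

            # Weight updates with learning rate eta and momentum alpha.
            for j in range(3):
                dw_out[j] = eta * delta_o * hi[j] + alpha * dw_out[j]
                w_out[j] += dw_out[j]
            for j in range(2):
                for k in range(3):
                    dw_hid[j][k] = eta * delta_h[j] * xi[k] + alpha * dw_hid[j][k]
                    w_hid[j][k] += dw_hid[j][k]

        if error < 0.001:
            print("trained in", cycle, "cycles")
            break
    else:
        # A small 2-2-1 net can occasionally stall in a local minimum;
        # re-running with a different random seed usually fixes it.
        print("stopped after 50000 cycles, error =", error)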

Return to beginning


Troubleshooting

So, your net won't train. What do you do? The answer depends on a lot of things. Hopefully I will have answered your question below. If not, or if you experience a technical problem with the program, contact me via the Internet at 74200.303@compuserve.com or on Compuserve at 74200,303.

The program says my file doesn't exist. Why?
Neural Net Lab expects files to be in its own directory. If they're not in the Neural Net Lab directory, you must specify the complete DOS pathname.
When I open my results file using the FILE:VIEW option, I can't see the whole file. What's up?
There's a 460-line maximum on the files that FILE:VIEW can present. This is a limitation of the tools I used in building the program. I hope to remove this limit in future releases.
I printed my results file and the results were really ugly. How do I fix it?
I assume you mean the formatting. At present, the print function is merely a text dump to the default printer port. To make your printouts prettier, try formatting and printing the file from your word processor.
Enough of the fluff. I tried to train a network on my data set. While I've been waiting for it to train I've gotten married, had 2 kids, 3 grandchildren, and a partridge in a pear tree. The net is still not done. What gives?
A lot of things play into neural net performance. In some ways, the study of networks is more art than science. Here are some possibilities to consider:

Return to beginning