- Newsgroups: comp.ai.neural-nets
- Path: sparky!uunet!rosevax!flower!scotbri
- From: scotbri@flower.rosemount.com (Scott Brigham)
- Subject: Matlab program for Backpropagation
- Message-ID: <1992Sep8.150748.22368@rosevax.rosemount.com>
- Originator: scotbri@flower
- Sender: news@rosevax.rosemount.com (USENET News administrator)
- Nntp-Posting-Host: flower
- Organization: Rosemount, Inc.
- Date: Tue, 8 Sep 1992 15:07:48 GMT
- Lines: 156
-
- Some folks were interested in a Matlab program for backprop. Here it is.
-
- ------------------------------------------------------------------------------
-
- NOTES ON USING bp2.m
-
- This is a Matlab program to train a feedforward network with one hidden layer.
- It uses the standard backprop algorithm with momentum.
- The input vector file and output vector file should be in ASCII, with
- each vector on a separate line (see the Matlab user's manual for a more
- complete description of Matlab ASCII data files). Obviously, the number of
- vectors in the input file must match the number in the output file.
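-
- For example, a hypothetical invec1.dat holding four 3-input vectors (the
- values here are made up purely for illustration) would just be four lines
- of whitespace-separated numbers:
-
-     0.1   0.5  -0.3
-     0.9  -0.2   0.4
-    -0.7   0.0   0.8
-     0.3   0.6  -0.1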
-
- Each neuron uses the hyperbolic tangent squashing function, so its output
- range is -1 to +1.
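- (A handy identity used later in the code: the derivative of tanh(x) is
- 1 - tanh(x)^2, which factors as (1 - out) .* (1 + out); that is where those
- terms in the delta expressions come from.)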
-
- When the program asks for a file name, enter it without the suffix.
- For example, if your input vector file is called invec1.dat you would do this:
-
- Enter filename of input vectors: invec1
-
- The output error tolerance should be entered as an absolute error value; for
- example, if you want all training vectors to be within .018 of the target
- values, you would enter .018 when asked (if you use +/- .9 as your usable
- range, the full range is 1.8 wide, so .018 represents 1.0% of it). This
- tolerance is what determines when the program stops. If you want it to do
- something different, then by all means get in there and hack!
-
- This is a pretty bare-bones program. It's a good starting point to customize
- for your own needs, add cool graphing stuff, etc. You'll have to write your
- own version for actually running the network after it's trained; it's pretty
- easy to modify a copy of this program to do that -- just delete the learning
- section and clean up the loose ends. A sketch of such a recall script follows.
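-
- As a rough illustration (not part of bp2.m), a minimal recall script might
- look like this, assuming the trained weights were saved under the hypothetical
- name wts and the inputs are in invec1.dat as in the example above:
-
-     load wts                        % restores W1 and W2 saved by bp2.m
-     load invec1.dat                 % input vectors, one per line
-     in = invec1;
-     [nov, b] = size(in);
-     bias = ones(nov,1);             % constant 1 input supplies the bias
-     out1 = tanh([in bias] * W1);    % hidden layer outputs
-     out2 = tanh([out1 bias] * W2);  % network outputs, one row per vector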
-
- If you want linear neurons in the output layer, delete the tanh() call from
- the out2 expression and delete the (1 - out2) .* (1 + out2) factor from the
- delta2 expression.
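-
- With those two edits, the lines in question would read:
-
-     out2 = [out1 bias] * W2;        % linear output neurons
-     delta2 = (target - out2);       % a linear unit's derivative is 1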
-
- Scott Brigham
- Rosemount, Inc.
- 12001 Technology Drive
- Eden Prairie, MN 55344
- scotbri@rosemount.com
-
-
- ---------------------------- cut here --------------------------------------
- %
- % bp2 -- backpropagation training algorithm for a
- % two-layer neural network.
- %
- % written by Scott Brigham
- % copyright 1991, 1992 by Rosemount, Inc.
- %
- % 27 Feb 91 -- created
- % 5 Mar 91 -- added trainable bias
- % 8 Mar 91 -- added error tracking
- % 14 Mar 91 -- added momentum
- % 1 Apr 91 -- changed output range to +/- 1
- % 17 Apr 91 -- changed from sigmoid to tanh outputs
- % 19 Apr 91 -- added load matrices option
- % 29 Aug 91 -- added K factor to change limits of tanh function
- % 21 Feb 92 -- replaced 'for' loop with vectorized version
- %
- fprintf('\n\nTwo-layer Backprop Network Creator and Trainer\n')
- fprintf('Version 2.0 written by Scott Brigham\n\n')
- fprintf('Copyright 1991,1992 by Rosemount, Inc.\n')
- rand('uniform'); % legacy call selecting Matlab's uniform generator
- rrange = .5; % range for random weight initialization
- epoch_disp = 100; % print error statistics every epoch_disp epochs
-
- mc = menu('Initialization Method','Manual','Execute M-file');
-
- if mc == 2
-
- initfile = input('Enter M-file name: ','s');
- eval(initfile);
-
- elseif mc == 1
-
- vecfile = input('Enter filename of input vectors: ','s');
- targfile = input('Enter filename of target vectors: ','s');
- eval(['load ',vecfile,'.dat'])
- eval(['load ',targfile,'.dat'])
- eval(['in=',vecfile,';']);
- eval(['target=',targfile,';']);
- eval(['clear ',vecfile,' ',targfile])
- [nov, b] = size(in); % nov = number of vectors, b = number of inputs
- [junk,d] = size(target); % d = number of output neurons
-
- fprintf('\nThis network has %.0f inputs and %.0f output neurons.\n\n', ...
- b,d);
-
- mc = menu('Choice for Starting Weights','Random','Load from file');
- if mc == 1
- c = input('How many neurons in the hidden layer? ');
- W1 = rrange * 2 * (rand(b+1,c) - .5);
- W2 = rrange * 2 * (rand(c+1,d) - .5);
- elseif mc == 2
- wfile = input('Enter name of weight file: ','s');
- eval(['load ',wfile])
- [junk,c] = size(W1);
- fprintf('This network has %.0f neurons in the hidden layer.\n',c);
- end
- tolerance = input('Output error tolerance: ');
- eta = input('Learning rate: ');
- alpha = input('Momentum (0 - .95): ');
- weightfile = input('Filename to save weight matrices in: ','s');
-
- end
-
- dW1old = zeros(size(W1)); % previous weight updates, for the momentum term
- dW2old = zeros(size(W2));
- bias = ones(nov,1); % constant 1 input gives each neuron a trainable bias
- error = zeros(size(target));
- done = 0;
- epoch = 0;
- n = 1;
- mse = zeros(20,1); % grows automatically if training runs longer
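- % main loop: one vectorized, full-batch backprop pass per epoch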
- while done == 0
- out1 = tanh([in bias] * W1); % forward pass: hidden layer
- out2 = tanh([out1 bias] * W2); % forward pass: output layer
- %
- % output layer
- %
- delta2 = (target - out2) .* (1 - out2) .* (1 + out2); % error .* tanh derivative
- error = target - out2;
- dW2 = eta * [out1 bias]' * delta2 + alpha * dW2old;
- dW2old = dW2;
- W2 = W2 + dW2;
- %
- % hidden layer
- %
- delta1 = delta2 * W2(1:c,:)' .* (1 - out1) .* (1 + out1); % bias row of W2 excluded
- dW1 = eta * [in bias]' * delta1 + alpha * dW1old;
- dW1old = dW1;
- W1 = W1 + dW1;
-
- if max(max(abs(error))) < tolerance
- done = 1;
- end
-
- epoch = epoch + 1;
- if rem(epoch,epoch_disp) == 0
- mse(n) = mean(sum(error .* error)); % SSE over vectors, averaged over outputs
- fprintf('epoch %.0f: mse = %.5f rmse = %.3f\n',epoch,mse(n),sqrt(mse(n)))
- n = n + 1;
- end
- end
- eval(['save ',weightfile,' W1 W2'])
- fprintf('The weights converged! How lucky can you get?\n')
- epoch % echo the final epoch count
- %
- % the end
- %
-