Newsgroups: comp.ai.neural-nets
Path: sparky!uunet!boulder!boulder!eesnyder
From: eesnyder@boulder.Colorado.EDU (Eric E. Snyder)
Subject: Weight update function: input -> hidden layer
Message-ID: <eesnyder.726867596@beagle>
Sender: news@colorado.edu (The Daily Planet)
Nntp-Posting-Host: beagle.colorado.edu
Organization: University of Colorado, Boulder
Date: 12 Jan 93 19:39:56 GMT
Lines: 36

Ok, I have been pulling my hair out for a couple of days now...

I am trying to write a really simple feedforward backprop net and
it just isn't working properly.  Everything seems pretty straightforward
except for the function that calculates the changes for the weights
connecting the inputs to the (single) hidden layer.

Starting from the formulas in Hertz, Krogh and Palmer, 6.9 and 6.10,
I get the following update rule:

DeltaWeights[i][j] += Epsilon * Units[j] * (1 - Units[j]) * Weights[i][j] *
                      Output * (1.0 - Output) * (Target[vector_number] - Output) *
                      TrainingInputs[vector_number][i];

Where Units[j]        = the activation of hidden unit j
      Weights[i][j]   = weight connecting input i to hidden unit j
      Output          = value of the (single) output unit
      Target[vector_number] = target output for the current training vector
      TrainingInputs[vector_number][i] = input i of the current training vector

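In case it helps, here is a minimal sketch of the whole update as I read
equations 6.9 and 6.10, assuming sigmoid units everywhere, a single output
unit, and one training vector at a time (so Inputs and Target are already
indexed by vector_number).  HiddenToOutput[j] (the weight from hidden unit j
to the output unit), N_INPUTS, and N_HIDDEN are just names I made up for this
sketch; they don't appear in my real code.

#include <math.h>

#define N_INPUTS  2
#define N_HIDDEN  3

/* Sigmoid activation used for both layers in this sketch. */
static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

/* One gradient-descent step for a single training vector: the output delta
   is propagated back to the hidden layer through the hidden-to-output
   weights, then both weight layers are changed. */
void update_weights(double Inputs[N_INPUTS],
                    double Target,
                    double Weights[N_INPUTS][N_HIDDEN],  /* input  -> hidden */
                    double HiddenToOutput[N_HIDDEN],     /* hidden -> output */
                    double Epsilon)
{
    double Units[N_HIDDEN];          /* hidden activations          */
    double DeltaHidden[N_HIDDEN];    /* per-hidden-unit error terms */
    double net, Output, DeltaOut;
    int i, j;

    /* Forward pass: hidden layer, then the single output unit. */
    for (j = 0; j < N_HIDDEN; j++) {
        net = 0.0;
        for (i = 0; i < N_INPUTS; i++)
            net += Weights[i][j] * Inputs[i];
        Units[j] = sigmoid(net);
    }
    net = 0.0;
    for (j = 0; j < N_HIDDEN; j++)
        net += HiddenToOutput[j] * Units[j];
    Output = sigmoid(net);

    /* Output delta: f'(net) * error, with f'(net) = Output * (1 - Output). */
    DeltaOut = Output * (1.0 - Output) * (Target - Output);

    /* Hidden deltas: propagate DeltaOut back through the hidden->output
       weights before any weights are changed. */
    for (j = 0; j < N_HIDDEN; j++)
        DeltaHidden[j] = Units[j] * (1.0 - Units[j]) * HiddenToOutput[j] * DeltaOut;

    /* Weight changes for both layers. */
    for (j = 0; j < N_HIDDEN; j++) {
        HiddenToOutput[j] += Epsilon * DeltaOut * Units[j];
        for (i = 0; i < N_INPUTS; i++)
            Weights[i][j] += Epsilon * DeltaHidden[j] * Inputs[i];
    }
}

The only design choice worth noting is that DeltaHidden is computed from the
hidden-to-output weights as they were before this step's changes.
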
Am I screwing this up?  Any comments or suggestions would be greatly
appreciated.

Many, many thanks...

---------------------------------------------------------------------------
TTGATTGCTAAACACTGGGCGGCGAATCAGGGTTGGGATCTGAACAAAGACGGTCAGATTCAGTTCGTACTGCTG
Eric E. Snyder
Department of MCD Biology                 ...making feet for childrens' shoes.
University of Colorado, Boulder
Boulder, Colorado 80309-0347
LeuIleAlaLysHisTrpAlaAlaAsnGlnGlyTrpAspLeuAsnLysAspGlyGlnIleGlnPheValLeuLeu
---------------------------------------------------------------------------