Article 9111 of comp.ai.neural-nets:
Newsgroups: comp.ai.neural-nets
Path: serval!netnews.nwnet.net!usenet.coe.montana.edu!decwrl!concert!sas!mozart.unx.sas.com!saswss
From: saswss@unx.sas.com (Warren Sarle)
Subject: Re: Practical NN Recipes in C++ (Book)
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <C70zwo.3Kt@unx.sas.com>
Date: Fri, 14 May 1993 16:46:00 GMT
References: <1svn4g$733@risc1.rz.fh-heilbronn.de> <1svmprINNlbh@bHARs12c.bnr.co.uk>
Nntp-Posting-Host: hotellng.unx.sas.com
Organization: SAS Institute Inc.
Lines: 24
In article <1svmprINNlbh@bHARs12c.bnr.co.uk>, grs@bharh47.bnr.co.uk (G.Stoneley) writes:
|> I have been given a title of 'Practical Neural Nets in C++' by
|> Tim Masters but am not sure how much this relates to Back-prop.
|> If anyone has any info on this book or others I would be very
|> interested.
_Practical Neural Network Recipes in C++_, Academic Press, 1993.
It has code for ordinary backprop, conjugate gradients, simulated
annealing, and genetic algorithms. I don't know about the code, but
the book is excellent. It explains a variety of interesting things,
such as why overtraining is a myth (over_fitting_, of course, can
be a serious problem) and has lots of practical advice. I've skimmed
most of the book, and my only major complaints are that Masters is
excessively fond of singular value decompositions for linear least
squares and that he doesn't go into Newton methods for optimization.
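For readers unfamiliar with the technique mentioned above: a minimal
sketch of SVD-based linear least squares (shown here in Python/NumPy
for brevity rather than the book's C++; the data and variable names
are illustrative, not from the book). The SVD gives the pseudoinverse
directly, and small singular values can be truncated, which is why it
is often preferred over the normal equations on ill-conditioned
problems.

```python
import numpy as np

# Solve min ||A x - b||_2 via the pseudoinverse built from the SVD.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))        # overdetermined system, 20 eqns, 3 unknowns
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                           # noiseless right-hand side

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / s)               # x = V diag(1/s) U^T b  (pseudoinverse)

print(np.allclose(x, x_true))            # noiseless full-rank system: recovers x_true
```

In practice one would also drop terms where a singular value falls
below a tolerance, trading a little bias for numerical stability.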
--
Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513       those of SAS Institute.