Path: sparky!uunet!zephyr.ens.tek.com!uw-beaver!micro-heart-of-gold.mit.edu!news.bbn.com!usc!wupost!ukma!seismo!lll-winken!tazdevil!henrik
From: henrik@mpci.llnl.gov (Henrik Klagges)
Newsgroups: comp.ai.neural-nets
Subject: Re: neural nets and generalization (was Why not trees?)
Message-ID: <?.711993807@tazdevil>
Date: 24 Jul 92 16:03:27 GMT
References: <arms.711643374@spedden> <4458@rosie.NeXT.COM>
Sender: usenet@lll-winken.LLNL.GOV
Lines: 16
Nntp-Posting-Host: tazdevil.llnl.gov

paulking@next.com (Paul King) writes:
>of events transform an input pattern into an output pattern. The
>"goal" of the neural net is not only to memorize the input-to-output
                                         ^^^^^^^^
>mappings,

If the black box 'memorizes' the patterns (literally), you are lost,
as lookup tables are pretty useless. I found that high information
compression rates (rule of thumb: # of float input values / # of float
weights) lead to good generalization.
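
As a minimal sketch of that ratio (assuming a plain fully-connected
net; the layer sizes and training-set size below are invented for
illustration, they are not from any particular experiment):

    def num_weights(layer_sizes):
        # Float weights (one bias per unit included) in a
        # fully-connected net with the given layer widths.
        return sum((fan_in + 1) * fan_out
                   for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]))

    def compression_rate(n_examples, floats_per_example, layer_sizes):
        # Rule of thumb: # of float input values / # of float weights.
        return n_examples * floats_per_example / num_weights(layer_sizes)

    # 10000 training examples of 16 floats each into a 16-8-4 net
    # (172 weights): a rate of ~930, so the net cannot store the
    # patterns verbatim and is forced to compress.
    print(compression_rate(10000, 16, [16, 8, 4]))

A rate around 1 or below means the net has enough free parameters
to act as the lookup table above.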

--

Cheers, Henrik
MPCI at LLNL
IBM Research