Path: sparky!uunet!engcon!fenner
From: fenner@engcon.marshall.ltv.com (JWFENNER)
Newsgroups: comp.ai.neural-nets
Subject: Re: Dynamically adding nodes, layers ?
Message-ID: <991@engcon.marshall.ltv.com>
Date: 12 Aug 92 19:29:07 GMT
References: <1992Jul31.132715.2858@cs.hw.ac.uk>
Reply-To: fenner@engcon.UUCP (JWFENNER)
Organization: LTV MEG, Dallas, TX
Lines: 33

In article <1992Jul31.132715.2858@cs.hw.ac.uk>, the original poster writes:
>I hope someone can help me here, I seem to recall reading somewhere
>about a learning algorithm which first checks to see if the new pattern
>can be learnt without overwriting old info, and if not, then adding new
>nodes as required. I *DEFINITELY* read about this somewhere, unfortunately
>I dismissed this as being of little import to what I am trying to do.
>However, the world being as it is, this irrelevance has become relevant.
>So, if anyone can help me I would be most appreciative.

There are two networks that come to mind. The first is known as the
Hyperspherical Attractor Network (HAN), developed by Robert Kennedy of
the Naval Coastal Systems Center Research and Technology Department. This
network adds and deletes nodes as necessary during learning, and it is a
good pattern-classifier network. Reference:

The Hyperspherical Attractor Network: Concepts, Theory, and Application
Robert A. Kennedy
December 1990
Naval Coastal Systems Center Research and Technology Department
Panama City, FL

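I haven't seen Kennedy's algorithm itself, so the following is NOT the HAN --
just a toy Python sketch of the general grow-a-node-when-needed idea, closer
in spirit to RCE-style hypersphere classifiers: a pattern covered by no
correct-class sphere gets a new node, and any wrong-class sphere covering it
is shrunk out of the way. All names and parameters here are my own invention:

```python
import math

class SphereNet:
    """Toy node-growing classifier (NOT Kennedy's HAN, just the idea):
    each node is a hypersphere given by (center, radius, label)."""

    def __init__(self, init_radius=0.6, min_radius=1e-3):
        self.nodes = []                  # list of (center, radius, label)
        self.init_radius = init_radius
        self.min_radius = min_radius

    @staticmethod
    def _dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def predict(self, x):
        # Among nodes covering x, the one x sits deepest inside wins.
        hits = [(r - self._dist(x, c), lbl) for c, r, lbl in self.nodes
                if self._dist(x, c) <= r]
        return max(hits)[1] if hits else None

    def learn(self, x, label):
        # Shrink wrong-class spheres that cover x so they no longer do ...
        for i, (c, r, lbl) in enumerate(self.nodes):
            d = self._dist(x, c)
            if lbl != label and d <= r:
                self.nodes[i] = (c, max(0.9 * d, self.min_radius), lbl)
        # ... then add a node only if x is still not classified correctly.
        if self.predict(x) != label:
            self.nodes.append((tuple(x), self.init_radius, label))

# XOR-like data: one pass grows four nodes, one per pattern.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
net = SphereNet(init_radius=0.6)
for x, y in data:
    net.learn(x, y)
```

Note that the network only grows when a pattern cannot already be handled --
which is exactly the "check first, add nodes only if needed" behavior the
original poster described.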
The second one I know of is called GMDH (Group Method of Data Handling). I
don't know much about it, other than the fact that it can add layers during
training. The book I saw it in was NEUROCOMPUTING by Robert Hecht-Nielsen.
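Since I don't have the GMDH details handy either, here is only a rough sketch
of the layer-growing idea: fit many candidate units on pairs of the previous
layer's outputs, keep the few that score best on held-out data, and stop
growing when a new layer no longer helps. (Textbook GMDH uses quadratic
polynomial units and an external selection criterion; this toy substitutes a
bilinear unit and a validation split, and every name below is mine.)

```python
import random

def solve(A, b):
    # Gaussian elimination with partial pivoting: solve A w = b.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [v - f * u for v, u in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_unit(u, v, y):
    # Least-squares fit of z = w0 + w1*u + w2*v + w3*u*v to target y.
    phi = [[1.0, a, b, a * b] for a, b in zip(u, v)]
    A = [[sum(p[r] * p[c] for p in phi) for c in range(4)] for r in range(4)]
    rhs = [sum(p[r] * t for p, t in zip(phi, y)) for r in range(4)]
    return solve(A, rhs)

def eval_unit(w, a, b):
    return w[0] + w[1] * a + w[2] * b + w[3] * a * b

def mse(z, y):
    return sum((a - b) ** 2 for a, b in zip(z, y)) / len(y)

def gmdh(tr, y_tr, va, y_va, width=4, max_layers=5):
    # Grow layers of pairwise units; keep the `width` best units per
    # layer (by validation error); stop when a new layer stops helping.
    best_err = float("inf")
    best_out = None
    for _ in range(max_layers):
        cand = []
        for i in range(len(tr)):
            for j in range(i + 1, len(tr)):
                w = fit_unit(tr[i], tr[j], y_tr)
                z_tr = [eval_unit(w, a, b) for a, b in zip(tr[i], tr[j])]
                z_va = [eval_unit(w, a, b) for a, b in zip(va[i], va[j])]
                cand.append((mse(z_va, y_va), z_tr, z_va))
        cand.sort(key=lambda c: c[0])
        cand = cand[:width]
        if cand[0][0] >= best_err:
            break                        # adding a layer no longer helps
        best_err, best_out = cand[0][0], cand[0][2]
        tr = [c[1] for c in cand]        # selected outputs feed next layer
        va = [c[2] for c in cand]
    return best_err, best_out

random.seed(0)
X = [[random.uniform(-1, 1) for _ in range(60)] for _ in range(3)]
y = [X[0][k] * X[1][k] for k in range(60)]      # target: x0 * x1
tr_cols = [c[:40] for c in X]
va_cols = [c[40:] for c in X]
err, _ = gmdh(tr_cols, y[:40], va_cols, y[40:])
```

On this target the pair (x0, x1) is fit exactly in the first layer, so the
stopping rule kicks in and no useless layers get added.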

Hope this is helpful.

--
==================================================================
= Jon Fenner (engcon!fenner@uunet.uu.net)                        =
= LTV Aerospace and Defense - Neural Network / AI Research Group =
==================================================================