Path: sparky!uunet!van-bc!mdavcr!garry
From: garry@mdavcr.mda.ca (Gary Holmen)
Newsgroups: comp.ai.neural-nets
Subject: Re: Neural Nets and Brains
Message-ID: <2905@mdavcr.mda.ca>
Date: 27 Jul 92 23:37:12 GMT
References: <BILL.92Jul23135614@ca3.nsma.arizona.edu> <arms.711935064@spedden> <50994@seismo.CSS.GOV>
Organization: MacDonald Dettwiler, 13800 Commerce Parkway, Richmond, BC, Canada V6V 2J3
Lines: 55

In article <50994@seismo.CSS.GOV> black@seismo.CSS.GOV (Mike Black) writes:
>In article <arms.711935064@spedden> arms@cs.UAlberta.CA (Bill Armstrong) writes:
>>
>>The theory of multilayer perceptrons as used in BP is horrifying. You
>>need Kolmogorov's theorem just to show you can do everything you want
>>to approximate continuous functions, but nobody can apply it in
>>practice. In contrast, every logic designer who has heard of CNF and
>>DNF finds it *obvious* that an adaptive logic net can synthesize any
>>boolean function. So BP nets lose in a BIG way on the theory side, sorry.
>>
>I couldn't care less about theory...only results...

This seems to be an 'end justifies the means' attitude. How can you assure
your customers that you are producing *SAFE* software without looking at
theory? One of the big concerns I hear consistently from customers is that
neural networks just seem to be 'black boxes'. They want to minimize the
risks they take on, and a response like "But it worked last time..." just
doesn't do it. I'm not downgrading the importance of research into new
fields, but maybe we should also be concerned with explaining what we are
doing right now. While results are great, I would feel much more secure
knowing that there is theory backing up what I'm doing.
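
For what it's worth, the DNF argument Armstrong alludes to is easy to make
concrete. Below is a minimal sketch in C (the majority function is a
stand-in example of my own, nothing from the atree package): given any
boolean function as a truth table, OR-ing together one AND-term (a
minterm) per input row where the function is 1 reproduces the function
exactly. That is the existence guarantee adaptive logic nets lean on.

#include <stdio.h>

#define NVARS 3

/* Stand-in target function: 3-input majority vote. */
static int f(int x[NVARS])
{
    return (x[0] + x[1] + x[2]) >= 2;
}

/* Evaluate the DNF built from f's truth table on input x: OR, over all
 * rows r with f(r) = 1, of the minterm that matches row r exactly. */
static int dnf_eval(int x[NVARS])
{
    int r, i;
    for (r = 0; r < (1 << NVARS); r++) {
        int row[NVARS], match = 1;
        for (i = 0; i < NVARS; i++)
            row[i] = (r >> i) & 1;
        if (!f(row))
            continue;           /* only rows where f is 1 get a minterm */
        for (i = 0; i < NVARS; i++)
            if (x[i] != row[i])
                match = 0;      /* a minterm is an AND of exact literals */
        if (match)
            return 1;           /* some minterm fired, so the OR is 1 */
    }
    return 0;
}

int main(void)
{
    int r, i;
    /* Verify the synthesized DNF agrees with f on every input. */
    for (r = 0; r < (1 << NVARS); r++) {
        int x[NVARS];
        for (i = 0; i < NVARS; i++)
            x[i] = (r >> i) & 1;
        printf("x=%d%d%d  f=%d  dnf=%d\n", x[2], x[1], x[0], f(x),
               dnf_eval(x));
    }
    return 0;
}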

>I took the atree software and (for time's sake) reduced the multiplication
>problem to the 1 and 2 times tables. I removed 1*6 from the table and
>let atree crank. When I tested 1*6 it gave me an answer of ~35. I do
>NOT call this superior as the backprop net I trained gave me an answer
>that was at least BETWEEN 1*5 and 1*7. The other problem I ran into
>was running out of memory (16 meg + 64 meg swap space) on a problem that
>I had previously solved with backprop.

After reading this message I decided to go home and run the ALN software
on my IBM PC. It is only a 25 MHz 386 with 2 Meg of memory, and the
package seemed to work fine. I trained 3 trees on the times tables, and
it finished in about 45 minutes. The answer I received was 7 (i.e., 1*7),
which is about the same accuracy I've seen from the BP algorithms I've used.
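
For anyone who wants to try the same experiment, the rough shape of it
(independent of atree's actual training-file format, which I'm not
reproducing here; the names and bit widths below are my own, and I've
used plain binary coding for brevity where the real package involves a
random_walk step, on which more below) is to turn each multiplication
fact into a row of input bits and output bits, holding out 1*6:

#include <stdio.h>

/* Print v as nbits space-separated bits, most significant first. */
static void put_bits(int v, int nbits)
{
    int i;
    for (i = nbits - 1; i >= 0; i--)
        printf("%d ", (v >> i) & 1);
}

int main(void)
{
    int a, b;
    for (a = 1; a <= 2; a++) {          /* 1 and 2 times tables only */
        for (b = 1; b <= 9; b++) {
            if (a == 1 && b == 6)
                continue;               /* hold out 1*6 for testing  */
            put_bits(a, 2);             /* operand a: 2 bits, 1..2   */
            put_bits(b, 4);             /* operand b: 4 bits, 1..9   */
            put_bits(a * b, 5);         /* product:   5 bits, 1..18  */
            putchar('\n');
        }
    }
    return 0;
}

Each output bit then becomes a separate boolean function for a tree to
learn, which is why one ends up training several trees.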

My guess is that the problem with your answer had to do with the
random_walk portion of the algorithm rather than the ALNs themselves.
I believe that this has been noted for modification in future releases.
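
For those who haven't looked inside the package: as I understand it (this
is my own reconstruction, not atree's source), a random-walk code assigns
a bit vector to each quantization level by starting from a random code and
flipping a few randomly chosen bits per step, so that Hamming distance
between codes roughly tracks numeric distance. "Roughly" is the catch: the
walk can double back on itself, so decoding by nearest code can land far
from the true value, which is exactly the kind of failure that would
produce an answer like ~35.

#include <stdio.h>
#include <stdlib.h>

#define LEVELS 19   /* e.g. products 0..18 */
#define WIDTH  32   /* bits per code       */
#define STEP   2    /* bits flipped between neighbouring levels */

static int code[LEVELS][WIDTH];

/* Level 0 gets a random code; each later level copies its
 * predecessor and flips STEP randomly chosen bits. */
static void build_codes(void)
{
    int k, b, s;
    for (b = 0; b < WIDTH; b++)
        code[0][b] = rand() & 1;
    for (k = 1; k < LEVELS; k++) {
        for (b = 0; b < WIDTH; b++)
            code[k][b] = code[k - 1][b];
        for (s = 0; s < STEP; s++)
            code[k][rand() % WIDTH] ^= 1;
    }
}

static int hamming(int k1, int k2)
{
    int b, d = 0;
    for (b = 0; b < WIDTH; b++)
        d += code[k1][b] != code[k2][b];
    return d;
}

int main(void)
{
    int k;
    srand(1);
    build_codes();
    /* Ideally this distance is monotone in |k - 6|; repeated flips
     * of the same bit are what break the monotonicity. */
    for (k = 0; k < LEVELS; k++)
        printf("level %2d: Hamming distance to level 6 = %d\n",
               k, hamming(6, k));
    return 0;
}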

In conclusion I would like to note that even though ALNs do have some
disadvantages (not having much of a user base being one of them), they do
show promise in several areas, and I believe we shouldn't 'throw the
baby out with the bath water' just because they don't have the recognition
that BP nets do. Look at them objectively and learn their advantages and
disadvantages so that we can use neural networks to their full potential.

----------------------------------------------------------------------------
- Garry Holmen      |  To Boldly Go Where No Computer Has Gone             -
- Software Engineer |  Before                                              -
- MDA Richmond, BC  |                                                      -
----------------------------------------------------------------------------
- email: garry@mda.ca                                                      -
----------------------------------------------------------------------------