Comments: Gated by NETNEWS@AUVM.AMERICAN.EDU
Path: sparky!uunet!paladin.american.edu!darwin.sura.net!spool.mu.edu!uwm.edu!psuvax1!psuvm!auvm!CCB.BBN.COM!BNEVIN
Message-ID: <CSG-L%92111806370894@VMD.CSO.UIUC.EDU>
Newsgroups: bit.listserv.csg-l
Date: Wed, 18 Nov 1992 07:24:28 EST
Sender: "Control Systems Group Network (CSGnet)" <CSG-L@UIUCVMD.BITNET>
From: "Bruce E. Nevin" <bnevin@CCB.BBN.COM>
Subject: analog NN chip
Lines: 55

----BEGINNING OF FORWARDED MESSAGES----
Received: from LABS-N.BBN.COM by CCB.BBN.COM ; 17 Nov 92 14:58:09 EST
Received: from KARIBA.BBN.COM by LABS-N.BBN.COM id aa18082; 17 Nov 92 14:58 EST
Received: by KARIBA.BBN.COM id aa17855; 17 Nov 92 14:52 EST
To: dept47@BBN.COM, neural-people@BBN.COM, machine-learning@BBN.COM
Subject: re: CDSP Seminar
Date: Tue, 17 Nov 92 14:50:54 -0500
Message-ID: <2133.722029854@bbn.com>
From: Jeff Morrill <jmorrill@BBN.COM>

Communications and Digital Signal Processing (CDSP)
Center for Research and Graduate Studies
CDSP Seminar
Title: The Integrated Neurocomputing Architecture (INCA)
Speaker: Mark Dzwonczyk, The C. S. Draper Laboratory

Here are a few notes for those of you who wonder what you missed.
There were two issues of interest here, one regarding analog hardware
for neural nets and one regarding sonar signal processing.

Apparently analog hardware implementations of neural nets are in
disrepute because Intel tried and failed to make them work.  JPL has
tried again, however, this time successfully.  This talk was about a
proof-of-concept project to apply JPL's chip to several practical
problems.  The value of an analog implementation is that it takes
only 11 transistors per neuron, whereas a digital implementation
takes more like 6,000.  Thus an analog implementation is the better
choice when miniaturization and low power consumption are important.

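To put those transistor counts in perspective, here is some quick
back-of-the-envelope arithmetic in Python for the 256-node board
described below.  Only the per-neuron figures come from the talk;
the rest is just multiplication:

    # Per-neuron transistor counts quoted in the talk.
    ANALOG_PER_NEURON = 11
    DIGITAL_PER_NEURON = 6000
    NEURONS = 256  # the 4-layer, 64-nodes-per-layer board

    analog = NEURONS * ANALOG_PER_NEURON    # 2,816 transistors
    digital = NEURONS * DIGITAL_PER_NEURON  # 1,536,000 transistors
    print(f"analog: {analog:,}, digital: {digital:,}, "
          f"ratio: {digital / analog:.0f}x")
    # -> analog: 2,816, digital: 1,536,000, ratio: 545x
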
What Draper did was put several of JPL's chips onto a VME board in a
fixed configuration: a 4-layer feedforward neural net with 64 nodes
per layer, or 256 nodes altogether.  Training was done offline on a
Sun workstation, and the final connection weights were "ftp'd" to
the hardware.  To train, they simply applied backprop using
commercial off-the-shelf software (NeuralWare).  (They are now
working on hardware-in-the-loop training.)  When asked how long
training took, he said simply, "three months of a graduate student's
time."

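For the curious, here is a rough sketch in plain NumPy of the
topology just described: a 4-layer feedforward net (64 nodes per
layer, so three weight matrices) trained by backprop and then saved
for download to the board.  The actual work used NeuralWare's
commercial tools, so everything below is illustrative, not their
code:

    import numpy as np

    rng = np.random.default_rng(0)

    # Fixed topology: 64 inputs -> two 64-node hidden layers -> 64 outputs.
    sizes = [64, 64, 64, 64]
    W = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
    b = [np.zeros(n) for n in sizes[1:]]

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(x):
        """Return the activations of every layer for input vector x."""
        acts = [x]
        for Wi, bi in zip(W, b):
            acts.append(sigmoid(acts[-1] @ Wi + bi))
        return acts

    def backprop_step(x, target, lr=0.1):
        """One gradient-descent step on squared error; updates in place."""
        acts = forward(x)
        delta = (acts[-1] - target) * acts[-1] * (1 - acts[-1])
        for i in reversed(range(len(W))):
            grad_W = np.outer(acts[i], delta)
            grad_b = delta
            if i > 0:
                # Propagate the error back through the pre-update weights.
                delta = (delta @ W[i].T) * acts[i] * (1 - acts[i])
            W[i] -= lr * grad_W
            b[i] -= lr * grad_b

    # After offline training the weights would be saved and downloaded
    # ("ftp'd") to the board, e.g. np.savez("weights.npz", *W, *b).
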
There were several applications they used to "prove the concept,"
and the one he talked about most was real-time sidescan sonar target
detection.  The data look like a distance-vs.-time map, with pixels
colored according to the amplitude of the echo.  Targets are white
(loud) blips lost in a lot of noise, and they can be hard for a
person to see.  The algorithm was to slide a 10x10 window around the
image and look for underwater mines.  Using 2 hidden layers, they
achieved a false alarm rate of 0.03%.  It is impressive that this
straightforward approach to the problem yielded such a high degree
of accuracy.  (Nevertheless, the Navy wants to cut this false alarm
rate by another factor of 10 before fielding the system, because the
penalty for being wrong about a mine is rather high.)
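
The detection loop itself is simple to sketch.  Below is an
illustrative Python version, assuming a classifier like the net
above; the window stride and decision threshold are my guesses,
since the talk didn't give them:

    import numpy as np

    def detect(image, classify, window=10, step=1, threshold=0.9):
        """Slide a window x window patch over the image; return hits."""
        hits = []
        rows, cols = image.shape
        for r in range(0, rows - window + 1, step):
            for c in range(0, cols - window + 1, step):
                patch = image[r:r + window, c:c + window]
                score = classify(patch.ravel())  # mine likelihood, 0..1
                if score > threshold:
                    hits.append((r, c, score))
        return hits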

jeff morrill

----END OF FORWARDED MESSAGES----