From ml-connectionists-request@q.cs.cmu.edu Tue May 4 16:05:50 1993
Date: Mon, 3 May 93 12:18:07 PDT
From: Jenq-Neng Hwang <hwang@pierce.ee.washington.edu>
Message-Id: <9305031918.AA22673@pierce.ee.washington.edu.>
To: Connectionists@cs.cmu.edu
Subject: Markov random field modeling via neural networks
Technical Report available from neuroprose:
TEXTURED IMAGE SYNTHESIS AND SEGMENTATION VIA NEURAL NETWORK
PROBABILISTIC MODELING
Jenq-Neng Hwang, Eric Tsung-Yen Chen
Information Processing Laboratory
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195
ABSTRACT
It has been shown that a trained back-propagation neural network
(BPNN) classifier with the Kullback-Leibler criterion produces
outputs which can be interpreted as estimates of Bayesian
"a posteriori" probabilities. Based on this interpretation, we
propose a BPNN approach for estimating the local conditional
distributions of textured images, which are commonly
represented by a Markov random field (MRF) formulation.
The proposed BPNN approach overcomes many of the difficulties
encountered in using the MRF formulation. In particular, our
approach does not require the trial-and-error selection of
clique functions or the subsequent laborious and unreliable
estimation of clique parameters. Simulations show that images
synthesized using BPNN modeling reproduce the desired
artificial and real textures more consistently than MRF-based
methods. An application of the proposed BPNN approach to the
segmentation of artificial and real-world textures is also presented.
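The premise in the first sentence, that training with the Kullback-Leibler
(cross-entropy) criterion yields posterior-probability estimates, can be
illustrated with a minimal sketch (hypothetical code, not from the report):
a single logistic unit trained on a fixed input whose label is 1 about 70%
of the time converges to that empirical frequency.

```python
import numpy as np

rng = np.random.default_rng(0)
labels = (rng.random(2000) < 0.7).astype(float)  # label 1 occurs ~70% of the time

logit = 0.0                             # single logistic output unit
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-logit))    # network output for the fixed input
    logit -= 2.0 * (p - labels.mean())  # batch gradient of the KL criterion

p_hat = 1.0 / (1.0 + np.exp(-logit))
print("estimated posterior:", p_hat)    # close to the true posterior, 0.7
```

The minimizer of the cross-entropy loss at a fixed input is exactly the
empirical class frequency, which is why the output can be read as an
estimate of P(class | input).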
================
To obtain copies of the postscript file, please use Jordan Pollack's service
(no hardcopies will be provided):
Example:
unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52)
Name (archive.cis.ohio-state.edu): anonymous
Password (archive.cis.ohio-state.edu:anonymous): <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get hwang.nnmrf.ps.Z
ftp> quit
unix> uncompress hwang.nnmrf.ps
Now print "hwang.nnmrf.ps" as you would any other (postscript) file.
From ml-connectionists-request@q.cs.cmu.edu Tue May 4 16:06:11 1993
Date: Mon, 3 May 93 12:17:00 PDT
From: Jenq-Neng Hwang <hwang@pierce.ee.washington.edu>
Message-Id: <9305031917.AA22668@pierce.ee.washington.edu.>
To: Connectionists@cs.cmu.edu
Subject: back-propagation and projection pursuit learning
Technical Report available from neuroprose:
REGRESSION MODELING IN BACK-PROPAGATION AND PROJECTION PURSUIT LEARNING
Jenq-Neng Hwang, Shyh-Rong Lay
Information Processing Laboratory
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195
and
Martin Maechler, Doug Martin, Jim Schimert
Department of Statistics, GN-22
University of Washington, Seattle, WA 98195
ABSTRACT
This paper studies and compares two types of connectionist learning
methods for model-free regression problems. One is the popular
"back-propagation" learning (BPL), well known in the artificial
neural network literature; the other is "projection pursuit"
learning (PPL), which has emerged in recent years in the statistical
estimation literature. Both BPL and PPL are based on
projections of the data in directions determined from interconnection
weights. However, unlike the fixed nonlinear activations
(usually sigmoidal) used for the hidden neurons in BPL, PPL
systematically approximates the unknown nonlinear activations.
Moreover, BPL estimates all the weights simultaneously at each
iteration, while PPL estimates the weights cyclically
(neuron-by-neuron and layer-by-layer) at each iteration. Although
BPL and PPL have comparable training speed when based on a
Gauss-Newton optimization algorithm, PPL proves more parsimonious
in that it requires fewer hidden neurons to approximate the
true function. To further improve the statistical performance of
PPL, an orthogonal polynomial approximation is used in place of the
supersmoother method originally proposed for nonlinear activation
approximation in PPL.
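The cyclic, unit-by-unit estimation that distinguishes PPL from BPL can be
sketched as follows (a hypothetical toy, not the authors' implementation):
hidden units are added one at a time to fit the current residual, and each
unit's activation is estimated from the data with a polynomial fit, echoing
the polynomial approximation mentioned above. The random direction search
and the target function are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = np.sin(X @ np.array([1.0, 0.5])) + 0.3 * (X @ np.array([-0.5, 1.0])) ** 2

residual = y.copy()
model = []                                   # list of (direction, poly coeffs)
for _ in range(2):                           # add hidden units one at a time
    best = None
    for _ in range(50):                      # crude random search over directions
        w = rng.normal(size=2)
        w /= np.linalg.norm(w)
        z = X @ w                            # projection of the data
        coef = np.polyfit(z, residual, 5)    # data-driven activation estimate
        err = np.mean((residual - np.polyval(coef, z)) ** 2)
        if best is None or err < best[0]:
            best = (err, w, coef)
    _, w, coef = best
    model.append((w, coef))
    residual = residual - np.polyval(coef, X @ w)

print("residual variance:", residual.var(), "vs y variance:", y.var())
```

Each added unit fits only the residual left by the previous units, which is
the stagewise character of projection pursuit regression.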
================
To obtain copies of the postscript file, please use Jordan Pollack's service
(no hardcopies will be provided):
Example:
unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52)
Name (archive.cis.ohio-state.edu): anonymous
Password (archive.cis.ohio-state.edu:anonymous): <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get hwang.bplppl.ps.Z
ftp> quit
unix> uncompress hwang.bplppl.ps
Now print "hwang.bplppl.ps" as you would any other (postscript) file.
From ml-connectionists-request@q.cs.cmu.edu Tue May 4 17:45:46 1993
Date: Tue, 4 May 93 10:06:05 PDT
From: Jenq-Neng Hwang <hwang@pierce.ee.washington.edu>
Message-Id: <9305041706.AA24985@pierce.ee.washington.edu.>
To: Connectionists@cs.cmu.edu
Subject: apology
We apologize: we were unaware that the postscript files we recently
placed in Neuroprose are incompatible with some printers. We will fix
these problems and reload these three reports as soon as possible.
These three files are:
hwang.bplppl.ps.Z (back-propagation and projection pursuit learning)
hwang.nnmrf.ps.Z (probabilistic textured image modeling by neural networks)
hwang.srnn.ps.Z (mental image transformation via surface reconstruction nn)
Jenq-Neng Hwang, Assistant Professor
Information Processing Laboratory
Dept. of Electrical Engr., FT-10
University of Washington
Seattle, WA 98195
(206) 685-1603 (O), (206) 543-3842 (FAX)
hwang@ee.washington.edu
From ml-connectionists-request@q.cs.cmu.edu Tue May 4 19:52:36 1993
Date: Tue, 4 May 93 09:15:53 PDT
From: Jenq-Neng Hwang <hwang@pierce.ee.washington.edu>
Message-Id: <9305041615.AA24719@pierce.ee.washington.edu.>
To: Connectionists@cs.cmu.edu
Subject: mental image transformation and surface reconstruction NN
Technical Report available from neuroprose:
MENTAL IMAGE TRANSFORMATION AND MATCHING USING
SURFACE RECONSTRUCTION NEURAL NETWORKS
Jenq-Neng Hwang, Yen-Hao Tseng
Information Processing Laboratory
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195
ABSTRACT
Invariant 2-D/3-D object recognition and motion estimation
under detection/occlusion noise and/or partial object viewing
are difficult pattern recognition tasks. On the other hand, the
biological neural networks of humans are extremely adept at these
tasks. Studies in experimental psychology have suggested that
humans match rotated and scaled shapes by mentally rotating and
scaling one of the shapes gradually into the orientation and size
of the other and then testing
for a match. Motivated by these studies, we present a novel and
robust neural network solution for these tasks based on detected
surface boundary data or range data. The method operates in two
stages: The object is first parametrically represented by a surface
reconstruction neural network (SRNN) trained by the boundary points
sampled from the exemplar object. When later presented with boundary
points sampled from the distorted object, without point correspondence,
this parametric representation allows the mismatch information
to back-propagate through the SRNN and gradually determine (align) the
best similarity transform of the distorted object. The distance
measure can then be computed in the reconstructed representation
domain between the surface reconstructed exemplar object and the
aligned distorted object. Applications to invariant 2-D target
classification and 3-D object motion estimation using sparse range
data collected from a single aspect view are presented.
================
To obtain copies of the postscript file, please use Jordan Pollack's service
(no hardcopies will be provided):
Example:
unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52)
Name (archive.cis.ohio-state.edu): anonymous
Password (archive.cis.ohio-state.edu:anonymous): <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get hwang.srnn.ps.Z
ftp> quit
unix> uncompress hwang.srnn.ps
Now print "hwang.srnn.ps" as you would any other (postscript) file.
From ml-connectionists-request@q.cs.cmu.edu Wed May 5 13:39:46 1993
Date: Wed, 5 May 93 09:50:13 PDT
From: Jenq-Neng Hwang <hwang@pierce.ee.washington.edu>
Message-Id: <9305051650.AA28307@pierce.ee.washington.edu.>
To: Connectionists@cs.cmu.edu
Subject: three technical reports available
We have fixed the postscript printing
problems and reloaded the three reports in neuroprose.
These three files are now available:
hwang.bplppl.ps.Z (back-propagation and projection pursuit learning)
hwang.nnmrf.ps.Z (probabilistic textured image modeling by neural networks)
hwang.objrec.ps.Z (single spaced) or hwang.srnn.ps.Z (double spaced)
(mental image transformation via surface reconstruction neural nets)
Jenq-Neng Hwang, Assistant Professor
Information Processing Laboratory
Dept. of Electrical Engr., FT-10
University of Washington
Seattle, WA 98195
(206) 685-1603 (O), (206) 543-3842 (FAX)
hwang@ee.washington.edu
From ml-connectionists-request@q.cs.cmu.edu Fri May 7 18:35:10 1993
From: Bernd Fritzke <fritzke@icsi.berkeley.edu>
Message-Id: <9305072143.AA22277@icsib14.ICSI.Berkeley.EDU>
Subject: three new papers in neuroprose
To: Connectionists@cs.cmu.edu
Date: Fri, 7 May 93 14:43:31 PDT
Cc: Bernd Fritzke <fritzke@icsi.berkeley.edu>
Ftp-Host: archive.cis.ohio-state.edu
Ftp-Filename: /pub/neuroprose/fritzke.tr93-26.ps.Z
Ftp-Filename: /pub/neuroprose/fritzke.icann93.ps.Z
Ftp-Filename: /pub/neuroprose/fritzke.nips92.ps.Z
*** DO NOT FORWARD TO ANY OTHER LISTS ***
The following technical reports have been placed in the
neuroprose directory (ftp instructions follow the abstracts).
Hardcopies are also available for two of the TRs;
instructions are at the end of the posting.
Comments and questions are welcome.
Thanks to Jordan Pollack for maintaining the neuroprose archive.
-Bernd
International Computer Science Institute
1947 Center Street, Suite 600
Berkeley, CA 94704-1105
USA
------------------------------------------------------------
Growing Cell Structures -
A Self-organizing Network for Unsupervised
and Supervised Learning *)
Bernd Fritzke
ICSI, Berkeley
TR-93-026
(34 pages)
*) submitted for publication
We present a new self-organizing neural network model having two
variants. The first variant performs unsupervised learning and can
be used for data visualization, clustering, and vector quantization.
The main advantage over existing approaches, e.g., the Kohonen
feature map, is the ability of the model to automatically find a
suitable network structure and size. This is achieved through a
controlled growth process which also includes occasional removal of
units.
The second variant of the model is a supervised learning method
which results from the combination of the above-mentioned
self-organizing network with the radial basis function (RBF)
approach. In this model it is possible, in contrast to earlier
approaches, to perform the positioning of the RBF units and the
supervised training of the weights in parallel. Therefore, the
current classification error can be used to determine where to
insert new RBF units. This leads to small networks which generalize
very well. Results on the two-spirals benchmark and a vowel
classification problem are presented which are better than any
results previously published.
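The controlled growth process described above can be sketched roughly as
follows (hypothetical code; the actual model also maintains a cell topology
and occasionally removes units): after each adaptation pass, a new unit is
inserted between the unit that accumulated the largest error and its
nearest neighbor, so the network finds its own size.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.random((1000, 2))
units = rng.random((2, 2))                 # start with a tiny network

def mean_qerr(u):
    # mean squared distance from each input to its best-matching unit
    d = np.linalg.norm(data[:, None] - u[None], axis=2).min(axis=1)
    return (d ** 2).mean()

e_initial = mean_qerr(units)
error = np.zeros(len(units))
for _ in range(5):                         # alternate adaptation and growth
    for x in data:
        d = np.linalg.norm(units - x, axis=1)
        s = d.argmin()                     # best-matching unit
        error[s] += d[s] ** 2              # accumulate local quantization error
        units[s] += 0.05 * (x - units[s])  # move the winner toward the input
    q = error.argmax()                     # unit with the largest error
    n = np.linalg.norm(units - units[q], axis=1).argsort()[1]  # nearest other unit
    units = np.vstack([units, (units[q] + units[n]) / 2.0])    # insert between them
    error = np.zeros(len(units))

e_final = mean_qerr(units)
print(len(units), "units; mean error", e_initial, "->", e_final)
```

Insertion is driven by accumulated error rather than a fixed grid, which is
what lets the structure adapt to the data.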
------------------------------------------------------------
Vector Quantization with a Growing and
Splitting Elastic Net *)
Bernd Fritzke
ICSI, Berkeley
(6 pages)
*) to be presented at ICANN-93, Amsterdam
A new vector quantization method is proposed which generates
codebooks incrementally. New vectors are inserted in areas of the
input vector space where the quantization error is especially high,
until the desired number of codebook vectors is reached. A
one-dimensional topological neighborhood makes it possible to
interpolate new vectors from existing ones. Vectors not contributing
to error minimization are removed. After the desired number of
vectors is reached, a stochastic approximation phase fine-tunes the
codebook. The final quality of the codebooks is exceptional. A
comparison with two well-known methods for vector quantization was
performed by solving an image compression problem. The results
indicate that the new method is significantly better than both other
approaches.
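A rough sketch of the incremental codebook idea (hypothetical code, ignoring
the one-dimensional topological neighborhood and the removal of vectors):
new codebook vectors are inserted in the cell with the highest quantization
error until the desired size is reached, followed by a fine-tuning pass.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(size=(2000, 2))
codebook = data[:2].copy()                  # start with two vectors

def quant_error(cb):
    # per-sample distance to, and index of, the nearest codebook vector
    d = np.linalg.norm(data[:, None] - cb[None], axis=2)
    return d.min(axis=1), d.argmin(axis=1)

e0 = quant_error(codebook)[0].mean()
while len(codebook) < 16:                   # desired number of vectors
    err, owner = quant_error(codebook)
    # cell with the largest accumulated squared error
    worst = np.bincount(owner, weights=err ** 2,
                        minlength=len(codebook)).argmax()
    in_cell = owner == worst
    # insert a new vector at that cell's worst-quantized point
    codebook = np.vstack([codebook, data[in_cell][err[in_cell].argmax()]])

for _ in range(10):                         # fine-tuning (k-means style pass)
    err, owner = quant_error(codebook)
    for k in range(len(codebook)):
        if (owner == k).any():
            codebook[k] = data[owner == k].mean(axis=0)

e1 = quant_error(codebook)[0].mean()
print(f"mean quantization error: {e0:.3f} -> {e1:.3f}")
```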
------------------------------------------------------------
Kohonen Feature Maps and Growing Cell Structures --
a Performance Comparison *)
Bernd Fritzke
ICSI, Berkeley
(8 pages)
*) to appear in Advances in Neural Information Processing
Systems 5, C.L. Giles, S.J. Hanson, and J.D. Cowan (eds.),
Morgan Kaufmann, San Mateo, CA, 1993
A performance comparison is made of two self-organizing networks,
the Kohonen Feature Map and the recently proposed Growing Cell
Structures. For this purpose several performance criteria for
self-organizing networks are proposed and motivated. The models are
tested with three example problems of increasing difficulty. The
Kohonen Feature Map demonstrates slightly superior results only for
the simplest problem. For the other, more difficult and also more
realistic problems, the Growing Cell Structures exhibit
significantly better performance by every criterion. Additional
advantages of the new model are that all parameters are constant
over time and that the size as well as the structure of the network
are determined automatically.
************************* ftp instructions **********************
If you have the Getps script
unix> Getps fritzke.tr93-26.ps.Z
unix> Getps fritzke.icann93.ps.Z
unix> Getps fritzke.nips92.ps.Z
(Getps ftp's the named file, decompresses it, and asks whether to
print it)
otherwise, first do the following (to get Getps)
unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52)
Connected to archive.cis.ohio-state.edu.
220 archive.cis.ohio-state.edu FTP server ready.
Name: anonymous
331 Guest login ok, send ident as password.
Password:<type your email address here>
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
250 CWD command successful.
ftp> get Getps
200 PORT command successful.
150 Opening BINARY mode data connection for Getps (2190 bytes).
226 Transfer complete.
ftp> quit
221 Goodbye.
************************* hardcopies ****************************
The NIPS92 paper and the 34-page paper have appeared as ICSI
technical reports TR-93-025 and TR-93-026, respectively.
Hardcopies are available for a small charge for postage and
handling.
For details please contact Vivian Balis (balis@icsi.berkeley.edu)
at ICSI.
From <@utarlvm1.uta.edu:INNS-L@UMDD.BITNET> Tue May 11 14:57:19 1993
Message-Id: <9305112157.AA28142@cse.uta.edu>
Date: Tue, 11 May 1993 15:10:24 EDT
Reply-To: International Neural Network Society <INNS-L%UMDD.bitnet@utarlvm1.uta.edu>
Sender: International Neural Network Society <INNS-L%UMDD.bitnet@utarlvm1.uta.edu>
From: Morgan Downey <70712.3265@compuserve.com>
Subject: Re: General Information about Neural Networks
To: Multiple recipients of list INNS-L <INNS-L%UMDD.bitnet@utarlvm1.uta.edu>
-----------------
ARTIFICIAL NEURAL NETWORKS IN MEDICINE AND BIOLOGY
Center for Biomedical Informatics
State University of Campinas, Campinas - Brazil
Abstracts of published work by the Center
Status as of Aug 15, 1992
-----------------------------------------------------------
A HIGH-LEVEL LANGUAGE AND MICROCOMPUTER PROGRAM FOR
THE DESCRIPTION AND SIMULATION OF NEURAL ARCHITECTURES
Sabbatini, RME and Arruda-Botelho, AG
(Center of Biomedical Informatics, Neurosciences
Applications Group, State University of Campinas, Campinas,
SP, Brazil)
The description, representation and simulation of complex
neural network structures by means of computers is an
essential step in the investigation of model systems and
inventions in the growing field of biological information
processing and neurocomputing. The handcrafting of neural
net architectures, however, is a long, tedious, difficult
and error-prone process, which can be satisfactorily replaced
by the neural network analogue of a computer program or
formal symbolic language. Several attempts have
been made to develop and apply such languages: P3, Hecht-
Nielsen's AXON, and Rochester's ISCON are some recent
examples.
We present here a new tool for the formal description and
simulation of artificial neural tissues in microcomputers.
It is a network editor and simulator, called NEUROED, as
well as a compiler for NEUROL, a high-level symbolic,
structured language which allows the definition of the
following elements of a neural tissue: a) elementary neural
architectonic units: each unit has the same number of cells
and the same internal interconnecting pattern and cell
functional parameters; b) elementary cell types: each cell
can be defined in terms of its basic functional parameters;
synaptic interconnections inside an architectonic unit
(axonic delay, weights and signal can be defined for each);
a cell can fan out to several others, with the same synaptic
properties; c) synaptic interconnections among units; d)
cell types and architectonic units can be replicated
automatically across the neural tissue and interconnected; e)
cell types and architectonic units can be named and arranged
in hierarchical frames (parameter inheritance).
NEUROED's underlying model processing element (PE) is a
simplified Hodgkin-Huxley neuron, with an RC model, temporal
summation, passive electrotonic potentials at the dendritic
level, a step transfer function with a threshold level, a
fixed-size, fixed-duration, fixed-form spike, and an
absolute refractory period. Inputs Iij (i=1...NI) to the
synapses of the j-th neuron are weighted with Wij (i=1...NI),
where Wij < 0 defines an inhibitory synapse, Wij = 0 an
inactive or non-existent synapse, and Wij > 0 an excitatory
synapse. Outputs Okj (k=1...NO) can have axonic propagation
delays Dkj (a delay can be equal to zero). Firing of neurons
in a network follows a diffusion process, according to the
propagation delays; random fluctuations in several processes
can be simulated. Several learning algorithms can be
implemented explicitly with NEUROL; a Hebbian
synapse-strength reinforcement rule has specific language
support now.
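The processing element described above can be sketched as follows (a
hypothetical simplification; the function and parameter names are invented):
weighted, delayed inputs are summed and compared against a threshold, and a
firing emits a fixed spike followed by an absolute refractory period.

```python
def step_pe(inputs, weights, delays, threshold, refractory, steps):
    """Simulate one thresholded PE over discrete time steps.

    inputs[i][t] -- spike train of presynaptic cell i (0 or 1)
    weights[i]   -- synaptic weight Wij (<0 inhibitory, >0 excitatory)
    delays[i]    -- axonic propagation delay Dkj in time steps
    """
    spikes, refr = [], 0
    for t in range(steps):
        if refr > 0:                        # absolute refractory period
            refr -= 1
            spikes.append(0)
            continue
        s = sum(w * inp[t - d] if t - d >= 0 else 0.0
                for inp, w, d in zip(inputs, weights, delays))
        fired = 1 if s >= threshold else 0  # step transfer function
        if fired:
            refr = refractory
        spikes.append(fired)
    return spikes

# Two excitatory inputs arriving with delay 1 push the PE past threshold,
# and the refractory period then silences every other step.
out = step_pe(inputs=[[1, 1, 1, 1, 1], [1, 1, 1, 1, 1]],
              weights=[0.6, 0.6], delays=[1, 1],
              threshold=1.0, refractory=1, steps=5)
print(out)  # → [0, 1, 0, 1, 0]
```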
NEUROED's basic specifications are: a) written in Turbo
BASIC 1.0 for IBM-PC compatible machines, with CGA
monochrome graphics display and an optional numerical
coprocessor; b) capacity of 100 neurons and 10,000 synapses;
c) three neural tissue layers: input, processing and output;
d) real-time simulation of neural tissue dynamics, with
three display modes: oscilloscope mode (displays membrane
potentials over time for several cells simultaneously), map
mode (displays the bidimensional architecture with individual
cells, showing when they fire) and Hinton diagram (displays
the interconnection matrix with individual synapses, showing
when they fire); e) real-time, interactive modification of
net parameters; and f) capability for building procedures,
functions and model libraries, which reside as external disk
files. NEUROED and NEUROL are easy to learn and to use,
intuitive for neuroscientists, and lend themselves to
modeling neural tissue dynamics for teaching purposes. We
are currently developing a basic "library" of NEUROED models
to teach basic neurophysiology to medical students.
Implementations of NEUROED for parallel hardware are
also under way.
(Presented at the Fourth Annual Meeting of the Brazilian
Federation of Biological Societies, Caxambu, MG, July 1991)
-----------------------------------------------------------
A CASCADED NEURAL NETWORK MODEL FOR PROCESSING
2D TOMOGRAPHIC BRAIN IMAGES
Dourado SC and Sabbatini RME
Center for Biomedical Informatics, State
University of Campinas, P.O. Box 6005, 13081 Campinas,
São Paulo, Brazil.
Artificial neural networks (ANNs) have demonstrated many
advantages and capabilities in applications involving the
processing of biomedical images and signals. Particularly in
the field of medical image processing, ANNs have been used
in several ways, such as in image filtering, scatter
correction, edge detection, segmentation, pattern and
texture classification, image reconstruction and alignment,
etc. The adaptive nature of ANNs (i.e., they are capable of
learning) and the possibility of implementing their function,
in the future, using truly massively parallel processors and
neural integrated circuits are strong arguments in favor of
investigating new architectures, algorithms and applications
for ANNs in Medicine.
In the present work, we are interested in designing a
prototype ANN capable of processing serial
sections of the brain obtained from CT or MRI tomographs.
The segmented, outlined images, representing internal brain
structures, both normal and abnormal, would then be used as
an input to a three-dimensional stereotaxic radiosurgery
planning software.
The ANN-based algorithm we have devised was initially
implemented as a software simulation in a microcomputer (PC
80386, with VGA color graphics and a 80387 mathematical
coprocessor). It is structured as a compound ANN, comprised
by three cascading sub-networks. The first one receives the
original digitized image, and is a one-layer, fully
interconnected ANN, with one processing element (PE) per
image pixel. The brain image is obtained from a General
Electric CT system, with 256 x 256 pixels and 256 gray
levels. The first ANN implements a MHF lateral inhibition
function, based on a convolution filter of variable
dimension (3 x 3 up to 9 x 9 PE's), and it is used to
iteratively enhance borders in the image. The PE
interconnection (i.e. convolution) function can be defined
by the user as a disk file containing a set of synaptic
weights, which is read by the program; thus allowing for
experimentation with different sets of coefficients and
sizes of the convolution window. In this layer, PE's have
synaptic weights varying from -1 to 1, and the step function
as its transfer function. Usually after 2 to 3 iterations,
the borders are completely formed and do not vary any more,
but are too thick (i.e., the trace width spans several
pixels). In order to thin out the borders, the output of the
MHF ANN layer is subsequently fed into a three-layer
perceptron, which was trained off-line using the
backpropagation algorithm to perform thinning on smaller
straight line segments. Finally, the thinned-out image
obtained pixel-wise at this ANN's output is fed into a
third network, also a three-layer perceptron trained
off-line using the backpropagation algorithm to complete
small gaps occurring in the image contours. The final image, also
256 x 256 pixels with 2 levels of gray, is passed to the 3D
slice reconstruction program, implemented with conventional,
sequential algorithms. A fourth ANN perceptron previously
trained by back-propagation to recognize the gray histogram
signature of small groups of pixels in the original image
(such as bone, liquor, gray and white matter, blood, dense
tumor areas, etc.), is used to false-color the entire image
according to the classified thematic regions.
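The first sub-network's lateral inhibition function amounts to a
center-surround convolution; a minimal sketch (hypothetical weights, not
the program's actual coefficient files) shows how such a 3x3 kernel
responds at borders and stays silent in uniform regions.

```python
import numpy as np

# Center-surround ("lateral inhibition") weights: excitatory center,
# inhibitory surround summing to zero, so flat regions give no response.
kernel = np.array([[-0.125, -0.125, -0.125],
                   [-0.125,  1.0,   -0.125],
                   [-0.125, -0.125, -0.125]])

def lateral_inhibition(img, kernel):
    h, w = img.shape
    out = np.zeros((h, w))
    pad = np.pad(img.astype(float), 1)     # zero border, one PE per pixel
    for i in range(h):
        for j in range(w):
            out[i, j] = (pad[i:i + 3, j:j + 3] * kernel).sum()
    return out

img = np.zeros((8, 8))
img[:, 4:] = 1.0                           # a vertical edge
resp = lateral_inhibition(img, kernel)
# negative just outside the edge, positive just inside, zero in flat areas
print(resp[4, 3], resp[4, 4], resp[4, 0], resp[4, 6])
```

Iterating such a filter, as the abstract describes, sharpens these border
responses while uniform regions stay at zero.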
The cascaded, multilayer ANN thus implemented performs very
well in the overall task of obtaining automatically outlined
and segmented brain slices, for the purposes of 3D
reconstruction and surgical planning. Due to the complexity
of the algorithms and the size of the image, however, the
time spent by the computer we use is inordinately large,
preventing practical application. We are now studying the
implementation of this ANN paradigm on RISC-based and
vector-processing CPUs, as well as the potential
applications of neurochip prototyping kits already available
on the market.
(Presented at the I Latinoamerican Congress on Health
Informatics, Havana, Cuba, February 1992)
--------------------------------------------------------
COMPUTER SIMULATION OF A QUANTITATIVE MODEL FOR
REFLEX EPILEPSY
R.M.E. Sabbatini
Center of Biomedical Informatics and School of
Medicine of the State University of Campinas, Brazil.
In the present study we propose a continuous,
lumped-parameter, non-linear mathematical model to explain
the quantitatively observed characteristics of a class of
experimental reflex epilepsy, namely audiogenic seizures in
rodents, and we simulate this model with a specially designed
microcomputer program. In the first phase of the study, we
have individually stimulated 280 adult Wistar albino rats
with a 112 dB white-noise sound source, and recorded the
latency, duration and intensity values of the psychomotor
components of the audiogenic reaction: after an initial
delay, one or more circular running phases usually occur,
followed or not by complete tonic-clonic seizures. In the
second step, we performed several multivariate statistical
analyses of these data, which revealed many properties
of the underlying neural system responsible for the crisis,
such as the independence of the running and convulsive
phases and a scale of severity correlated with the
values of the latencies and intensities. Finally, a
lumped-parameter model based on a set of differential equations
which describes the macro behavior of the interaction of
four different populations of excitatory and inhibitory
neurons with different time constants and threshold elements
has been simulated in a computer. In this model, running
waves, which may occur several times before leading (or not)
to the final convulsive phase, are explained by the
oscillatory behavior of a controlling neural population,
caused by mixed feedback: an early, internal positive
feedback which results in growing excitation, and a
late negative feedback elicited by motor components of the
running itself, which causes the oscillation back to
inhibition. A second, threshold-triggered population
controls the convulsive phase and its subsequent refractory
phase. The results of the simulation have been found to
explain reasonably well the time course and structural
characteristics of the several forms of rodent audiogenic
epilepsy, and they correlate well with the existing knowledge
about the neural bases of this phenomenon.
(Presented at the Second IBRO/IMIA International Symposium
on Mathematical Approaches to Brain Functioning Diagnostics,
Prague, Czechoslovakia, September 1990).
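The mixed-feedback mechanism described above can be caricatured with a
two-variable sketch (hypothetical equations and parameters, not the
published model): fast positive self-feedback drives excitation up, while a
slower negative-feedback trace of the population's own activity pulls it
back toward inhibition, producing a rise and fall of activity.

```python
import numpy as np

dt, steps = 0.01, 3000
e = np.zeros(steps)        # excitatory population activity
i = np.zeros(steps)        # slow negative-feedback trace
e[0] = 0.1                 # small initial excitation
for t in range(steps - 1):
    # fast dynamics: leak plus saturating positive self-feedback,
    # opposed by the accumulated inhibitory trace
    de = -e[t] + np.tanh(2.5 * e[t] - 4.0 * i[t]) + 0.1
    # slow dynamics: the trace follows the activity with a long time constant
    di = (-i[t] + e[t]) / 5.0
    e[t + 1] = e[t] + dt * de
    i[t + 1] = i[t] + dt * di

print("peak activity:", e.max(), "at step", int(e.argmax()))
```

Because the negative feedback lags the excitation, activity overshoots and
then collapses, the qualitative signature of the oscillatory running waves.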
--------------------------------------------------------
OUTCOME PREDICTION FOR CRITICAL PATIENTS UNDER INTENSIVE
CARE, USING BACKPROPAGATION NEURAL NETWORKS
P. Felipe Jr., R.M.E. Sabbatini, P.M. Carvalho-
Jnior, R.E. Beseggio, and R.G.G. Terzi
Center for Biomedical Informatics, State University of
Campinas, Campinas SP 13081-970 Brazil
Several scores have been designed to estimate death
probability for patients admitted to Intensive Care Units,
such as the APACHE and MPM systems, which are based on
regression analysis. In the present work, we have studied
the potential of an artificial neural network model, the
three-layer perceptron with the backpropagation learning rule,
to perform this task. Training and testing data were derived
from a Brazilian database which was previously used for
calculating APACHE scores. The neural networks were
trained with physiological, clinical and pathological data
(30 variables, such as worst pCO2, coma level, arterial
pressure, etc.) based on a sample of more than 300 patients,
whose outcome was known.
All networks were able to reach convergence with a small
global prediction error. Maximum percentages of 75% correct
predictions in the test dataset and 99.6% in the training
dataset were achieved. Maximum sensitivity and specificity
were 60% and 80%, respectively. We conclude that the neural
network approach has worked well for outcome prognosis in a
highly "noisy" dataset, with performance similar to, if
slightly lower than, that of APACHE II, but with the
advantage of deriving its parameters from a regional dataset
instead of from a universal model.
The paper will be presented at the MEDINFO'92 workshop on
"Applications of Connectionist System in Biomedicine",
September 8, 1992, in Geneva, Switzerland.
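The sensitivity and specificity figures quoted above are computed in the
usual way from true and false positives and negatives; a small sketch with
invented toy data (which happens to reproduce the 60%/80% values):

```python
def sensitivity_specificity(y_true, y_pred):
    # sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# toy example: 5 deaths (1) and 5 survivals (0)
y_true = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(sens, spec)  # → 0.6 0.8
```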
==============================================================
Reprints/Preprints are available
Renato M.E. Sabbatini, PhD
Center for Biomedical Informatics
State University of Campinas
SABBATINI@CCVAX.UNICAMP.BR
SABBATINI@BRUC.BITNET