From ml-connectionists-request@q.cs.cmu.edu Tue May 4 16:05:30 1993
Received: by cse.uta.edu (5.57/Ultrix2.4-C)
id AA04746; Tue, 4 May 93 08:32:38 -0500
Received: from Q.CS.CMU.EDU by Q.CS.CMU.EDU id aa10406; 3 May 93 23:45:06 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by Q.CS.CMU.EDU id aa10404;
3 May 93 23:27:53 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa10414;
3 May 93 23:27:13 EDT
Received: from CS.CMU.EDU by B.GP.CS.CMU.EDU id aa18336; 3 May 93 23:24:12 EDT
Received: from crl.ucsd.edu by CS.CMU.EDU id aa19883; 3 May 93 23:23:37 EDT
Received: by crl.ucsd.edu; id AA15510
sendmail 4.1/UCSD-2.1-sun
Mon, 3 May 93 20:23:25 PDT for connectionists@cs.cmu.edu
Date: Mon, 3 May 93 20:23:25 PDT
From: Jeff Elman <elman@crl.ucsd.edu>
Message-Id: <9305040323.AA15510@crl.ucsd.edu>
To: connectionists@cs.cmu.edu
Subject: new books in MIT Neural Network/Connectionism series
Status: RO
The following books have now appeared as part of the Neural Network
Modeling and Connectionism Series, and may be of interest to readers of
the connectionists mailing list. Detailed descriptions of each book,
along with tables of contents, follow.
Jeff Elman
============================================================
Neural Network Modeling and Connectionism Series
Jeffrey Elman, editor. MIT Press/Bradford Books.
* Miikkulainen, R. "Subsymbolic Natural Language Processing:
An Integrated Model of Scripts, Lexicon, and Memory"
* Mitchell, M. "Analogy-Making as Perception: A Computer Model"
* Cleeremans, A. "Mechanisms of Implicit Learning: Connectionist Models
of Sequence Processing"
* Sereno, M.E. "Neural Computation of Pattern Motion: Modeling Stages of
Motion Analysis in the Primate Visual Cortex"
* Miller, W.T., Sutton, R.S., & Werbos, P.J. (Eds.), "Neural Networks for
Control"
* Hanson, S.J., & Olson, C.R. (Eds.) "Connectionist Modeling and Brain
Function: The Developing Interface"
* Judd, S.J. "Neural Network Design and the Complexity of Learning"
* Mozer, M.C. "The Perception of Multiple Objects: A Connectionist
Approach"
------------------------------------------------------------
New
Subsymbolic Natural Language Processing
An Integrated Model of Scripts, Lexicon, and Memory
Risto Miikkulainen
Aiming to bridge the gap between low-level connectionist models and
high-level symbolic artificial intelligence, Miikkulainen describes
DISCERN, a complete natural language processing system implemented
entirely at the subsymbolic level. In DISCERN, distributed neural
network models of parsing, generating, reasoning, lexical processing,
and episodic memory are integrated into a single system that learns to
read, paraphrase, and answer questions about stereotypical narratives.
Using the DISCERN system as an example, Miikkulainen introduces a
general approach to building high-level cognitive models from
distributed neural networks, and shows how the special properties of
such networks are useful in modeling human performance. In this approach
connectionist networks are not only plausible models of isolated
cognitive phenomena, but also sufficient constituents for complete
artificial intelligence systems.
Risto Miikkulainen is an Assistant Professor in the Department of
Computer Sciences at the University of Texas, Austin.
Contents: I. Overview. Introduction. Background. Overview of DISCERN. II.
Processing Mechanisms. Backpropagation Networks. Developing
Representations in FGREP Modules. Building from FGREP Modules. III.
Memory Mechanisms. Self-Organizing Feature Maps. Episodic Memory
Organization: Hierarchical Feature Maps. Episodic Memory Storage and
Retrieval: Trace Feature Maps. Lexicon. IV. Evaluation. Behavior of the
Complete Model. Discussion. Comparison to Related Work. Extensions and
Future Work. Conclusions. Appendixes: A Story Data. Implementation
Details. Instructions for Obtaining the DISCERN Software.
A Bradford Book
May 1993 - 408 pp. - 129 illus. - $45.00
0-262-13290-7 MIISH
------------------------------------------------------------
New
Analogy-Making as Perception
A Computer Model
Melanie Mitchell
Analogy-Making as Perception is based on the premise that analogy-making
is fundamentally a high-level perceptual process in which the
interaction of perception and concepts gives rise to "conceptual
slippages" which allow analogies to be made. It describes Copycat,
developed by the author with Douglas Hofstadter, that models the
complex, subconscious interaction between perception and concepts that
underlies the creation of analogies.
In Copycat, both concepts and high-level perception are emergent
phenomena, arising from large numbers of low-level, parallel,
non-deterministic activities. In the spectrum of cognitive modeling
approaches, Copycat occupies a unique intermediate position between
symbolic systems and connectionist systems - a position that is at
present the most useful one for understanding the fluidity of concepts
and high-level perception.
On one level the work described here is about analogy-making, but on
another level it is about cognition in general. It explores such issues
as the nature of concepts and perception and the emergence of highly
flexible concepts from a lower-level "subcognitive" substrate.
Melanie Mitchell, Assistant Professor in the Department of Electrical
Engineering and Computer Science at the University of Michigan, is a
Fellow of the Michigan Society of Fellows. She is also Director of the
Adaptive Computation Program at the Santa Fe Institute.
Contents: Introduction. High-Level Perception, Conceptual Slippage, and
Analogy-Making in a Microworld. The Architecture of Copycat. Copycat's
Performance on the Five Target Problems. Copycat's Performance on
Variants of the Five Target Problems. Summary of the Comparisons between
Copycat and Human Subjects. Some Shortcomings of the Model. Results of
Selected "Lesions" of Copycat. Comparisons with Related Work.
Contributions of This Research. Afterword by Douglas R. Hofstadter.
Appendixes. A Sampler of Letter-String Analogy Problems Beyond Copycat's
Current Capabilities. Parameters and Formulas. More Detailed
Descriptions of Codelet Types.
A Bradford Book
May 1993 - 382 pp. - 168 illus. - $45.00
0-262-13289-3 MITAH
------------------------------------------------------------
New
Mechanisms of Implicit Learning
Connectionist Models of Sequence Processing
Axel Cleeremans
What do people learn when they do not know that they are learning? Until
recently all of the work in the area of implicit learning focused on
empirical questions and methods. In this book, Axel Cleeremans explores
unintentional learning from an information-processing perspective. He
introduces a theoretical framework that unifies existing data and models
on implicit learning, along with a detailed computational model of human
performance in sequence-learning situations.
The model, based on a simple recurrent network (SRN), is able to predict
the successive elements of sequences generated from finite-state
grammars. Human subjects are shown to exhibit a similar sensitivity to
the temporal structure in a series of choice reaction time experiments
of increasing complexity; yet their explicit knowledge of the sequence
remains limited. Simulation experiments indicate that the SRN model is
able to account for these data in great detail. Other architectures
that process sequential material are considered. These are contrasted
with the SRN model, which they sometimes outperform. Considered
together, the models show how complex knowledge may emerge through the
operation of elementary mechanisms - a key aspect of implicit learning
performance.
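The SRN at the heart of the model is Elman's simple recurrent network, in
which a copy of the previous hidden state is fed back as context input
alongside the current symbol. The following is a minimal sketch of that
architecture and its one-step training rule; the three-symbol transition
table, layer sizes, and learning rate are made-up illustrations, not the
finite-state grammars or simulations from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a finite-state grammar: which symbols may follow which.
# (Illustrative only -- not the grammars used in the book's experiments.)
TRANSITIONS = {0: [1, 2], 1: [0], 2: [0, 1]}

def make_sequence(length=200):
    seq = [0]
    for _ in range(length - 1):
        seq.append(int(rng.choice(TRANSITIONS[seq[-1]])))
    return seq

def one_hot(i, n=3):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Elman SRN: the hidden layer receives the current input plus its own
# previous state (the "context" units), so predictions can exploit
# temporal structure. Layer sizes and learning rate are arbitrary here.
n_in, n_hid = 3, 8
W_xh = rng.normal(0.0, 0.5, (n_hid, n_in))
W_hh = rng.normal(0.0, 0.5, (n_hid, n_hid))
W_hy = rng.normal(0.0, 0.5, (n_in, n_hid))
lr = 0.5

for epoch in range(30):
    seq = make_sequence()
    h = np.zeros(n_hid)
    for t in range(len(seq) - 1):
        x, target = one_hot(seq[t]), one_hot(seq[t + 1])
        h_prev = h
        h = sigmoid(W_xh @ x + W_hh @ h_prev)
        y = sigmoid(W_hy @ h)          # activation over possible next symbols
        # One-step gradient descent; the context is treated as a fixed
        # input (no backpropagation through time), as in Elman's scheme.
        dy = (y - target) * y * (1.0 - y)
        dh = (W_hy.T @ dy) * h * (1.0 - h)
        W_hy -= lr * np.outer(dy, h)
        W_xh -= lr * np.outer(dh, x)
        W_hh -= lr * np.outer(dh, h_prev)

# Query: in this toy grammar only symbol 0 may follow symbol 1, so the
# trained net's prediction should shift toward symbol 0 after seeing 1.
pred = sigmoid(W_hy @ sigmoid(W_xh @ one_hot(1)))
print(pred)
```

The point of the sketch is the architecture, not the numbers: sensitivity
to sequential structure emerges from an elementary, local learning rule,
which is exactly the flavor of mechanism the book argues underlies
implicit learning.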
Axel Cleeremans is a Senior Research Assistant at the National Fund for
Scientific Research, Belgium.
Contents: Implicit Learning: Explorations in Basic Cognition. The SRN
Model: Computational Aspects of Sequence Processing. Sequence Learning
as a Paradigm for Studying Implicit Learning. Sequence Learning: Further
Explorations. Encoding Remote Context. Explicit Sequence Learning.
General Discussion.
A Bradford Book
April 1993 - 227 pp. - 60 illus. - $30.00
0-262-03205-8 CLEMH
------------------------------------------------------------
New
Neural Computation of Pattern Motion
Modeling Stages of Motion Analysis in the Primate Visual Cortex
Margaret Euphrasia Sereno
How does the visual system compute the global motion of an object from
local views of its contours? Although this important problem in
computational vision (also called the aperture problem) is key to
understanding how biological systems work, there has been surprisingly
little neurobiologically plausible work done on it. This book describes
a neurally based model, implemented as a connectionist network, of how
the aperture problem is solved. It provides a structural account of the
model's performance on a number of tasks and demonstrates that the
details of implementation influence the nature of the computation as
well as predict perceptual effects that are unique to the model. The
basic approach described can be extended to a number of different
sensory computations.
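The aperture problem has a compact geometric core: each local measurement
constrains only the velocity component normal to a contour, and the
global pattern velocity must satisfy all such constraints at once. As a
point of reference, here is a plain least-squares "intersection of
constraints" sketch with made-up numbers; it is not Sereno's connectionist
model, only the ideal noiseless computation her network approximates.

```python
import numpy as np

# True pattern velocity (vx, vy) of a rigidly translating 2-D pattern
# (an arbitrary example value).
v_true = np.array([2.0, -1.0])

# Each local aperture sees only a contour segment, so it can measure only
# the motion component normal to that contour: s_i = n_i . v.
angles = np.array([0.2, 1.1, 2.3, 2.9])        # contour normal directions
normals = np.stack([np.cos(angles), np.sin(angles)], axis=1)
normal_speeds = normals @ v_true               # ideal, noiseless measurements

# Combining constraints: solve the over-determined system N v = s by least
# squares. Two or more non-parallel normals pin down v uniquely.
v_est, *_ = np.linalg.lstsq(normals, normal_speeds, rcond=None)
print(v_est)   # -> approximately [ 2. -1.]
```

Each single constraint leaves a whole line of candidate velocities; only
pooling across differently oriented contours disambiguates the pattern
motion, which is why the problem demands integration across local views.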
"This is an important book, discussing a significant and very general
problem in sensory processing. The model presented is simple, and it is
elegant in that we can see, intuitively, exactly why and how it works.
Simplicity, clarity and elegance are virtues in any field, but not often
found in work in neural networks and sensory processing. The model
described in Sereno's book is an exception. This book will have a
sizeable impact on the field." - James Anderson, Professor, Department
of Cognitive and Linguistic Sciences, Brown University
Contents: Introduction. Computational, Psychophysical, and
Neurobiological Approaches to Motion Measurement. The Model. Simulation
Results. Psychophysical Demonstrations. Summary and Conclusions.
Appendix: Aperture Problem Linearity.
A Bradford Book
March 1993 - 181 pp. - 41 illus. - $24.95
0-262-19329-9 SERNH
------------------------------------------------------------
Neural Networks for Control
edited by W. Thomas Miller, III, Richard S. Sutton,
and Paul J. Werbos
This book brings together examples of all of the most important
paradigms in artificial neural networks (ANNs) for control, including
evaluations of possible applications. An appendix provides complete
descriptions of seven benchmark control problems for those who wish to
explore new ideas for building automatic controllers.
Contents: I. General Principles. Connectionist Learning for Control: An
Overview, Andrew G. Barto. Overview of Designs and Capabilities, Paul J.
Werbos. A Menu of Designs for Reinforcement Learning Over Time, Paul J.
Werbos. Adaptive State Representation and Estimation Using Recurrent
Connectionist Networks, Ronald J. Williams. Adaptive Control using
Neural Networks, Kumpati S. Narendra. A Summary Comparison of CMAC
Neural Network and Traditional Adaptive Control Systems, L. Gordon
Kraft, III, and David P. Campagna. Recent Advances in Numerical
Techniques for Large Scale Optimization, David F. Shanno. First Results
with Dyna, An Integrated Architecture for Learning, Planning and
Reacting, Richard S. Sutton.
II. Motion Control. Computational Schemes and Neural Network Models for
Formation and Control of Multijoint Arm Trajectory, Mitsuo Kawato.
Vision-Based Robot Motion Planning, Bartlett W. Mel. Using Associative
Content-Addressable Memories to Control Robots, Christopher G. Atkeson
and David J. Reinkensmeyer. The Truck Backer-Upper: An Example of
Self-Learning in Neural Networks, Derrick Nguyen and Bernard Widrow. An
Adaptive Sensorimotor Network Inspired by the Anatomy and Physiology of
the Cerebellum, James C. Houk, Satinder P. Singh, Charles Fisher, and
Andrew G. Barto. Some New Directions for Adaptive Control Theory in
Robotics, Judy A. Franklin and Oliver G. Selfridge.
III. Application Domains. Applications of Neural Networks in Robotics
and Automation for Manufacturing, Arthur C. Sanderson. A Bioreactor
Benchmark for Adaptive Network-based Process Control, Lyle H. Ungar. A
Neural Network Baseline Problem for Control of Aircraft Flare and
Touchdown, Charles C. Jorgensen and C. Schley. Intelligent Control for
Multiple Autonomous Undersea Vehicles, Martin Herman, James S. Albus,
and Tsai-Hong Hong. A Challenging Set of Control Problems, Charles W.
Anderson and W. Thomas Miller.
A Bradford Book
1990 - 524 pp. - $52.50
0-262-13261-3 MILNH
------------------------------------------------------------
Connectionist Modeling and Brain Function
The Developing Interface
edited by Stephen Jose Hanson and Carl R. Olson
This tutorial on current research activity in connectionist-inspired
biology-based modeling describes specific experimental approaches and
also confronts general issues related to learning, associative memory,
and sensorimotor development.
"This volume makes a convincing case that data-rich brain scientists and
model-rich cognitive psychologists can and should talk to one another.
The topics they discuss together here - memory and perception - are of
vital interest to both, and their collaboration promises continued
excitement along this new scientific frontier." - George Miller,
Princeton University
Contents: Part I: Overview. Introduction: Connectionism and
Neuroscience, S. J. Hanson and C. R. Olson. Computational Neuroscience,
T. J. Sejnowski, C. Koch, and P. S. Churchland. Part II: Associative
Memory and Conditioning. The Behavioral Analysis of Associative Learning
in the Terrestrial Mollusc Limax Maximus: The Importance of Inter-event
Relationships, C. L. Sahley. Neural Models of Classical Conditioning: A
Theoretical Viewpoint, G. Tesauro. Unsupervised Perceptual Learning: A
Paleocortical Model, R. Granger, J. Ambros-Ingerson, P. Anton, and G.
Lynch. Part III. The Somatosensory System. Biological Constraints on a
Dynamic Network: The Somatosensory Nervous System, T. Allard. A Model of
Receptive Field Plasticity and Topographic Reorganization in the
Somatosensory Cortex, L. H. Finkel. Spatial Representation of the Body,
C. R. Olson and S. J. Hanson. Part IV: The Visual System. The
Development of Ocular Dominance Columns: Mechanisms and Models, K. D.
Miller and M. P. Stryker. Self-Organization in a Perceptual System: How
Network Models and Information Theory May Shed Light on Neural
Organization, R. Linsker. Solving the Brightness-From-Luminance Problem:
A Neural Architecture for Invariant Brightness Perception, S. Grossberg
and D. Todorovic.
A Bradford Book
1990 - 423 pp. - $44.00
0-262-08193-8 HANCH
------------------------------------------------------------
Neural Network Design and the Complexity of Learning
J. Stephen Judd
Using the tools of complexity theory, Stephen Judd develops a formal
description of associative learning in connectionist networks. He
rigorously exposes the computational difficulties in training neural
networks and explores how certain design principles will or will not
make the problems easier.
"Judd . . . formalized the loading problem and proved it to be
NP-complete. This formal work is clearly explained in his book in such a
way that it will be accessible both to the expert and nonexpert." - Eric
B. Baum, IEEE Transactions on Neural Networks
"Although this book is the true successor to Minsky and Papert's
maligned masterpiece of 1969 (Perceptrons), Judd is not trying to
demolish the field of neurocomputing. His purpose is to clarify the
limitations of a wide class of network models and thereby suggest
guidelines for practical applications." - Richard Forsyth, Artificial
Intelligence & Behavioral Simulation
Contents: Neural Networks: Hopes, Problems, and Goals. The Loading
Problem. Other Studies of Learning. The Intractability of Loading.
Subcases. Shallow Architectures. Memorization and Generalization.
Conclusions. Appendices.
A Bradford Book
1990 - 150 pp. - $27.50
0-262-10045-2 JUDNH
------------------------------------------------------------
The Perception of Multiple Objects
A Connectionist Approach
Michael C. Mozer
Building on the vision studies of David Marr and the connectionist
modeling of the PDP group, this book describes a neurally inspired
computational model of two-dimensional object recognition and spatial
attention that can explain many characteristics of human visual
perception. The model,
called MORSEL, can actually recognize several two-dimensional objects at
once (previous models have tended to blur multiple objects into one or
to overload). Mozer's is a fully mechanistic account, not just a
functional-level theory.
"Mozer's work makes a major contribution to the study of visual
information processing. He has developed a very creative and
sophisticated new approach to the problem of visual object recognition.
The combination of computational rigor with thorough and knowledgeable
examination of psychological results is impressive and unique." - Harold
Pashler, University of California at San Diego
Contents: Introduction. Multiple Word Recognition. The Pull-Out Network.
The Attentional Mechanism. The Visual Short-Term Memory. Psychological
Phenomena Explained by MORSEL. Evaluation of MORSEL. Appendixes: A
Comparison of Hardware Requirements. Letter Cluster Frequency and
Discriminability Within BLIRNET's Training Set.
A Bradford Book
1991 - 217 pp. - $27.50
0-262-13270-2 MOZPH
-------------------------------------------------------------
ORDER FORM
Please send me the following book(s):
Qty Author Bookcode Price
___ Cleeremans CLEMH 30.00
___ Hanson HANCH 44.00
___ Judd JUDNH 27.50
___ Miikkulainen MIISH 45.00
___ Miller MILNH 52.50
___ Mitchell MITAH 45.00
___ Mozer MOZPH 27.50
___ Sereno SERNH 24.95
___ Payment Enclosed ___ Purchase Order Attached
Charge to my ___ Master Card ___ Visa
Card# _______________________________
Exp.Date _______________
Signature _________________________________________________
_____ Total for book
$2.75 Postage
_____ Please add 50c postage for each additional book
_____ Canadian customers add 7% GST
_____ TOTAL due MIT Press
Send To:
Name ______________________________________________________
Address ___________________________________________________
City ________________________ State ________ Zip __________
Daytime Phone ________________ Fax ________________________
Make checks payable and send order to:
The MIT Press * 55 Hayward Street * Cambridge, MA 02142
For fastest service call (617) 625-8569
or toll-free 1-800-356-0343
The MIT Guarantee: If for any reason you are not completely satisfied,
return your book(s) within ten days of receipt for a full refund or
credit.
From ml-connectionists-request@q.cs.cmu.edu Tue May 4 16:05:39 1993
Received: by cse.uta.edu (5.57/Ultrix2.4-C)
id AA00804; Tue, 4 May 93 03:33:45 -0500
Received: from Q.CS.CMU.EDU by Q.CS.CMU.EDU id aa09739; 3 May 93 18:17:04 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by Q.CS.CMU.EDU id aa09677;
3 May 93 17:32:49 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa10120;
3 May 93 17:31:15 EDT
Received: from CS.CMU.EDU by B.GP.CS.CMU.EDU id aa13168; 3 May 93 15:26:40 EDT
Received: from acquine.cns.caltech.edu by CS.CMU.EDU id aa16873;
3 May 93 15:25:26 EDT
Received: from plato.cns.caltech.edu by cns.caltech.edu (4.1/1.34)
id AA16477; Mon, 3 May 93 12:15:00 PDT
Received: by plato.cns.caltech.edu id <AA20897@plato.cns.caltech.edu>; Mon, 3 May 93 12:30:55 PDT
Date: Mon, 3 May 93 12:30:55 PDT
From: Bartlett Mel <mel@cns.caltech.edu>
Message-Id: <9305031930.AA20897@plato.cns.caltech.edu>
To: connectionists@cs.cmu.edu
Subject: NIPS*93: Deadline May 22
Status: RO
******************** FINAL REMINDER, NOTE DEADLINE OF MAY 22 *****************
CALL FOR PAPERS
Neural Information Processing Systems
-Natural and Synthetic-
Monday, November 29 - Thursday, December 2, 1993
Denver, Colorado
This is the seventh meeting of an inter-disciplinary conference
which brings together neuroscientists, engineers, computer scientists,
cognitive scientists, physicists, and mathematicians interested in all
aspects of neural processing and computation. There will be an afternoon
of tutorial presentations (Nov 29) preceding the regular session, and two
days of focused workshops will follow at a nearby ski area (Dec 3-4).
Major categories and examples of subcategories for paper submissions
are the following:
Neuroscience: Studies and Analyses of Neurobiological Systems,
Inhibition in cortical circuits, Signals and noise in neural
computation, Computational and Theoretical Neurobiology, Neurophysics.
Theory: Computational Learning Theory, Complexity Theory,
Dynamical Systems, Statistical Mechanics, Probability and
Statistics, Approximation Theory.
Implementation and Simulation: VLSI, Optical, Software Simulators,
Implementation Languages, Parallel Processor Design and Benchmarks.
Algorithms and Architectures: Learning Algorithms, Constructive
and Pruning Algorithms, Localized Basis Functions, Tree
Structured Networks, Performance Comparisons, Recurrent Networks,
Combinatorial Optimization, Genetic Algorithms.
Cognitive Science & AI: Natural Language, Human Learning and
Memory, Perception and Psychophysics, Symbolic Reasoning.
Visual Processing: Stereopsis, Visual Motion, Recognition, Image
Coding and Classification.
Speech and Signal Processing: Speech Recognition, Coding, and
Synthesis, Text-to-Speech, Adaptive Equalization, Nonlinear
Noise Removal.
Control, Navigation, and Planning: Navigation and Planning,
Learning Internal Models of the World, Trajectory Planning,
Robotic Motor Control, Process Control.
Applications: Medical Diagnosis or Data Analysis, Financial
and Economic Analysis, Time-series Prediction, Protein Structure
Prediction, Music Processing, Expert Systems.
Technical Program: Plenary, contributed and poster sessions will
be held. There will be no parallel sessions. The full text of
presented papers will be published.
Submission Procedures: Original research contributions are solicited,
and will be carefully refereed. Authors must submit six copies of a
1000-word (or less) summary and six copies of a separate single-page
50-100 word abstract clearly stating their results, postmarked by
May 22, 1993 (express mail is not necessary). Accepted abstracts will
be published in the conference program. Summaries are for program
committee use only. At the bottom of each abstract page and on the
first summary page, indicate preference for oral or poster presentation
and specify one of the above nine broad categories and, if appropriate,
subcategories (for example: Poster, Applications-Expert Systems;
Oral, Implementation-Analog VLSI). Include addresses of all authors at
the front of the summary and the abstract, and indicate to which author
correspondence should be addressed. Submissions that lack category
information, separate abstract sheets, the required six copies, or
author addresses, or that are late, will not be considered.
Mail Submissions To:
Gerry Tesauro
NIPS*93 Program Chair
The Salk Institute, CNL
10010 North Torrey Pines Rd.
La Jolla, CA 92037
Mail For Registration Material To:
NIPS*93 Registration
NIPS Foundation
PO Box 60035
Pasadena, CA 91116-6035
All submitting authors will be sent registration material
automatically. Program committee decisions will be sent to the
correspondence author only.
NIPS*93 Organizing Committee: General Chair, Jack Cowan, University
of Chicago; Publications Chair, Joshua Alspector, Bellcore;
Publicity Chair, Bartlett Mel, CalTech; Program Chair, Gerry
Tesauro, IBM/Salk Institute; Treasurer, Rodney Goodman, CalTech;
Local Arrangements, Chuck Anderson, Colorado State University;
Tutorials Chair, Dave Touretzky, Carnegie-Mellon; Workshop
Chair, Mike Mozer, University of Colorado; Program Co-Chairs:
Larry Abbott, Brandeis Univ; Chris Atkeson, MIT; A. B. Bonds,
Vanderbilt Univ; Gary Cottrell, UCSD; Scott Fahlman, CMU; Rod
Goodman, Caltech; John Hertz, NORDITA/NIH; John Lazzaro, UC
Berkeley; Todd Leen, OGI; Jay McClelland, CMU; Nelson
Morgan, ICSI; Steve Nowlan, Salk Inst./Synaptics; Misha Pavel,
NASA/OGI; Sandy Pentland, MIT; Tom Petsche, Siemens. Domestic
Liaisons: IEEE Liaison, Terrence Fine, Cornell; Government &
Corporate Liaison, Lee Giles, NEC Research Institute Inc. Overseas
Liaisons: Mitsuo Kawato, ATR; Marwan Jabri, University of Sydney;
Gerard Dreyfus, Ecole Superieure, Paris; Alan Murray, University
of Edinburgh; Andreas Meier, Simon Bolivar U.
DEADLINE FOR SUMMARIES & ABSTRACTS IS MAY 22, 1993 (POSTMARKED)
please post
From ml-connectionists-request@q.cs.cmu.edu Wed May 5 01:10:11 1993
Received: by cse.uta.edu (5.57/Ultrix2.4-C)
id AA16822; Wed, 5 May 93 03:10:03 -0500
Received: from Q.CS.CMU.EDU by Q.CS.CMU.EDU id aa14694; 5 May 93 0:45:34 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by Q.CS.CMU.EDU id aa14630;
5 May 93 0:13:07 EDT
Received: from DST.BOLTZ.CS.CMU.EDU by DST.BOLTZ.CS.CMU.EDU id aa11320;
5 May 93 0:12:12 EDT
Received: from CS.CMU.EDU by B.GP.CS.CMU.EDU id aa27382; 4 May 93 15:31:41 EDT
Received: from hermes.chpc.utexas.edu by CS.CMU.EDU id aa24896;
4 May 93 15:30:42 EDT
Received: from morpheus.chpc.utexas.edu by hermes.chpc.utexas.edu (5.64/SMI-3.2)
id AA29456; Tue, 4 May 93 14:30:34 -0500
Received: by morpheus.chpc.utexas.edu (4.1/SMI-4.1)
id AA01085; Tue, 4 May 93 14:30:33 CDT
From: mwitten@hermes.chpc.utexas.edu
Message-Id: <9305041930.AA01085@morpheus.chpc.utexas.edu>
Subject: WORLD CONGRESS ON COMPUTATIONAL MEDICINE - CFPP
To: connectionists@cs.cmu.edu
Date: Tue, 4 May 93 14:30:32 CDT
X-Mailer: ELM [version 2.3 PL11]
Status: R
[] ***** CALL FOR PAPERS AND PARTICIPATION ***** []
FIRST WORLD CONGRESS ON COMPUTATIONAL MEDICINE AND PUBLIC HEALTH
24-28 April 1994
Hyatt Regency Hotel
Austin, Texas
compmed94@chpc.utexas.edu
(this notice may be reposted/cross posted/circulated)
------------------------------------------------------------------------
*Conference Chair: Matthew Witten, UT System Center For High Performance
Computing, Austin, Texas - m.witten@chpc.utexas.edu
*Conference Directorate: Regina Monaco, Mt. Sinai Medical Center * Dan
Davison, University of Houston * Chris Johnson, University of
Utah * Lisa Fauci, Tulane University * Daniel Zelterman,
University of Minnesota Minneapolis * James Hyman, Los Alamos
National Laboratory * Richard Hart, Tulane University * Dennis
Duke, SCRI-Florida State University * Sharon Meintz,
University of Nevada Las Vegas * Dean Sittig, Vanderbilt
University * Dick Tsur, World Bank and UT System CHPC *
Dan Deerfield, Pittsburgh Supercomputing Center * Istvan
Gyori, Szeged University School of Medicine Computing Center
*Conference Theme: The appearance of high-performance computing environments
has greatly enhanced the capabilities of the biomedical modeler. With
increasing frequency, computational sciences are being exploited as a means
with which to investigate biomedical processes at all levels of complexity,
from molecular to systemic to demographic. The emergence of an increasing
number of players in this field has led to the subsequent emergence of a
new transdisciplinary field which we call Computational Medicine and Public
Health. The purpose of this congress is to bring together a transdisciplinary
group of researchers in medicine, public health, computer science, mathematics,
nursing, veterinary medicine, ecology, allied health, as well as numerous
other disciplines, for the purposes of examining the grand challenge problems
of the next decades.
Young scientists are encouraged to attend and to present their work in this
increasingly interesting discipline. Funding is being solicited from NSF,
NIH, DOE, DARPA, EPA, and private foundations, as well as
other sources to assist in travel support and in the offsetting of expenses
for those unable to attend otherwise. Papers, poster presentations, tutorials,
focussed topic workshops, birds of a feather groups, demonstrations, and other
suggestions are solicited in, but are not limited to, the following areas:
*Visualization/Sonification
--- medical imaging
--- molecular visualization as a clinical research tool
--- simulation visualization
--- microscopy
--- visualization as applied to problems arising in computational
molecular biology and genetics or other non-traditional disciplines
*Computational Molecular Biology and Genetics
--- computational ramifications of clinical needs in the Human Genome,
Plant Genome, and Animal Genome Projects
--- computational and grand challenge problems in
molecular biology and genetics
--- algorithms and methodologies
--- issues of multiple datatype databases
*Computational Pharmacology, Pharmacodynamics, Drug Design
*Computational Chemistry as Applied to Clinical Issues
*Computational Cell Biology, Physiology, and Metabolism
--- Single cell metabolic models (red blood cell)
--- Cancer models
--- Transport models
--- Single cell interaction with external factors models (laser,
ultrasound, electrical stimulus)
*Computational Physiology and Metabolism
--- Renal System
--- Cardiovascular dynamics
--- Liver function
--- Pulmonary dynamics
--- Auditory function, cochlear dynamics, hearing
--- Reproductive modeling: ovarian dynamics, reproductive
ecotoxicology, modeling the hormonal cycle
--- Metabolic Databases and metabolic models
*Computational Demography, Epidemiology, and Statistics/Biostatistics
--- Classical demographic, epidemiologic, and biostatistical modeling
--- Modeling of the role of culture, poverty, and other
sociological issues as they impact healthcare
*Computational Disease Modeling
--- AIDS
--- TB
--- Influenza
--- Other
*Computational Biofluids
--- Blood flow
--- Sperm dynamics
--- Modeling of arteriosclerosis
*Computational Dentistry, Orthodontics, and Prosthetics
*Computational Veterinary Medicine
--- Computational issues in modeling non-human dynamics such
as equine, feline, canine dynamics (physiological/biomechanical)
*Computational Allied Health Sciences
--- Physical Therapy
--- Neuromusic Therapy
--- Respiratory Therapy
*Computational Radiology
--- Dose modeling
--- Treatment planning
*Computational Surgery
--- Simulation of surgical procedures in VR worlds
--- Surgical simulation as a precursor to surgical intervention
*Computational Cardiology
*Computational Neurobiology and Neurophysiology
--- Brain modeling
--- Single neuron models
--- Neural nets and clinical applications
--- Neurophysiological dynamics
--- Neurotransmitter modeling
--- Neurological disorder modeling (Alzheimer's Disease, for example)
*Computational Biomechanics
--- Bone Modeling
--- Joint Modeling
*The role of alternate reality methodologies
and high performance environments in the medical and
public health disciplines
*Issues in the use of high performance computing
environments in the teaching of health science
curricula
*The role of high performance environments
for the handling of large medical datasets (high
performance storage environments, high performance
networking, high performance medical records
manipulation and management, metadata structures
and definitions)
*Federal and private support for transdisciplinary research
in computational medicine and public health
*Contact: To contact the congress organizers for any reason
use any of the following
Electronic Mail - compmed94@chpc.utexas.edu
Fax (USA) - (512) 471-2445
Phone (USA) - (512) 471-2472
Compmed 1994
University of Texas System CHPC
Balcones Research Center, 1.154CMS
10100 Burnet Road
Austin, Texas 78758-4497
*Submission Procedures: Authors must submit 5 copies
of a single-page 50-100 word abstract clearly discussing the
topic of their presentation. In addition, authors must clearly
state their choice of poster, contributed paper, tutorial, exhibit,
focussed workshop or birds of a feather group along with a
discussion of their presentation. Abstracts will be published
as part of the preliminary conference material.
To notify the congress organizing committee that you would like to
participate and to be put on the congress mailing list,
please fill out and return the form that follows this announcement.
You may use any of the contact methods above.
*Conference Deadlines: The following deadlines should be noted:
1 October 1993 - Notification of interest in participation
1 November 1993 - Abstracts for talks/posters/workshops/birds of a
feather sessions/demonstrations
15 January 1994 - Notification of acceptance of abstract
15 February 1994 - Application for financial aid
============================= INTENT TO PARTICIPATE ==========================
First Name:
Middle Initial (if available):
Family Name:
Your Professional Title:
[ ]Dr.
[ ]Professor
[ ]Mr.
[ ]Mrs.
[ ]Ms.
[ ]Other:__________________
Office Phone (desk):
Office Phone (message):
Home/Evening Phone (for emergency contact):
Fax:
Electronic Mail (Bitnet):
Electronic Mail (Internet):
Postal Address:
Institution or Center:
Building Code:
Mail Stop:
Street Address1:
Street Address2:
City:
State:
Country:
Zip or Country Code:
Please list your three major interest areas:
Interest1:
Interest2:
Interest3:
===================================================================
From ml-connectionists-request@q.cs.cmu.edu Wed May 5 17:08:32 1993
Date: Wed, 5 May 1993 18:01:02 +0200
From: Tagliaferri Roberto <robtag@udsab.dia.unisa.it>
Message-Id: <199305051601.AA21642@udsab.dia.unisa.it>
To: Connectionists@cs.cmu.edu
Subject: WIRN 93 Programme
Status: RO
Istituto Internazionale Alti Studi Scientifici (IIASS)
Dipartimento di Fisica Teorica, Universita` di Salerno
Dipartimento di Informatica ed Applicazioni, Universita` di Salerno
Dipartimento di Scienze dell'Informazione, Universita` di Milano
Istituto per la Ricerca dei Sistemi Informatici Paralleli, C.N.R., Napoli
Societa` Italiana Reti Neuroniche (SIREN)
6th ITALIAN WORKSHOP ON
NEURAL NETWORKS
WIRN VIETRI-93
IIASS Research Center
Ph: +39 89 761167
FAX:+39 89 761189
Vietri Sul Mare, Salerno, May 12-14, 1993
PRELIMINARY PROGRAM
Wednesday 12
9:00 Opening of the Workshop
9:30 S. Gielen (Invited Lecture)
11:00 Coffee break
11:30 Formal Models and Pattern Recognition
G. Basti, V. Bidoli et al. "Particle recognition on experimental data in a silicon calorimeter by back propagation with stochastic pre-processing"
A. Borghese "Learning optimal control using neural networks"
S. Brofferio, V. Rampa "A supervised-ART neural network for pattern recognition"
P. Pedrazzi "On self-organizing neural character recognizers"
V. Sanguineti, P. Morasso "Models of cortical maps"
L. Stringa "Experiments in memory-based learning"
13:00 Lunch break
15:00 Prof. Tredici (Review Lecture on Progress in Neuroanatomy)
16:00 Applications (1st part)
E. Coccorese, R. Martone, C. Morabito "Classification of plasma equilibria in a tokamak using a three-level propagation network"
E.D. Di Claudio, G. Trivelloni, G. Orlandi "Model identification of non linear dynamical systems by recurrent neural networks"
P. Morasso, A. Pareto, S. Pagliano, V. Sanguineti "A self-organizing approach for diagnostic problems"
17:00 Coffee Break
17:30 Hybrid and Robotic Systems
A. Chella, U. Maniscalco, R. Pirrone, F. Sorbello, P. Storniolo "A shape from shading hybrid approach to estimate superquadric parameters"
Z.M. Kovacs-V., R. Guerrieri, G. Baccarini "A hybrid system for handprinted character recognition"
A. Sperduti, A. Starita "Modular neural codes implementing neural trees"
Thursday 13
9:00 L. Zadeh (Invited Lecture)
11:00 Coffee Break
11:30 Fuzzy neural systems
E. Binaghi, A. Mazzetti, R. Orlando, A. Rampini "Integration of fuzzy reasoning techniques in the error back propagation learning algorithm"
M. Costa, E. Pasero "FNC: a fuzzy neural classifier with bayesian engine"
Zhiling Wiang, G. Sylos Labini "A self-organizing network of alterable competitive layer for pattern cluster"
13:00 Lunch Break
15:00 V. Cimagalli (Review Lecture on Cellular Networks)
16:00 - 17:30 Poster and Industrial Sessions
17:30 SIREN Annual Meeting
Friday 14
9:00 Y. Bengio, P. Frasconi and M. Gori (Review Lecture on Recurrent Networks for Adaptive Temporal Reasoning)
10:00 Applications (2nd part)
S. Cavalieri, A. Fichera "Exploiting neural network features to model and analyze noise pollution"
A.M. Colla, N. Longo, G. Morgavi, S. Ridella "SBP: A hybrid neural model for pattern recognition"
F. Piglione, G. Cirrincione "Neural-net based load-flow models for electric power systems"
11:00 Coffee Break
11:30 Hardware and Software Design
A. d'Acierno, R. Vaccaro "The back-propagation learning algorithm on parallel computers: a mapping scheme"
M. Gioiello, G. Vassallo, F. Sorbello "A new fully digital feed-forward network for hand-written digits recognition"
F. Lauria, M. Sette "CONNET: a neural network configuration language"
P. Wilke "Simulation of neural networks in a distributed computing environment using Neuro Graph"
13:00 Lunch Break
15:00 Architectures and Algorithms
M. Alberti, P. Marelli, R. Posenato "A neural algorithm for the maximum satisfiability problem"
E. Alpaydin "Multiple networks for function learning"
D. Micci Barreca, G.C. Buttazzo "A neural architecture for failure-based learning"
M. Schmitt "On the size of weights for McCulloch-Pitts neurons"
Registration fee: 275,000 Italian Lire (including proceedings and
social dinner).
No fee is charged for students.
From ml-connectionists-request@q.cs.cmu.edu Tue May 11 20:00:42 1993
Date: Tue, 11 May 93 09:57:21 PDT
From: Fogel <fogel@ece.ucsd.edu>
Message-Id: <9305111657.AA29071@sunshine.ucsd.edu>
To: connectionists@cs.cmu.edu
Subject: Email Digest for Evolutionary Programming
Status: RO
ANNOUNCING
EVOLUTIONARY PROGRAMMING EMAIL DIGEST
We are pleased to announce that as of May 10, 1993, an email digest
covering transactions on evolutionary programming will be available. The
digest is intended to promote discussions on a wide range of technical
issues in evolutionary optimization, as well as provide information on
upcoming conferences, events, journals, special issues, and other items of
interest to the EP community. Discussions on all areas of evolutionary
computation are welcomed, including artificial life, evolution strategies,
and genetic algorithms. The digest is meant to encourage interdisciplinary
communications. Your suggestions and comments regarding the digest are
always welcome.
To subscribe to the digest, send mail to ep-list-request@magenta.me.fau.edu
and include the line "subscribe ep-list" in the body of the text. Further
instructions will follow your subscription.
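The subscription request above is just a plain-text mail with "subscribe ep-list" in the body. As a sketch, it can be composed with Python's standard email library (the sender address is illustrative, and the actual smtplib send is left commented out since it requires a reachable mail server):

```python
from email.message import EmailMessage

# Build a subscription request for the EP digest as described above:
# plain-text mail to ep-list-request@magenta.me.fau.edu with
# "subscribe ep-list" as the body.
def make_subscribe_request(sender):
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = "ep-list-request@magenta.me.fau.edu"
    msg["Subject"] = "subscribe"
    msg.set_content("subscribe ep-list")
    return msg

msg = make_subscribe_request("researcher@example.edu")
print(msg["To"])
print(msg.get_content().strip())

# To actually send it (requires a live SMTP server):
# import smtplib
# with smtplib.SMTP("localhost") as s:
#     s.send_message(msg)
```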
The digest will be moderated by N. Saravanan of Florida Atlantic University.
Sincerely,
David Fogel
fogel@sunshine.ucsd.edu
N. Saravanan
saravan@amber.me.fau.edu
From ml-connectionists-request@q.cs.cmu.edu Fri May 7 21:18:32 1993
Date: Fri, 7 May 93 11:44:36 -0400
From: Jordan B Pollack <pollack@cis.ohio-state.edu>
Message-Id: <9305071544.AA07540@dendrite.cis.ohio-state.edu>
To: connectionists@CS.CMU.EDU
Subject: Tweaking NEUROPROSE
Reply-To: pollack@cis.ohio-state.edu
Ftp-Host: archive.cis.ohio-state.edu
Ftp-File: pub/neuroprose/README
Status: RO
*****do not forward to other groups*****
Good People,
There are problems of scale with NEUROPROSE, and no resources to fix
them properly. Therefore, after great thought about the laws of
unintended consequences, and with no insult intended to recent
articles, I am hereby tweaking the practices of NEUROPROSE, and I
trust you will all go along with me eventually:
1. No more multiple daily submissions; NEUROPROSE is supposed to be
for relevant preprints, not a vanity press or a medium for the
distribution of life works or annual reports.
2. Make sure your paper is single-spaced, even as a draft,
so as to save paper.
3. Please announce the NUMBER OF PAGES with the announcement, so
people are not surprised by empty laser printer trays. In your request
to me, it would help to have a formatted INDEX entry with the page
count as well (see appendix).
4. Before announcing, have a friend at another institution retrieve
and print the file, so as to avoid easily found local postscript
library errors. Lots of resources are wasted when the files
do not print.
5. Add the following two lines to your mail header, or the top of your
message, so as to facilitate the development of mailer scripts and
macros which can automatically retrieve files from both NEUROPROSE and
other lab-specific repositories (Thanks to Dave Plaut's sense of humor):
FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/filename.ps.Z
6. Finally, unless you are posting a file with non-standard ftp arrangements,
like a tar.Z file, leave the instructions off, as everyone knows at
this point how to get and uncompress and print a postscript file!
I have amended the README file to this effect. Please send comments to
me for discussion, rather than the whole mailing list.
Thanks.
Jordan Pollack Assistant Professor
CIS Dept/OSU Laboratory for AI Research
2036 Neil Ave Email: pollack@cis.ohio-state.edu
Columbus, OH 43210 Phone: (614)292-4890 (then * to fax)
From ml-connectionists-request@q.cs.cmu.edu Tue May 11 16:26:04 1993
From: Mike Wynne-Jones <mikewj@signal.dra.hmg.gb>
Date: Tue, 11 May 93 12:00:55 +0100
Message-Id: AA08130@milne.dra.hmg.gb
To: connectionists@cs.cmu.edu
Subject: Neural nets applications meeting in UK
Status: RO
***********************************
NEURAL COMPUTING APPLICATIONS FORUM
***********************************
23 - 24 June 1993
Fitzwilliam College, Cambridge University, UK
*****************************************
PRACTICAL APPLICATIONS OF NEURAL NETWORKS
*****************************************
Neural Computing Applications Forum is the primary meeting place for
people developing Neural Network applications in industry and
academia. It has 150 members from the UK and Europe, from
universities, small companies and big ones, and holds four main
meetings each year. It has been running for 3 years, and is cheap to
join.
This meeting spans two days with informal workshops on 23 June and the
main meeting comprising talks about neural network techniques and
applications on 24 June.
*********
WORKSHOPS - these talks are planned; additional short talks are sought.
*********
**********************************************************
Constructing structured networks of Radial Basis Functions
23 June, 13.00 to 15.00
**********************************************************
Including :
Robert Debenham (Logica Cambridge):
"Online construction of RBFs during training"
Richard Bostock (Aston University):
"Bump-tree construction by genetic algorithms"
*********************************************************
Self Organising Networks
23 June, 15.30 to 17.30
*********************************************************
Including:
Nigel Allinson (York University):
"Self Organising Networks: fast training, case studies and digital
implementations"
************************************************************
Evening: Punting on the Cam followed by liquid refreshments!
************************************************************
*****************************
MAIN MEETING - 24 June 1993
*****************************
8.30 Registration
9.05 Welcome
9.15 Douglas Kell (University of Wales):
"Detection of impurities in olive oil"
9.55 Mahesan Niranjan (University of Cambridge):
"On-line learning algorithms for prediction and
control applications"
10.30 Coffee
11.00 Tony Robinson (University of Cambridge):
"Application of recurrent nets to phone probability
estimation in speech recognition"
11.40 Prof. Cabrol-Bass (LARTIC, France):
"Indices for the Evaluation of Neural Network Performance
as classifiers: Application to Structural Elucidation
in Infra Red Spectroscopy"
12.15 Lunch
2.00 Stephen Roberts (Oxford University):
"Probabilistic Growth of RBFs for detection of novelty"
2.40 Dave Cressy (Logica Cambridge Research):
"Neural Control of an Experimental Batch Distillation Column"
3.15 Tea
3.40 Tom Harris (Brunel University):
"Kohonen nets in machine health monitoring"
4.10 Discussions
4.30 Close
ACCOMMODATION is available in Fitzwilliam College at 30 pounds (single)
and 47 pounds (twin), and **MUST** be booked and paid for in advance.
There are also lots of hotels in Cambridge.
*****************
Application
*****************
Members of NCAF get free entry to all meetings for a year. (This is
very good value - main meetings, tutorials, special interest
meetings). It also includes subscription to Springer Verlag's
new(ish) journal "Neural Computing and Applications".
Full membership: 250 pounds.
- anybody in your small company / research group in a big company.
Individual membership: 140 pounds
- named individual only.
Student membership (with journal): 55 pounds
- copy of student ID required.
Student membership (no journal, very cheap!): 25 pounds
- copy of student ID required.
Entry to this meeting without membership costs 35 pounds for the
workshops, and 80 pounds for the main day.
Payment in advance if possible; a 5-pound charge applies for issue of
an invoice if credit is required; an official order number is needed.
Email enquiries to Mike Wynne-Jones, mikewj@signal.dra.hmg.gb.
Postal to Mike Wynne-Jones, NCAF, PO Box 62, Malvern, WR14 4NU, UK.
Fax to Mike Wynne-Jones, (+44/0) 684 894384