- Path: sparky!uunet!olivea!hal.com!decwrl!sdd.hp.com!usc!snorkelwacker.mit.edu!ai-lab!life.ai.mit.edu!tmb
- From: tmb@arolla.idiap.ch (Thomas M. Breuel)
- Newsgroups: comp.ai.neural-nets
- Subject: Re: Characterizing learnable functions
- Message-ID: <TMB.92Aug12171333@arolla.idiap.ch>
- Date: 12 Aug 92 21:13:33 GMT
- References: <1992Aug10.223138.25927@cco.caltech.edu>
- <1992Aug11.111206.25386@cs.tu-berlin.de> <arms.713550511@spedden>
- <1992Aug12.112845.1060@cs.tu-berlin.de>
- Sender: news@ai.mit.edu
- Reply-To: tmb@idiap.ch
- Organization: IDIAP (Institut Dalle Molle d'Intelligence Artificielle
- Perceptive)
- Lines: 25
- In-reply-to: async@opal.cs.tu-berlin.de's message of 12 Aug 92 11:28:45 GMT
-
- In article <1992Aug12.112845.1060@cs.tu-berlin.de> async@opal.cs.tu-berlin.de (Stefan M. Rueger) writes:
-
- >Maybe consumers of NNs should ask for and insist on Lipschitz conditions:
- > |output(a) - output(b)| <= const * | a - b |
- >
- >for all a, b in the domain of the function.
-
- These Lipschitz functions **are** continuous and thus covered by the
- results of Cybenko, described in my original article.
-
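- For reference, Cybenko's 1989 result states that finite sums of the
- form
-
-     G(x) = \sum_{i=1}^{N} \alpha_i \, \sigma(w_i \cdot x + \theta_i)
-
- with a fixed sigmoidal \sigma are dense in C([0,1]^n): for every
- continuous f and every \epsilon > 0 there is such a G with
- |f(x) - G(x)| < \epsilon for all x in [0,1]^n.
-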
- Unfortunately, imposing a Lipschitz condition is not enough. While a
- neural network might be able to approximate any Lipschitz function,
- the number of training examples (the "sample complexity") you need for
- actually finding a "good approximation" is provably too large in many
- cases.
-
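- To see roughly why, consider approximating an arbitrary L-Lipschitz
- function on [0,1]^d to accuracy \epsilon (a standard covering-number
- sketch, not a sharp bound): the training data must essentially pin
- the function down on every cell of a grid of spacing about
- \epsilon / L, which takes on the order of
-
-     (L / \epsilon)^d
-
- examples, i.e., a number exponential in the input dimension d.
-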
- Proving that a learning method is capable of representing a result is
- only a first step, and not even a decisive one. For example, you may
- be able to do most natural language parsing with small finite-state
- automata, even though they are theoretically not powerful enough to
- parse natural language. Conversely, even if your learning method can
- represent some result, you still need to consider questions of sample
- complexity and algorithmic efficiency.
-
- Thomas.
-