- Path: sparky!uunet!stanford.edu!rutgers!ub!acsu.buffalo.edu!ambati
- From: ambati@acsu.buffalo.edu (Balamural K. Ambati)
- Newsgroups: comp.ai.neural-nets
- Subject: Re: Kolmogorov's theorem
- Message-ID: <Btx1us.Kzr@acsu.buffalo.edu>
- Date: 1 Sep 92 20:20:52 GMT
- References: <BtuMKD.14p@acsu.buffalo.edu> <94995@bu.edu>
- Sender: nntp@acsu.buffalo.edu
- Organization: UB
- Lines: 21
- Nntp-Posting-Host: lictor.acsu.buffalo.edu
-
- lnd@cs.bu.edu (Levin) writes:
-
- >In article <BtuMKD.14p@acsu.buffalo.edu> ambati@ (Balamural K. Ambati) writes:
- >>Could someone post some info. on Kolmogorov's theorem on reconstruction
- >>of multidimensional functions and information processing in neural
- >>networks? Also refs. if possible. Thanks.
-
- >Every continuous multivariate real function can be represented as a depth-4
- >composition of continuous univariate functions and summation. This result of
- >Kolmogorov and his student Arnold solved Hilbert's 13th problem.
-
- What implications does this have for the capacity of neural networks?
-
- Thanks.
-
- Balamurali K. Ambati
-
- ambati@ubunix.buffalo.edu
-
-