- Xref: sparky sci.physics:21306 sci.math:17031
- Path: sparky!uunet!olivea!bu.edu!csd!lnd
- From: lnd@csd.bu.edu (Levin)
- Newsgroups: sci.physics,sci.math
- Subject: Algorithmic Information
- Message-ID: <105076@bu.edu>
- Date: 16 Dec 92 17:59:29 GMT
- References: <COLUMBUS.92Dec14142145@strident.think.com> <1992Dec14.223229.22348@galois.mit.edu> <COLUMBUS.92Dec15102022@strident.think.com>
- Sender: news@bu.edu
- Reply-To: Lnd@cs.bu.edu
- Followup-To: sci.physics
- Lines: 23
-
- In article <COLUMBUS.92Dec15102022@strident.think.com>
- columbus@strident.think.com (Michael Weiss) writes:
- > According to Li and Vitanyi, priority for the concept of algorithmic
- > entropy (or information) should go to Solomonoff, with independent
- > rediscovery by Kolmogorov and by Chaitin. Somewhat ironically they use
- > the term Kolmogorov complexity throughout! (I used the term Chaitin
- > entropy in my last post.) Probably the best term is "algorithmic
- > entropy" or "algorithmic information content", which Zurek uses.
-
- Kolmogorov's paper (Jan. 1965) is a few months later than Solomonoff's, but both
- of them gave talks years earlier (and did not know of each other's work). Solomonoff
- had crucial insights but did not introduce the complexity function explicitly.
- The functions he did introduce were ill-defined (limits diverged, etc.).
- Both proved the crucial invariance theorem, which holds to within +/- O(1).
- Also, each had important ideas not found by the other. So, I think, the
- use of Kolmogorov's name is well justified (as well as Solomonoff's).
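-
- For reference, the invariance theorem mentioned above can be stated in its
- usual form (a sketch in standard notation, which is my addition, not the
- original poster's):
-
```latex
% Invariance theorem (standard textbook form; the symbols K_U, K_V and
% c_{U,V} are conventional notation assumed here, not from the post):
% for any two universal machines U and V there is a constant c_{U,V},
% depending on U and V but not on x, such that
\forall x \in \{0,1\}^* :\quad
  \left| K_U(x) - K_V(x) \right| \;\le\; c_{U,V}
```
- That is, the choice of universal machine changes the complexity of every
- string by at most an additive constant, which is why the function is
- well-defined up to +/- O(1).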
-
- Chaitin published a partial (non-invariant, not far from Shannon's)
- result in 1966 and more important results in 1969 and 1975. It is hard to judge
- how independent he was, since he does not cite his predecessors even in
- cases where he certainly knew about them (his 1987 complexity book does not
- even mention the names of Kolmogorov and Solomonoff; see the review in
- {\em J. of Symbolic Logic} 54(2), pp. 624-627, 1989).
-