Xref: sparky sci.physics:21287 sci.math:17022
Path: sparky!uunet!pipex!bnr.co.uk!uknet!ieunet!tcdcs!maths.tcd.ie!tim
From: tim@maths.tcd.ie (Timothy Murphy)
Newsgroups: sci.physics,sci.math
Subject: Re: Chaitin's Omega and QM (was: Bayes' theorem and QM)
Message-ID: <1992Dec16.025906.21295@maths.tcd.ie>
Date: 16 Dec 92 02:59:06 GMT
References: <1gh518INN9eo@chnews.intel.com> <SRCTRAN.92Dec14122532@world.std.com> <COLUMBUS.92Dec14142145@strident.think.com> <1992Dec14.223229.22348@galois.mit.edu>
Organization: Dept. of Maths, Trinity College, Dublin, Ireland.
Lines: 55

jbaez@riesz.mit.edu (John C. Baez) writes:

>I salute Michael Weiss' post introducing people to the literature on
>the number Omega, algorithmic information theory, algorithmic
>randomness theory etc. This stuff is seriously cool and should be
>known by all those who love beautiful ideas. Here is a nice
>bibliography of stuff along these lines that was passed on to me by Dave
>DeMers. Unfortunately it doesn't have a reference to Martin-Loef, who
>(along with Kolmogorov, Solomonoff, Chaitin and others) deserves a lot
>of credit.

Thank you very much for your bibliography,
which I know I shall find very useful.

It seems to me that the central concept of Kolmogorov/Chaitin theory
is _entropy_. As I see it, there have been 4 phases
in the historical development of this concept --
or perhaps one should say that the same term
has been applied to 4 related concepts:

1. Thermodynamics
   Increase of entropy (S) as the measure of irreversibility.

2. Statistical mechanics
   S = -\sum_i p_i \log p_i
   (see the short sketch after this list)

3. Shannon's information theory
   Entropy as a measure of information.

4. Chaitin/Kolmogorov algorithmic information theory
   Entropy in the language of Turing machines.

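To fix ideas, here is a minimal sketch of the formula in senses 2/3,
in Python; the function name and the sample distributions are my own,
purely for illustration:

    import math

    def shannon_entropy(probs):
        # S = -sum_i p_i log2(p_i), measured in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.47 bits

A fair coin carries the most uncertainty per toss; any bias lowers
the figure, exactly as the formula demands.
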
But does entropy in the 4th sense
completely displace Shannon's definition?
Can everything in Shannon theory
be expressed in algorithmic terms?

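One concrete bridge between senses 3 and 4: for a sequence drawn
independently from a fixed distribution, the expected program-length
(algorithmic) entropy per symbol approaches the Shannon entropy.
A crude way to get the flavour of this is to let a general-purpose
compressor stand in for "shortest program" -- a sketch only, under
that assumption, since zlib is nothing like an optimal universal code:

    import math, random, zlib

    def shannon_entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    random.seed(0)
    p = 0.9                                  # biased binary source
    n = 100000
    sample = bytes(1 if random.random() < p else 0 for _ in range(n))

    print(shannon_entropy([p, 1 - p]))            # about 0.47 bits/symbol
    print(8 * len(zlib.compress(sample, 9)) / n)  # compressed bits/symbol

The compressed figure should come out above the Shannon value -- no
lossless code can beat -\sum p\log p on average -- but it tracks the
bias, which is the correspondence I have in mind.
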
Equally, one might ask whether statistical mechanics
could be explained in terms of information.
Are the 2 kinds of entropy really the same?
Or is this simply to confuse 2 quantities
because they are defined by similar formulae?

I'd be interested to know the world's views on this!
--
Timothy Murphy
e-mail: tim@maths.tcd.ie
tel: +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland