- Xref: sparky comp.compression:2988 comp.theory:1744
- Newsgroups: comp.compression,comp.theory
- Path: sparky!uunet!mcsun!ieunet!tcdcs!maths.tcd.ie!tim
- From: tim@maths.tcd.ie (Timothy Murphy)
- Subject: Re: Kolmogorov Complexity
- Message-ID: <1992Aug12.115317.13558@maths.tcd.ie>
- Organization: Dept. of Maths, Trinity College, Dublin, Ireland.
- References: <1992Aug11.164651.1328@kronos.arc.nasa.gov>
- Date: Wed, 12 Aug 1992 11:53:17 GMT
- Lines: 30
-
- wray@kronos.arc.nasa.gov (Wray Buntine) writes:
-
- >Kolmogorov Complexity is one of a family of methods for handling the
- >induction problem. As pointed out, a Turing machine can be used as the
- >reference system because all others will only differ by a
- >constant. The main point to realise is that Kolmogorov Complexity stuff
- >is asymptotic, i.e. it holds as your sample sizes for learning
- >approach infinity. My learning problems unfortunately have
- >finite sample sizes, so I need to choose a reference for which I think
- >the constant in question will be small.
-
- Does this make sense?
- You imply that there is some optimal choice of universal machine.
- But the constant is for one machine relative to another.
- In other words, one machine may do well with one set of strings,
- and another machine with a different set.
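- 
- To be concrete: as I recall the invariance theorem, for any two
- universal machines U and V there is a constant c(U,V), depending on
- the two machines but not on the string x, such that
- 
-     K_U(x) <= K_V(x) + c(U,V)    for all strings x.
- 
- Nothing there singles out one machine as best: c(U,V) and c(V,U) may
- both be large, and which machine gives the shorter description can
- depend on x.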
-
- At the same time, it is conceivable that there is some absolute criterion.
- I wonder if anything is known about this?
-
- Incidentally, it is not quite the same to say
- that something is defined asymptotically,
- and that it is defined up to a constant.
- The latter is stronger.
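- For instance, f(n) = n + sqrt(n) and g(n) = n are asymptotically equal
- (their ratio tends to 1), yet their difference is unbounded, so they do
- not agree up to a constant; the converse does hold whenever the
- quantities themselves grow without bound.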
-
- --
- Timothy Murphy
- e-mail: tim@maths.tcd.ie
- tel: +353-1-2842366
- s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland
-