- Path: sparky!uunet!cis.ohio-state.edu!ucbvax!dog.ee.lbl.gov!overload.lbl.gov!s1.gov!lip
- From: lip@s1.gov (Loren I. Petrich)
- Newsgroups: comp.compression
- Subject: Re: 16-1 compression
- Message-ID: <1992Jul28.233030.19035@s1.gov>
- Date: 28 Jul 92 23:30:30 GMT
- References: <1992Jul28.165529.18628@lugb.latrobe.edu.au>
- Sender: usenet@s1.gov
- Organization: LLNL
- Lines: 25
- Nntp-Posting-Host: s1.gov
-
- In article <1992Jul28.165529.18628@lugb.latrobe.edu.au> HCHA8904593X@LUST2.LATROBE.EDU.AU (PERRETT, ANDREW) writes:
- >This has probably been FAQ'd to death but can anybody tell me if the report in
- >Byte of two months ago was correct ie come mob claiming a 16-1 compression
- >ratio, and being able to do it again to the same file ?? Stranger things have
- >happened (hell the earths round!) but i think ill remain agnostic till i get to
- >turn my HD into a couple of gigabytes...
-
-
- In fact, this newsgroup's FAQ already covers this supposed
- compression method, called "WEB" (if I remember correctly). The claim
- is that it losslessly compresses all kinds of files by a factor of
- 16, and that it can be applied again to its own output.
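-
- The "can be repeated" part is the easiest to sanity-check. As a
- back-of-the-envelope sketch (mine, not part of the original post),
- here is what iterating a 16:1 compressor would imply for a 1 GB file,
- in Python:
-
-     size = 2 ** 30                  # a 1 GB file, in bytes
-     for step in range(1, 8):
-         size //= 16                 # apply the claimed 16:1 compression again
-         print(step, size)
-     # After seven passes only 4 bytes remain, which obviously cannot
-     # be decompressed back into an arbitrary gigabyte of data.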
-
- It is easy to show that _lossless_ compression (where one can
- recover the original file exactly) of _every_ possible file is an
- impossibility. The argument is a counting one: there are 2^n possible
- files of n bits, but fewer than 2^n files that are strictly shorter,
- so any compressor that shortens every file must send at least two
- different original files to the same compressed file, and that
- compressed file cannot then be decompressed unambiguously.
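-
- To make the counting concrete, here is a minimal sketch in Python
- (mine, not part of the original post); the 128-bit file length and
- the 16:1 ratio are just illustrative numbers:
-
-     # Pigeonhole counting for a hypothetical compressor that shrinks
-     # every file by 16:1.
-
-     def num_files(length_bits):
-         # Number of distinct files of exactly this many bits.
-         return 2 ** length_bits
-
-     def num_files_up_to(length_bits):
-         # Number of distinct files of at most this many bits (including empty).
-         return 2 ** (length_bits + 1) - 1
-
-     n = 128                              # any original length, in bits
-     originals = num_files(n)             # 2**128 possible inputs
-     outputs = num_files_up_to(n // 16)   # at most 511 possible 16:1 outputs
-
-     # Far more inputs than outputs: at least two inputs must share an
-     # output, so no decompressor can recover both of them correctly.
-     assert originals > outputs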
-
- For _lossy_ compression (as with sound or image data), and also
- for lossless compression applied to files with exploitable structure
- (as in text files, with their internal patterns), the story is
- different.
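-
- As a small illustration (again mine, not from the original post),
- any stock compressor, for example Python's zlib module, shows the
- difference between structured data and pattern-free data:
-
-     import os
-     import zlib
-
-     text = b"the quick brown fox jumps over the lazy dog\n" * 1000
-     noise = os.urandom(len(text))   # bytes with no internal patterns
-
-     # The redundant text shrinks dramatically; the random bytes do not.
-     print(len(text), "->", len(zlib.compress(text)))
-     print(len(noise), "->", len(zlib.compress(noise)))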
-