- Newsgroups: comp.compression
- Path: sparky!uunet!cs.utexas.edu!uwm.edu!csd4.csd.uwm.edu!markh
- From: markh@csd4.csd.uwm.edu (Mark)
- Subject: Re: Compressing English text to 1.75bits or better (80%)
- Message-ID: <1992Sep12.154713.14396@uwm.edu>
- Keywords: FAQ
- Sender: news@uwm.edu (USENET News System)
- Organization: Computing Services Division, University of Wisconsin - Milwaukee
- References: <1992Sep12.103552.24873@rhrk.uni-kl.de>
- Date: Sat, 12 Sep 1992 15:47:13 GMT
- Lines: 10
-
- In article <1992Sep12.103552.24873@rhrk.uni-kl.de> marpia@sun.rhrk.uni-kl.de (David Powers [Informatik]) writes:
- >Perhaps this information should go into the FAQ question [73] on
- >the theoretical limits of compression.
-
- I have in mind a compression algorithm. It has a list of every single
- text that was produced anywhere, is being produced, or ever will be
- produced, and indexes each with a unique 64-bit number.
-
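- A minimal sketch of the scheme (all names hypothetical): "compression" is a
- table lookup that always emits 8 bytes. The punch line is the pigeonhole
- principle: at most 2**64 distinct texts can ever be indexed, and the table
- itself still has to live somewhere.

```python
# Toy corpus standing in for "every text ever produced" (hypothetical).
corpus = ["the cat sat", "hello world", "comp.compression FAQ"]

index = {text: i for i, text in enumerate(corpus)}   # text -> 64-bit id
table = {i: text for i, text in enumerate(corpus)}   # 64-bit id -> text

def compress(text: str) -> bytes:
    # Every known text "compresses" to exactly 8 bytes (64 bits).
    return index[text].to_bytes(8, "big")

def decompress(code: bytes) -> str:
    return table[int.from_bytes(code, "big")]
```

- Decompression recovers the text only because the full table is shared out of
- band, which is where all the information actually resides.
-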
- This algorithm will compress every English text to 64 bits. Admittedly, its
- software implementation could be a memory and CPU hog, but still...
-