- Newsgroups: comp.compression
- Path: sparky!uunet!gatech!usenet.ins.cwru.edu!po.CWRU.Edu!daf10
- From: daf10@po.CWRU.Edu (David A. Ferrance)
- Subject: entropy
- Message-ID: <1992Jul22.144104.27576@usenet.ins.cwru.edu>
- Sender: news@usenet.ins.cwru.edu
- Nntp-Posting-Host: cwns5.ins.cwru.edu
- Reply-To: daf10@po.CWRU.Edu (David A. Ferrance)
- Organization: Case Western Reserve University, Cleveland, OH (USA)
- Date: Wed, 22 Jul 92 14:41:04 GMT
- Lines: 22
-
-
- Well, here it is, another question about entropy.
-
- Right now I compute the entropy of a file by summing, over each
- character, the probability of that character occurring times the log
- (base 2) of the reciprocal of that probability.  This gives me the
- theoretical minimum number of bits needed to represent each character,
- on average (from what I understand).  What I would like to know is:
- does this number only apply to first-order (1-ary?) compression
- techniques, or does it also serve as a limit for more complex
- techniques?  If it does not hold for the more complex techniques (and
- I suspect it doesn't), is there any way to determine entropies for
- different orders?
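-
- For concreteness, here is a rough Python sketch of the calculation I
- mean.  The second function is only my guess at what an "order-1"
- entropy might look like (each character conditioned on the one before
- it), so take it as an illustration rather than a standard definition:
-
-     #!/usr/bin/env python3
-     # Illustration only: per-character ("order-0") entropy of a file,
-     # computed as sum over characters of p(c) * log2(1/p(c)), plus a
-     # guessed "order-1" conditional entropy for comparison.
-
-     import math
-     import sys
-     from collections import Counter
-
-     def order0_entropy(data):
-         """Bits per character, treating each byte as independent."""
-         if not data:
-             return 0.0
-         counts = Counter(data)
-         total = len(data)
-         return sum((n / total) * math.log2(total / n)
-                    for n in counts.values())
-
-     def order1_entropy(data):
-         """Average entropy of a character given the one before it,
-         i.e. H(X_n | X_{n-1}) in bits per character."""
-         if len(data) < 2:
-             return 0.0
-         pair_counts = Counter(zip(data, data[1:]))
-         context_counts = Counter(data[:-1])
-         total = len(data) - 1
-         h = 0.0
-         for (ctx, sym), n in pair_counts.items():
-             p_pair = n / total                # p(ctx, sym)
-             p_cond = n / context_counts[ctx]  # p(sym | ctx)
-             h += p_pair * math.log2(1.0 / p_cond)
-         return h
-
-     if __name__ == "__main__":
-         data = open(sys.argv[1], "rb").read()
-         print("order-0 entropy: %.4f bits/char" % order0_entropy(data))
-         print("order-1 entropy: %.4f bits/char" % order1_entropy(data))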
-
- Forgive me if I destroy any terminology; this is a new and fun area for
- me.
-
- Dave
- --
- David Ferrance Sigma Nu Delta Alpha 1061 daf10@po.cwru.edu
-
- And on the seventh day, He exited from append mode.
-