Newsgroups: comp.compression
Path: sparky!uunet!cs.utexas.edu!zaphod.mps.ohio-state.edu!rpi!sarah!newserve!bingsuns!kym
From: kym@bingsuns.cc.binghamton.edu (R. Kym Horsell)
Subject: Re: more entropy
Message-ID: <1992Jul24.195733.3757@newserve.cc.binghamton.edu>
Sender: usenet@newserve.cc.binghamton.edu (Mr News)
Nntp-Posting-Host: bingsuns.cc.binghamton.edu
Organization: State University of New York at Binghamton
References: <1992Jul23.174740.14559@usenet.ins.cwru.edu> <1992Jul24.003709.21603@s1.gov>
Date: Fri, 24 Jul 1992 19:57:33 GMT
Lines: 17

In article <1992Jul24.003709.21603@s1.gov> lip@s1.gov (Loren I. Petrich) writes:
>In article <1992Jul23.174740.14559@usenet.ins.cwru.edu> daf10@po.CWRU.Edu (David A. Ferrance) writes:
>
>>If I have an unsigned int count[256][256], what is wrong with
>>calculating entropy like this:
>
>>for (i=0;i<256;i++) for (j=0;j<256;j++) {
>>	freq = count[i][j] / total;
>>	ent += freq * log10(1/freq) / 0.30103;
>>	}
>	Yes, some versions of C do have a "log2" function (logarithm
>to base two).
But the choice of log10 is somewhat ``unusual'' given that M_LN2 is
typically defined in math.h.

-kym