- Newsgroups: comp.compression
- Path: sparky!uunet!europa.asd.contel.com!darwin.sura.net!mips!sdd.hp.com!usc!cs.utexas.edu!torn!watserv1!watmath!bbobak
- From: bbobak@math.waterloo.edu (Brad Bobak)
- Subject: Re: sizing huffman trees
- Message-ID: <1992Jul22.142705.26970@math.waterloo.edu>
- Organization: University of Waterloo
- References: <1992Jul21.220205.18507@math.waterloo.edu> <1992Jul22.003239.15971@s1.gov>
- Date: Wed, 22 Jul 1992 14:27:05 GMT
- Lines: 30
-
- In article <1992Jul22.003239.15971@s1.gov>, lip@s1.gov (Loren I. Petrich) writes:
- > In article <1992Jul21.220205.18507@math.waterloo.edu> bbobak@math.waterloo.edu (Brad Bobak) writes:
- > > I was wondering if anyone has come up with an algorithm for determining
- > > the size of an encoded Huffman tree from just the frequencies of the
- > > letters (sorted or unsorted, it doesn't matter).
- > > Right now, to size a tree, I'm basically building it, and this is
- > > time-consuming if I'm using a lot of trees.
- > > Any pointers, suggestions, etc. would be appreciated.
- >
- > There are several algorithms for building Huffman trees.
- > Perhaps the simplest is the original Huffman one.
- >
- > Repeat to build all the nodes:
- >
- > Look through all the unconnected nodes and symbols to find
- > the two with the smallest frequencies.
- >
- > Connect these to a new node and make the new node's frequency
- > the sum of the two branches' frequencies.
- >
- >
- > I'm sure that this will be fairly efficient unless one has a
- > terribly skewed distribution.
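-
- As a rough sketch of that merging loop in C (the names, array sizes, and
- sample frequencies below are illustrative only):
-
-     /* Repeatedly find the two smallest unconnected weights and join them.
-        The freq[] array mixes leaf frequencies and internal-node sums. */
-     #include <stdio.h>
-
-     #define NSYM 6
-
-     int main(void)
-     {
-         long freq[2 * NSYM];          /* leaves first, then internal nodes */
-         int  used[2 * NSYM] = {0};    /* 1 once a node has been connected  */
-         long sample[NSYM] = { 45, 13, 12, 16, 9, 5 };
-         int  i, a, b, next;
-
-         for (i = 0; i < NSYM; i++)
-             freq[i] = sample[i];
-
-         for (next = NSYM; next < 2 * NSYM - 1; next++) {
-             /* scan all unconnected nodes for the two smallest, a and b */
-             a = b = -1;
-             for (i = 0; i < next; i++) {
-                 if (used[i])
-                     continue;
-                 if (a < 0 || freq[i] < freq[a]) {
-                     b = a;
-                     a = i;
-                 } else if (b < 0 || freq[i] < freq[b]) {
-                     b = i;
-                 }
-             }
-             used[a] = used[b] = 1;
-             freq[next] = freq[a] + freq[b];   /* the new node's frequency */
-             printf("merge %ld + %ld -> %ld\n",
-                    freq[a], freq[b], freq[next]);
-         }
-         return 0;
-     }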
-
- Yes, I'm aware of this, and have an efficient algorithm for building
- trees. What I want to be able to do, though, is to determine the size of
- the output a Huffman tree will generate from just the frequencies.
- Since I'm expecting to calculate the size a lot (> 500 times), I can't
- afford to pseudo-build the tree each time just to find the output size.
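-
- One property worth noting here: the total coded length in bits equals the
- sum of the weights of all the internal nodes formed during the merging
- (equivalently, the sum of freq[i] * depth[i]), so the size can be tallied
- without ever allocating tree nodes or pointers. If the frequencies are
- already sorted ascending, two simple FIFO queues give the answer in linear
- time, since the internal sums come out in nondecreasing order. A rough C
- sketch under those assumptions (names and sample frequencies are
- illustrative only):
-
-     #include <stdio.h>
-
-     /* Total output bits for a Huffman code over n sorted leaf weights.
-        Assumes n <= 257 because of the fixed-size internal-node queue. */
-     long huff_coded_bits(const long *leaf, int n)
-     {
-         long node[256];            /* internal-node weights, FIFO order */
-         long total = 0, w, x;
-         int  li = 0, ni = 0, nn = 0, k;
-
-         if (n < 2)
-             return 0;              /* a single symbol needs no bits here */
-
-         for (k = 0; k < n - 1; k++) {
-             /* take the smaller front of the leaf and node queues, twice,
-                to get the two smallest unconnected weights */
-             if (ni < nn && (li >= n || node[ni] <= leaf[li]))
-                 w = node[ni++];
-             else
-                 w = leaf[li++];
-             if (ni < nn && (li >= n || node[ni] <= leaf[li]))
-                 x = node[ni++];
-             else
-                 x = leaf[li++];
-
-             node[nn++] = w + x;    /* new internal node */
-             total += w + x;        /* its weight is its bit contribution */
-         }
-         return total;
-     }
-
-     int main(void)
-     {
-         long f[] = { 5, 9, 12, 13, 16, 45 };      /* sorted ascending */
-         printf("%ld bits\n", huff_coded_bits(f, 6));   /* prints 224 bits */
-         return 0;
-     }
-
- With the frequencies kept sorted (they only need sorting once), each of the
- > 500 size calculations is then a single linear pass.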
-
-