- Newsgroups: comp.compression
- Path: sparky!uunet!usc!zaphod.mps.ohio-state.edu!magnus.acs.ohio-state.edu!usenet.ins.cwru.edu!agate!overload.lbl.gov!s1.gov!lip
- From: lip@s1.gov (Loren I. Petrich)
- Subject: Re: sizing huffman trees
- Message-ID: <1992Jul22.003239.15971@s1.gov>
- Sender: usenet@s1.gov
- Nntp-Posting-Host: s1.gov
- Organization: LLNL
- References: <1992Jul21.220205.18507@math.waterloo.edu>
- Date: Wed, 22 Jul 1992 00:32:39 GMT
- Lines: 23
-
- In article <1992Jul21.220205.18507@math.waterloo.edu> bbobak@math.waterloo.edu (Brad Bobak) writes:
- > I was wondering if anyone has come up with an algorithm for determining
- > the size of an encoded huffman tree from just the frequencies of the
- > letters (sorted or unsorted, it doesn't matter).
- > Right now, to size a tree, I'm basically building it, and this is
- > time-consuming if I'm using a lot of trees.
- > Any pointers, suggestions, etc. would be appreciated.
-
- There are several algorithms for building Huffman trees.
- Perhaps the simplest is the original Huffman one.
-
- Repeat to build all the nodes:
-
- Look through all the unconnected nodes and symbols to find
- the two with the smallest frequencies.
-
- Connect these to the new node and make the node's frequency
- the sum of the two branches' frequencies.
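- The merge loop above can be sketched with a min-heap. As a bonus,
- summing the weight of each merged node gives the total encoded
- length in bits directly, which answers the original sizing question
- without ever storing an explicit tree. (A sketch only; the function
- name and the sample frequencies below are my own.)

```python
import heapq

def huffman_encoded_bits(freqs):
    """Total bits needed to Huffman-encode symbols with these frequencies.

    The encoded length equals the sum of the weights of all internal
    (merged) nodes, so we only track weights, never tree structure.
    """
    heap = list(freqs)
    heapq.heapify(heap)          # min-heap of unconnected weights
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)  # the two smallest frequencies
        b = heapq.heappop(heap)
        total += a + b           # each symbol below this node costs one more bit
        heapq.heappush(heap, a + b)
    return total

# Textbook example: frequencies {5, 9, 12, 13, 16, 45} encode in 224 bits.
print(huffman_encoded_bits([5, 9, 12, 13, 16, 45]))  # prints 224
```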
-
-
- This will be fairly efficient: a linear scan for the two minima
- on each pass is O(n^2) in the number of symbols, and keeping the
- weights in a min-heap brings the whole construction down to
- O(n log n). A terribly skewed distribution deepens the tree but
- does not change the cost of building it.
-
-