- To: kg@mykonos.rc.rit.edu (Kyriakos Georgiou)
- Subject: Re: JBIG compression
- Cc: sam@cthulhu.engr.sgi.com, tiff@sgi.sgi.com
- Date: Fri, 16 Dec 1994 16:04:19 CST
- From: rick@digibd.com (Rick Richardson)
-
- Kyriakos Georgiou writes...
- >
- > In response to Sam's comments:
- >
- > > Also, can the current libtiff G4 decompression implementation be
- > > improved ? What sort of improvements can be expected ?
- > >
- > > What's wrong with the current implementation?
- >
- > The current implementation is fine, but I have in my hands a commercial
- > product (no source) that does decompression in noticeably less time.
- > That suggests that there are faster ways to decompress G4 in software,
- > hence my question.
-
- I cannot speak for the G4 decompression in libtiff, but I can vouch for the
- fact that the libtiff G3 decompression is not as fast as at least my own
- implementation. However, I am not at liberty to provide the source code
- since it belongs to my employer.
-
- In my experience with G3, the fastest way was to completely unroll
- the decompression. I tried numerous 'elegant' algorithmic approaches
- over a period of about three years and found that the simple
- brute-force unrolled approach was always fastest. The decoder itself is
- something like 700 lines of nested C if-else statements.
-
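- To make the idea concrete, here is a minimal sketch of that unrolled
- style.  The codewords below are made up purely for illustration (they
- are not the real T.4 run-length tables), and the bit reader is a
- trivial helper invented for this example:
-
-     #include <stdio.h>
-
-     typedef struct {
-         const unsigned char *buf;   /* compressed data             */
-         unsigned bitpos;            /* next bit to read, MSB first */
-     } BitReader;
-
-     static int next_bit(BitReader *br)
-     {
-         int bit = (br->buf[br->bitpos >> 3] >> (7 - (br->bitpos & 7))) & 1;
-         br->bitpos++;
-         return bit;
-     }
-
-     /* Decode one made-up codeword by testing bits directly:
-      *   0 -> run 1, 10 -> run 2, 110 -> run 3, 111 -> run 4.
-      * A real G3 decoder unrolls the full T.4 tables the same way,
-      * which is how you end up with hundreds of lines of if-else. */
-     static int decode_run(BitReader *br)
-     {
-         if (next_bit(br) == 0)
-             return 1;                   /* 0   */
-         if (next_bit(br) == 0)
-             return 2;                   /* 10  */
-         if (next_bit(br) == 0)
-             return 3;                   /* 110 */
-         return 4;                       /* 111 */
-     }
-
-     int main(void)
-     {
-         /* bit stream: 0 10 110 111 -> runs 1, 2, 3, 4 */
-         const unsigned char data[] = { 0x5B, 0x80 };
-         BitReader br = { data, 0 };
-         int i;
-         for (i = 0; i < 4; i++)
-             printf("run %d\n", decode_run(&br));
-         return 0;
-     }
-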
- The algorithm that Peter Deutsch uses in the 3.x versions of
- Ghostscript is similar to the algorithm you seem to be describing.
- My feeling on all this is that the current decoder in the library is
- reasonably portable and plenty fast enough for most people's needs.
- Until it's not, I have no motivation to change it. I'd welcome a
- replacement for it. I'd also welcome some good performance analysis
- of the existing code. I wouldn't stop with the G3/G4 decoder, however;
- hit the other ones too! (I'm sure folks would like a faster LZW decoder
- too.)
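-
- As a starting point for that kind of analysis, simply timing
- TIFFReadScanline() over an entire image gives a usable number.  A
- rough sketch (the file name is just a placeholder, and the integer
- types follow the classic libtiff headers):
-
-     #include <stdio.h>
-     #include <stdlib.h>
-     #include <time.h>
-     #include <tiffio.h>
-
-     int main(int argc, char *argv[])
-     {
-         const char *name = argc > 1 ? argv[1] : "fax.tif";
-         TIFF *tif = TIFFOpen(name, "r");
-         uint32 length = 0, row;
-         void *buf;
-         clock_t start, stop;
-
-         if (tif == NULL) {
-             fprintf(stderr, "cannot open %s\n", name);
-             return 1;
-         }
-         TIFFGetField(tif, TIFFTAG_IMAGELENGTH, &length);
-         buf = malloc(TIFFScanlineSize(tif));
-
-         start = clock();
-         for (row = 0; row < length; row++)
-             TIFFReadScanline(tif, buf, row, 0);
-         stop = clock();
-
-         printf("%lu rows decoded in %.3f seconds\n",
-                (unsigned long) length,
-                (double) (stop - start) / CLOCKS_PER_SEC);
-         free(buf);
-         TIFFClose(tif);
-         return 0;
-     }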
-
- Sam
-