In article <1992Nov16.232015.15970@coe.montana.edu>, bithead@cs.montana.edu (Bob Wall) writes:
|> In article <1992Nov13.120505.29654@spectrum.xerox.com> landells.sbd-e@rx.xerox.com writes:
|> >I have an application that generates binary output. The output is relatively random, but there are approximately twice as many off bits as on bits. My objective is to compress this as much as possible.
|> >
|> >I have tried several 'standard' compressors (arj 2.2, lharc, pkzip 1.1) and have only managed to achieve very minimal compression, on the order of 4% at best (on a 40K file). Now I know that a truly random binary datastream cannot be compressed, but I was kind of hoping for better than 4%. Am I missing something fundamental, or is this really the best that can be achieved?
|> >
|> >If there is a technique for compressing this type of data, I would appreciate pointers to some source code that implements it.
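
For what it's worth, if the bits really are independent with P(0) = 2/3 and
P(1) = 1/3, Shannon entropy puts a hard ceiling on what any compressor can do:
H = -(2/3)log2(2/3) - (1/3)log2(1/3), which works out to about 0.918 bits per
input bit, i.e. at most roughly an 8.2% reduction. So 4% is already about half
of the theoretical maximum. Byte-oriented LZ compressors like arj and pkzip
mostly can't exploit bit-level skew; a bit-oriented order-0 arithmetic coder
should get much closer to the bound. Here is a quick sketch of the entropy
calculation (my own illustration, not anything from the original poster, and
it assumes the bits are i.i.d. with the stated 2:1 ratio):

#include <stdio.h>
#include <math.h>

int main(void)
{
    double p0 = 2.0 / 3.0;   /* assumed probability of an off bit */
    double p1 = 1.0 - p0;    /* probability of an on bit */

    /* Shannon entropy in bits per input bit: H = -sum p*log2(p) */
    double h = -(p0 * log(p0) + p1 * log(p1)) / log(2.0);

    printf("entropy          = %.4f bits/bit\n", h);
    printf("best reduction   = %.1f%%\n", (1.0 - h) * 100.0);
    return 0;
}

This prints an entropy of 0.9183 bits/bit and a best-case reduction of 8.2%.
Of course, if the on bits are not independent (e.g. they cluster), the true
entropy could be lower and a higher-order model could do better.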