New Super-Compression?

Hal Duston hald at sound.net
Thu Nov 28 18:44:51 CST 2002


Duane Attaway wrote:
> 
> On Wed, 27 Nov 2002, Scott Long wrote:
> 
> > A friend of mine is a consultant for a local company (who would remain
> > nameless, but I've gotta give you guys a link for this) and they've partnered
> > with a company that supposedly has a new compression algorithm that can
> > dramatically reduce the size of files (for example, from 8MB to less than
> > 100k for an image).  You can find out info about it at:
> >
> > http://www.entrekc.com/products/compression.html
> 
> I love the claim "visually lossless."  This sounds like the company that
> claimed lossless compression of random data.  The claim stated that you
> could put any stream of random data through their compressor and it would
> be reduced by at least one byte.  Send it in again and you would have a
> further reduction.  Repeat until you had a lossless file the size of one
> byte.  Yes sir!  A one gigabyte file compressed down to one byte.  If it's
> good enough for the US Patent Office, it's good enough for you!  Just don't
> get your single-byte compressed files mixed up.

That was WEB Technologies and their DataFiles/16 product.

See http://www.faqs.org/faqs/compression-faq/part1/section-8.html.  Look at 
item 9.2 "The Counting Argument" for an explanation of why NO compression 
algorithm can compress ALL files.  The short story is that at least two 
different files would end up compressing to EXACTLY the SAME file.  Then 
how would you decompress that file back into one of the two originals?
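The counting argument is just the pigeonhole principle: there are 2^n bit
strings of exactly length n, but only 2^n - 1 strings of any shorter length,
so a compressor that shrinks every length-n input must send two different
inputs to the same output.  A minimal sketch of the arithmetic (the function
names are mine, not from the FAQ):

```python
# Pigeonhole sketch of the counting argument: a lossless compressor that
# shrinks EVERY length-n input is impossible, because there are more
# length-n inputs than there are shorter outputs to map them to.

def strings_of_length(n: int) -> int:
    """Number of distinct bit strings of exactly length n."""
    return 2 ** n

def strings_shorter_than(n: int) -> int:
    """Number of distinct bit strings of length 0 through n-1.

    This is the geometric sum 1 + 2 + 4 + ... + 2**(n-1) = 2**n - 1.
    """
    return 2 ** n - 1

# For every n, inputs outnumber possible shorter outputs by exactly one,
# so at least two inputs must collide on the same compressed output.
for n in range(1, 21):
    assert strings_of_length(n) > strings_shorter_than(n)

print(strings_of_length(8), strings_shorter_than(8))  # 256 255
```

A collision means decompression is ambiguous, which is exactly the problem
described above: the decompressor cannot know which of the two original
files to reproduce.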

Hal
