Compression discussion

Slashdot had a post recently about a data compression contest. Some Wikipedia data was used as a sample for this contest. And, as usual with compression discussions at Slashdot, there were a lot of humorous threads. I liked these three suggestions in particular:

  1. Steer the contest requirements away from lossless compression. If that succeeded, the whole Wikipedia encyclopedia could be compressed into 1 bit.
  2. Use a random data generation method (such as the /dev/random device) to eventually regenerate the complete content.
  3. Keep the compressed data at a minimum size (1 bit), while making the decompression executable extremely large (the size of the uncompressed data) — see the sketch after this list.
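For what it's worth, the third joke fits in a few lines of code. Here is a minimal Python sketch of the idea (WIKIPEDIA_TEXT is just a placeholder for the real data, not anything from the contest): the "archive" carries no information at all, because everything hides inside the decompressor itself.

```python
# Joke #3 as code: a 1-bit "archive" plus a decompressor that
# embeds the entire original data.

# Hypothetical placeholder for the full uncompressed corpus.
WIKIPEDIA_TEXT = "...the entire Wikipedia dump would go here..."


def compress(data: str) -> str:
    # Pretend to compress: emit a single bit. The data is assumed
    # to already live inside the decompressor, so nothing is lost.
    return "1"


def decompress(archive: str) -> str:
    # Ignore the 1-bit archive entirely and simply emit the data
    # baked into this executable.
    return WIKIPEDIA_TEXT


if __name__ == "__main__":
    archive = compress(WIKIPEDIA_TEXT)
    assert decompress(archive) == WIKIPEDIA_TEXT
    print(f"Archive size: {len(archive)} bit(s)")
```

Of course, that is exactly the loophole such contests close by counting the size of the decompressor together with the compressed data.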