Slashdot recently had a post about a data compression contest. Some Wikipedia data was used as a sample for the contest. And, as usual with compression discussions on Slashdot, there were a lot of humorous threads. I liked these three approaches in particular:
- Steer the contest requirements away from lossless compression. If that succeeded, the whole Wikipedia encyclopedia could be compressed into 1 bit.
- Use a random data generation method (such as the /dev/random device) to eventually generate the complete content back (a tongue-in-cheek sketch follows the list).
- Use a minimal compressed data size (1 bit), while having an extremely large compression executable (the size of the uncompressed data).
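Just for fun, here is a minimal sketch of what the second joke might look like in Python. The checksum, the data size, and the function name are all made up for illustration, and of course the loop would not finish within the lifetime of the universe:

```python
import hashlib
import os

# Hypothetical placeholders: a checksum and size of the original data that the
# "decompressor" is supposed to reproduce (both values are made up here).
TARGET_SHA256 = "0" * 64   # checksum of the original dump
TARGET_SIZE = 10**9        # size of the uncompressed data, in bytes

def decompress_eventually() -> bytes:
    """Keep drawing random bytes until they happen to match the original."""
    while True:
        candidate = os.urandom(TARGET_SIZE)  # draw TARGET_SIZE random bytes
        if hashlib.sha256(candidate).hexdigest() == TARGET_SHA256:
            return candidate  # jackpot: the "decompression" is complete
```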