"Similarly, if you start with a 10 kilobyte text file, and 7zip compresses it down to 2 kilobytes, no amount of time spent trying to further compress the file using other compression programs will ever get that file down to 1 byte." (Emphasis added.)
This claim seems too strong. Do you mean that an arbitrary compression algorithm is unlikely to compress the file to one byte?
Taking what's written at face value, if I assume that every compression I apply monotonically decreases the size of the file, then I need only find a long enough sequence of compressions and compose them to compress my file down to one byte. This is of course silly, since the composition is just one huge, highly specialized compression program optimized for this particular instance, but given enough time I can make it happen.
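To make the "highly specialized" point concrete, here is a minimal Python sketch (the `make_specialized_codec` helper and its flag bytes are hypothetical, purely illustrative): it maps one particular file to a single byte, but only because the matching decompressor carries the whole file inside itself, which is exactly where a composed chain of per-instance compressions would end up hiding the information.

```python
import zlib

def make_specialized_codec(target: bytes):
    """Build a compressor/decompressor pair hard-coded for one specific file."""
    MAGIC = b"\x00"  # the single byte the target file is "compressed" to

    def compress(data: bytes) -> bytes:
        if data == target:
            return MAGIC
        # Everything else falls back to an ordinary compressor, with a
        # flag byte so decompression stays unambiguous.
        return b"\x01" + zlib.compress(data)

    def decompress(blob: bytes) -> bytes:
        if blob == MAGIC:
            return target
        return zlib.decompress(blob[1:])

    return compress, decompress


original = b"some 10 kB of text" * 600   # stand-in for the 10 kB file
compress, decompress = make_specialized_codec(original)

packed = compress(original)
assert len(packed) == 1                  # "compressed" down to one byte...
assert decompress(packed) == original    # ...and perfectly recoverable,
# but only because the decompressor itself contains the entire file.
```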
Comments
Eliezer Yudkowsky
No individual compressor can monotonically decrease file sizes.
And we count the string of .rar.7z.zip.rar… in your filename into the total size of your file.
Kaya Fallenstein
I only assumed the sequence was monotonic.
The second comment is fair. I think when I first read this, I ignored the bit about 7zip and understood the sentence as claiming "no Turing machine takes an input of at most one byte and prints out that 10 kB file".
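A counting sketch of that reading, in Python (the numbers are illustrative, treating the file as exactly 10,000 bytes): fix any single decompressor; inputs of at most one byte can reach at most 257 distinct outputs, so "down to 1 byte" is out of reach for all but a vanishing fraction of 10 kB files. The same pigeonhole argument is behind the first comment, that no individual compressor can shrink every file.

```python
# Fix one decompressor D (a program / Turing machine). Each input maps to at
# most one output, so inputs of length <= 1 byte reach at most
#   1 (empty input) + 256 (one-byte inputs) = 257
# distinct files.
reachable_from_at_most_one_byte = 1 + 256

# Number of distinct 10,000-byte files.
distinct_10kb_files = 256 ** 10_000

print(reachable_from_at_most_one_byte)                        # 257
print(distinct_10kb_files > reachable_from_at_most_one_byte)  # True, overwhelmingly
```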