ZLib Exception when reading large chunks

Jan 18, 2011 at 4:45 AM

I've spent hours trying to figure out why Ionic.Zlib.ZlibStream wasn't decompressing data properly and was instead throwing an exception: "Bad state (incorrect data check)".

In all my tests it worked, but when I used the stream in a real environment it blew up in my face. As it turns out, my tests read the compressed stream in chunks of 4096 bytes, which is fine, while the real application was requesting chunks of 256 KB.

After a bit of fiddling, it seems that reading from the compressed stream in chunks larger than 5095 bytes produces the above error. For example:


byte[] buffer = new byte[5100];
int bytesRead = 0;
System.IO.FileStream fs = new System.IO.FileStream("D:\\Documents\\compressed.zlib", System.IO.FileMode.Open, System.IO.FileAccess.Read);
System.IO.FileStream out_file = new System.IO.FileStream("D:\\Documents\\uncompressed.bin", System.IO.FileMode.Create, System.IO.FileAccess.Write);
Ionic.Zlib.ZlibStream decompress = new Ionic.Zlib.ZlibStream(fs, Ionic.Zlib.CompressionMode.Decompress, false);

do
{
    // Asking for 5096 bytes per call throws "Bad state (incorrect data check)";
    // 5095 or fewer works.
    bytesRead = decompress.Read(buffer, 0, 5096);
    out_file.Write(buffer, 0, bytesRead);
} while (bytesRead > 0);


That crashes, but if you change the Read call to request 5095 bytes instead, you're OK.



Jan 18, 2011 at 11:08 AM


I think you've hit the same bug as here: http://dotnetzip.codeplex.com/workitem/10562.

There's a problem with the Adler32 checksum calculator: it computes the wrong value if a single block of data causes an internal Int32 to overflow. The number of bytes required to trigger this depends on the content of the data: it's 3980 bytes of 0xFF's, or larger chunks if the data contains smaller byte values (as is probably the case in your example). To be on the safe side with the current version, use a buffer smaller than 3980 bytes, or create a private build containing the fix (attached to the work item above), which resolves the problem.
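For context, here's a minimal Adler32 sketch (my own illustration, not DotNetZip's actual code) showing where the overflow comes from. The two running sums are only reduced modulo 65521 every NMAX bytes for speed; NMAX is chosen so that s2, held in an *unsigned* 32-bit integer, cannot overflow in between. Hold s2 in a signed Int32 instead, or reduce too infrequently, and a long enough run of large byte values overflows it, which is exactly the bug described above:

static uint Adler32(byte[] data)
{
    const uint MOD = 65521;  // largest prime below 2^16
    const int NMAX = 5552;   // max bytes before s2 could exceed uint.MaxValue
    uint s1 = 1, s2 = 0;
    int i = 0;
    while (i < data.Length)
    {
        int n = data.Length - i;
        if (n > NMAX) n = NMAX;
        for (int j = 0; j < n; j++, i++)
        {
            s1 += data[i];   // sum of all bytes
            s2 += s1;        // sum of the running sums -- grows quadratically
        }
        s1 %= MOD;
        s2 %= MOD;
    }
    return (s2 << 16) | s1;
}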

Note also that if this bug is triggered during compression, DotNetZip will stamp the zip with an incorrect Adler32 checksum. You might not notice this if you only ever decompress it with the same version of DotNetZip, since it will calculate the same incorrect (but matching) checksum. Other zip clients, however, will complain in various ways and may refuse to unpack the file because they think it has become corrupted.
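If you want to check whether a .zlib file written by an affected build carries a bad trailer, one approach (my own sketch, assuming a standard zlib wrapper with no preset dictionary; the file path is just an example) is to inflate the raw deflate body with System.IO.Compression.DeflateStream, recompute Adler32 over the output, and compare it against the stored 4-byte big-endian trailer:

using System;
using System.IO;
using System.IO.Compression;

static class ZlibChecksumCheck
{
    // Naive but safe Adler32: reducing after every byte means the sums
    // can never overflow (slower than the blocked version, but fine here).
    static uint Adler32(byte[] data)
    {
        const uint MOD = 65521;
        uint s1 = 1, s2 = 0;
        foreach (byte b in data) { s1 = (s1 + b) % MOD; s2 = (s2 + s1) % MOD; }
        return (s2 << 16) | s1;
    }

    static void Main()
    {
        byte[] raw = File.ReadAllBytes("D:\\Documents\\compressed.zlib");

        // Stored checksum: the last 4 bytes of a zlib stream, big-endian.
        uint stored = ((uint)raw[raw.Length - 4] << 24)
                    | ((uint)raw[raw.Length - 3] << 16)
                    | ((uint)raw[raw.Length - 2] << 8)
                    | raw[raw.Length - 1];

        // Inflate the raw deflate body between the 2-byte zlib header
        // and the 4-byte trailer (assumes the FDICT flag is not set).
        using (var body = new MemoryStream(raw, 2, raw.Length - 6))
        using (var inflater = new DeflateStream(body, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            inflater.CopyTo(output);
            uint actual = Adler32(output.ToArray());
            Console.WriteLine(stored == actual ? "Checksum OK" : "Checksum MISMATCH");
        }
    }
}

A mismatch means other zlib implementations will reject the file, even though the affected DotNetZip build will happily round-trip it.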

Hope this helps,


Jan 18, 2011 at 12:42 PM

Ah, I see. I'd better shrink the buffers a bit to be on the safe side then.

I did come across that last night but seem to have dismissed it for some reason; my only excuse is that I was quite tired.



Jun 20, 2011 at 6:11 AM

I don't want to raise the dead, but this problem occurs with small buffers as well. I'm compressing and then extracting a 6 MB file in chunks of 1024 bytes.

Jun 21, 2011 at 8:13 AM

Yes, I have this problem even when I use 1024 or any other buffer size! Only when I use a buffer size of 1 does it work properly. What should I do?

The problem occurs when I'm unzipping a large file.

Jun 22, 2011 at 2:47 AM

Yes, there was a basic problem in the Adler32 checksum calculation, which affected ZlibStream. It has been fixed in the latest source tree.

I'm working on a bug-fix release, v1.9.1.6, which will correct this problem.

Should be available "real soon now".