
Corrupt Zip File with no Extension

Nov 9, 2010 at 11:25 PM


I am using the AddDirectory method to add several folders to a zip file. I then save the zip file to the output stream. I have some users complaining of corrupt zip files.

I also get the error when I manually test the app on the server, however, when I debug the code locally I cannot recreate the issue.

I found a discussion thread that was on point (Thread 64325). I am having the same type of issue, but I definitely never "accidentally" created a zip that I am adding to the new zip. After downloading, when using Windows standard unzip program, I will get a corrupt file error. If I use WinRar, I find a file with no file extension. If I rename it and give it a zip extension everything is there.

Any ideas?

            try
            {
                Models.StoreOrder order = (Models.StoreOrder)Session["Order"];
                BL.StoreManager Store = new BL.StoreManager();

                Response.BufferOutput = false; // no buffering - allows large zip files to download as they are zipped
                string ReadmeText = Store.GetDownloadReadMe(order.ObjectsInOrder) + " " + DateTime.Now.ToString("G");
                string archiveName = String.Format("Wisc-Online-{0}.zip", DateTime.Now.ToString("yyyy-MMM-dd-HHmmss"));
                Response.ContentType = "application/zip";
                Response.AddHeader("content-disposition", "attachment; filename=" + archiveName);

                using (ZipFile zip = new ZipFile())
                {
                    zip.AddEntry("Readme.txt", ReadmeText);

                    foreach (Models.Object o in order.ObjectsInOrder)
                    {
                        System.IO.DirectoryInfo dir = new System.IO.DirectoryInfo(Server.MapPath("/Objects/" + o.ID));
                        zip.AddDirectory(dir.FullName, dir.Name);
                    }

                    zip.Save(Response.OutputStream); // save the zip directly to the output stream
                }

                Session["Order"] = null;
            }
            catch (Exception ex)
            {
                BL.SiteManager sm = new BL.SiteManager();
                sm.SendEmailToWebmaster(BL.SharedConstants.WEBMASTEREMAIL, ex.StackTrace + ex.Message, "Download exception");
            }

Nov 10, 2010 at 2:17 AM
Edited Nov 10, 2010 at 2:18 AM


Try this:  remove the call to HttpContext.Current.ApplicationInstance.CompleteRequest(), and replace it with Response.Close().

There was a time when a sample shipped with that CompleteRequest in it, but that is wrong. I have since discovered that using it can result in corrupt zip files. You should use Response.Close() instead.
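For reference, the recommended shape is roughly this (a minimal sketch assuming DotNetZip's ZipFile API; archiveName and the readme content are placeholders):

            Response.ContentType = "application/zip";
            Response.AddHeader("content-disposition", "attachment; filename=" + archiveName);
            using (ZipFile zip = new ZipFile())
            {
                zip.AddEntry("Readme.txt", "readme content");
                zip.Save(Response.OutputStream);
            }
            Response.Close(); // instead of HttpContext.Current.ApplicationInstance.CompleteRequest()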


Nov 10, 2010 at 2:36 AM
Edited Nov 10, 2010 at 2:33 PM

Thank you for replying. I use this library in a few apps and I really like working with it.  In the other apps, I am zipping on the fly client side and uploading to the server. Works like a charm.


Response.Close() resulted in a definite corrupt zip. This time around even WinRar throws an error when unzipping - unexpected end of archive.  Could this be caused by the Response ending abruptly before the process is finished? It all happens on the same thread so I am not sure if that is possible. Perhaps I shouldn't make assumptions there. Is the process async?


C:\Users\stulo.FVTC\Desktop\ Unexpected end of archive!  
C:\Users\stulo.FVTC\Desktop\ CRC failed in Wisc-Online-2010-Nov-09-202621. The file is corrupt



Nov 10, 2010 at 5:28 PM

Hmm, that's interesting.  Reliable corruption using Response.Close(). 

No, the Save process won't be asynchronous.  When Save() returns, it's done.

You could try Response.End() in place of Response.Close().  You could also try calling Response.Flush() before Response.Close().

I don't know what else to suggest - try a few of these combinations and see if you can get it working. Let me know.


Nov 10, 2010 at 7:32 PM

After trying every possible combination of ending the response, with and without flushing, I ended up with variations of corruption.

After a little digging, I found out that Response.Flush() is only necessary when the response is not buffered. After a bit more research, it occurred to me that setting BufferOutput to false may have been the cause of the issue. I commented out that line and the files downloaded intact. As I understand it, the user will now have to wait for the entire zip to finish before the download begins. I can live with that.

My theory is that somehow allowing the download to start while the zip was being processed caused the error. I know for sure now that it is all the same thread, but... possibly a race condition?
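In code terms, the working version simply leaves buffering at its default (a sketch; the rest of the handler is unchanged, and BufferOutput defaults to true):

            // Response.BufferOutput = false;   // removed: turning buffering off produced the corrupt downloads
            Response.ContentType = "application/zip";
            Response.AddHeader("content-disposition", "attachment; filename=" + archiveName);
            using (ZipFile zip = new ZipFile())
            {
                zip.AddEntry("Readme.txt", ReadmeText);
                zip.Save(Response.OutputStream); // written into the IIS buffer, sent once complete
            }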


Nov 11, 2010 at 12:24 PM
Edited Nov 11, 2010 at 12:30 PM

I couldn't judge the cause unless I was able to see the zip files produced in each case, or reproduce it here.  Your understanding is correct - buffering usually means the entire response is buffered before IIS begins to transmit the first byte to the requester. For a large zip this can be a large problem.  For smaller zips, no problem.  The other side effect is, of course, memory usage on the server. If you have many concurrent requests, they will ALL buffer their results before sending. This can cause memory usage to spike.  Test to be sure.

Regarding the differences in generating zips with buffering and without....Zips written to non-seekable streams use a slightly different format than those written to seekable streams. When you turn buffering off, Response.OutputStream becomes non-seekable, and DotNetZip uses the slightly different metadata format (described in some other locations as "bit 3 encoding" - check the zip spec for what this means). Some zip utilities do not properly read bit-3 encoded zip files.  One notable example is, I think, the built-in zip handling in MacOs.  It will classify such a zip file as corrupted, though other tools can read and extract the zip just fine. WinZip has never exhibited a problem handling bit-3 zip files, as far as I know.  The DotNetZip tools and library can read zips with or without bit-3 encoding.
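If streaming turns out to be necessary, one workaround worth noting (a sketch of my own, not something tested in this thread) is to save the zip to a seekable intermediate stream, so DotNetZip writes the standard non-bit-3 metadata, and then copy the finished bytes to the response:

            using (ZipFile zip = new ZipFile())
            using (var ms = new System.IO.MemoryStream())
            {
                zip.AddEntry("Readme.txt", ReadmeText);
                zip.Save(ms);                      // seekable target: headers get real sizes and CRCs
                ms.Position = 0;
                ms.WriteTo(Response.OutputStream); // copy the completed archive to the response
            }

This trades away the streaming benefit, much like buffering the response does, so it only helps if you need BufferOutput off for other reasons.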

If you're happy, then I guess it's no problem.   It's troubling to me that turning off buffering gives you corruption, and I'll want to investigate that for myself. But if you're satisfied then I guess this particular issue is closed.

thanks for the report.


Nov 11, 2010 at 12:25 PM
This discussion has been copied to a work item.