'OutOfMemoryException' was thrown

Jan 14, 2010 at 10:55 PM
Edited Jan 15, 2010 at 5:47 AM

Hi Cheeso,

I saw the other post about the same problem. However, in my case we are not dealing with that much data: around 6 MB total across 129 files. This is happening on Windows Server 2003 (4 GB) at a client site. (I'm running Windows Server 2003 under VMware with less memory and see no problems.) I have not yet tried setting the ParallelDeflateThreshold property to -1, as I'm working on an update for him. I have other clients using the same build that are not experiencing any problems. Any thoughts? I'm using 1.9.030.

Exception of type 'System.OutOfMemoryException' was thrown.
     at Ionic.Zlib.DeflateManager.Initialize(ZlibCodec codec, CompressionLevel level, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)
     at Ionic.Zlib.DeflateManager.Initialize(ZlibCodec codec, CompressionLevel level, Int32 bits, CompressionStrategy compressionStrategy)
     at Ionic.Zlib.ZlibCodec._InternalInitializeDeflate(Boolean wantRfc1950Header)
     at Ionic.Zlib.ZlibCodec.InitializeDeflate(CompressionLevel level, Boolean wantRfc1950Header)
     at Ionic.Zlib.ParallelDeflateOutputStream.WorkItem..ctor(Int32 size, CompressionLevel compressLevel, CompressionStrategy strategy)
     at Ionic.Zlib.ParallelDeflateOutputStream._InitializePoolOfWorkItems()
     at Ionic.Zlib.ParallelDeflateOutputStream.Write(Byte[] buffer, Int32 offset, Int32 count)
     at Ionic.Zlib.CrcCalculatorStream.Write(Byte[] buffer, Int32 offset, Int32 count)
     at Ionic.Zip.ZipEntry._WriteEntryData(Stream s)
     at Ionic.Zip.ZipEntry._EmitOne(Stream outstream)
     at Ionic.Zip.ZipEntry.Write(Stream s)
     at Ionic.Zip.ZipFile.Save()
     at Ionic.Zip.ZipFile.Save(String fileName)
     at Vie.Server.WindowsService.ProcessReports(String propertyNumber)

Coordinator
Jan 15, 2010 at 4:30 PM

Likely the same cause as the other question you saw.

The failure is happening in the initialization of the parallel deflate mechanism, where all the buffers get allocated.

I use a heuristic to determine the number of parallel threads (and thus the number of buffers, codecs, etc.): 4 × processor count. The processor count includes each core on a multi-core machine, so a server with 4 processors, each with 2 cores, gets 4 × 4 × 2 = 32 parallel threads, each with its own buffer. This seemed to work for smaller machines; it may not be a useful heuristic on larger ones.
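To make the heuristic's arithmetic concrete, here is a quick sketch (Python used purely for illustration; the function name is mine, not anything in the library):

```python
def parallel_buffer_count(processors, cores_per_processor, threads_per_core=4):
    """Worker-thread (and buffer-set) count under the 4-per-core heuristic."""
    return threads_per_core * processors * cores_per_processor

# 4 processors, 2 cores each -> 32 parallel threads, each with its own buffer
print(parallel_buffer_count(4, 2))  # 32
```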

I don't have such a server, so I cannot test the effectiveness of the heuristic at that scale. Maybe there ought to be an upper limit on the number of threads/buffers used by a single process.

For now you can avoid the problem by setting ParallelDeflateThreshold = -1, which turns off parallel deflation. You can also try setting the CodecBufferSize property on the ZipFile; it governs the size of the buffer used for each thread, and defaults to 128k for the ParallelDeflateOutputStream. If you set it to 32k (32768), you'll cut your memory usage to 25% and still retain the parallel action.

Coordinator
Jan 15, 2010 at 8:46 PM

Thinking about this further... the other situation where someone had an out-of-memory problem occurred on a server where there were multiple processes, all spinning up multiple threads and buffers to do the zipping. It wasn't a single process. The memory consumed by one session is bounded above by (#cpus × #buffers per cpu × size of each buffer). As I said above, the number of buffers per cpu (core) is 4. The default buffer size is 128k, but there is an input and an output buffer, so double that (256k). The result for a 4-way dual-core server is 8 × 4 × 256k, which is 8 MB: not a ton of memory. You should never run out of memory on a server for a single zip session, unless the machine is under memory pressure for other reasons.

If you run 20 concurrent processes, or 20 concurrent zip sessions in a single process, then you are up to 160 MB, which I suppose could get you into memory trouble. But even that should be manageable on a server.
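The bound above can be checked with a short sketch (again Python just for illustration; the names are mine, and "cores" means logical cores):

```python
KB = 1024
MB = 1024 * KB

def session_memory_bound(cores, buffers_per_core=4, codec_buffer=128 * KB):
    """Upper bound on buffer memory for one zip session.
    Each worker holds an input and an output buffer, hence the factor of 2."""
    return cores * buffers_per_core * 2 * codec_buffer

one_session = session_memory_bound(8)      # 4-way dual-core: 8 cores
print(one_session // MB)                   # 8   (MB)
print(20 * one_session // MB)              # 160 (MB) for 20 concurrent sessions
print(session_memory_bound(8, codec_buffer=32 * KB) // MB)  # 2 (MB) with a 32k buffer
```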


Jan 15, 2010 at 9:00 PM

Hi Cheeso,

Thanks for the info.

I don't quite understand it either. I have about twenty clients on Windows Server 2003 that are not having problems, and they are also zipping up folders with about 129 files (of different sizes, anywhere from 1 KB to 14 MB or more). But with a couple of clients so far, I get these out-of-memory exceptions. It could just be that their servers are more heavily loaded. Anyway, I'm going to turn off parallel deflation on their installations for now and see what happens.

Thanks.

--Lenard


Jan 30, 2010 at 9:19 AM

Hi Cheeso

I am also running into this issue.

I have a 70 MB file zipped into a zip file. It appears to extract successfully one or two times and then starts throwing this.

I looked at this thread and set CodecBufferSize to 32 as you mentioned, but I am still getting the exception.

Below is some info:

Here are the two methods I am using:

public static object AnimsDto
{
    get
    {
        if (_AnimsDto == null)
        {
            string data = io.Path.Combine(
                io.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location),
                @"Clas.zip");
            ZipFile zf = ZipFile.Read(data);
            zf.CodecBufferSize = 32;
            _AnimsDto = zf;
        }
        return _AnimsDto;
    }
}


public static MemoryStream getStreamFor(string entryName)
{
    ZipEntry ze = (AnimsDto as ZipFile)[entryName];
    MemoryStream ms = new MemoryStream();
    try
    {
        ze.Extract(ms);
    }
    catch (Exception xcp)
    {
        publishMsgStatic(xcp.Message);
    }
    return ms;
}


Here is the exception:


System.OutOfMemoryException was caught
  Message="Exception of type 'System.OutOfMemoryException' was thrown."
  Source="mscorlib"
  StackTrace:
       at System.IO.MemoryStream.set_Capacity(Int32 value)
       at System.IO.MemoryStream.EnsureCapacity(Int32 value)
       at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
       at Ionic.Zip.ZipEntry._ExtractOne(Stream output)
       at Ionic.Zip.ZipEntry.InternalExtract(String baseDir, Stream outstream, String password)
       at Ionic.Zip.ZipEntry.Extract(Stream stream)
       at SaieV\Class1.cs:line 1107
  InnerException:


Please let me know if you have any suggestions.

Thanks

N

Coordinator
Jan 30, 2010 at 3:09 PM

Hello Lenard, following up on this...

Another person reported a memory problem with the library and gave me some new input. I investigated, found a leak, and fixed it. You can see the workitem here: http://dotnetzip.codeplex.com/WorkItem/View.aspx?WorkItemId=10030

The fix is available in v1.9.1.2. You might want to give it a try.


Coordinator
Jan 30, 2010 at 3:57 PM

Pathurun, your problem is different. The stack trace you provided shows that you are trying to extract an entry from a zip file into a memory buffer. It resembles the problem Lenard reported only in the top-level symptom, out of memory; the underlying cause is completely different.

In your case, your machine simply cannot allocate a single 70+ MB buffer. The solution is to avoid requesting a buffer larger than the machine can provide.

Now, you may be thinking that your machine has plenty of memory. Maybe you have 1 GB of RAM and perfmon shows 400 MB as available. But that "remaining memory" figure is an aggregate; it gives no indication of the largest contiguous chunk available in that pool. It could be 4 blocks of 100 MB each; more likely it is many blocks of smaller sizes. In your case, it's possible that no single block is large enough to hold your 70 MB file.

And it is likely that .NET is trying to allocate much more than 70 MB. People who use the MemoryStream class are often not aware of its implementation, but if it is typical, I'd guess it works this way: on construction (when an application calls the MemoryStream constructor), it allocates a buffer of a default size. It doesn't matter what the size is; let's suppose it's 8 KB. The MemoryStream then accepts Write() calls by storing data in that buffer, up to 8 KB. Suppose your application writes 1 KB of data, 8 times; the buffer is now full. When the application tries to write another 1 KB, the MemoryStream auto-expands. It will (I'm guessing here, but this is typical) double the size of its buffer, copy the original buffer into the first half of the new one, free the original buffer, and finally write the new 1 KB block into the new buffer. The buffer is now 16 KB, 9 KB of which has been used. As writes continue, the buffer auto-expands to 32 KB, 64 KB, and so on: 1024 KB (1 MB), then 2 MB, and so on, until you reach 64 MB. If your application once again exceeds the internal buffer size, MemoryStream will auto-expand to 128 MB. To complete that expansion, MemoryStream must hold two large buffers concurrently: one of 64 MB and one of 128 MB. So in fact you need 192 MB at one time, in two very large contiguous chunks.

This is why an application can get an out-of-memory condition when using MemoryStream to store large buffers, even if there is "apparently" enough memory to satisfy the request.
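The doubling behavior I'm describing can be sketched in a few lines (Python for illustration only; the real .NET MemoryStream may use different defaults, so treat the starting size as an assumption):

```python
MB = 1024 * 1024

def doubling_buffer_cost(total_bytes, initial_capacity=8 * 1024):
    """Final capacity and peak concurrent memory of a doubling buffer.
    During each expansion the old and the new (doubled) buffer coexist."""
    capacity = initial_capacity
    peak = capacity
    while capacity < total_bytes:
        peak = max(peak, capacity + 2 * capacity)  # old + new buffer, briefly
        capacity *= 2
    return capacity, peak

cap, peak = doubling_buffer_cost(70 * MB)
print(cap // MB, peak // MB)  # 128 192 -- two contiguous chunks, 64 MB + 128 MB
```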

In any case, you have to solve this problem yourself with a different design of your app. Is it really necessary to have all 70 MB in memory at one time? Is there a way to hold only a smaller chunk of the data, say 1 MB, in memory at any one time? Etc.
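One generic way to hold only a small window at a time looks like this (a Python sketch of the pattern, not a DotNetZip API; in .NET the equivalent would be reading from a stream in fixed-size chunks rather than extracting into one MemoryStream):

```python
import io

def process_in_chunks(stream, handle_chunk, chunk_size=1024 * 1024):
    """Consume a stream chunk_size bytes at a time instead of buffering it whole."""
    total = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        handle_chunk(chunk)  # e.g. hash, forward, or write each 1 MB chunk
        total += len(chunk)
    return total

n = process_in_chunks(io.BytesIO(b"x" * (2 * 1024 * 1024 + 5)), lambda c: None)
print(n)  # 2097157
```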


Apr 25, 2011 at 6:20 PM

I'm getting the same error (except I'm not using streams) with the current version (1.9.1.5).

System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
   at Ionic.Zlib.DeflateManager.Initialize(ZlibCodec codec, CompressionLevel level, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy) in c:\DotNetZip\v1.9

Here is the VB 2010 code being called:


Public Function ableToArchive(ByVal filename As String, ByVal ZipFilename As String, ByVal LOBID As Integer, _
    Optional ByVal DeleteRegardlessOfConfig As Boolean = False) As Boolean

    Dim fi As New FileInfo(filename)
    Dim zipMethod As ZipFile = Nothing

    Try

        ' we zip the file if the config file says to zip or delete
        If DeleteFilesAfterLoad Or ZipFilesAfterLoad Then

            If System.IO.File.Exists(ZipFilename) Then
                zipMethod = ZipFile.Read(ZipFilename)
            Else
                zipMethod = New ZipFile()
            End If

            ' the UpdateFile method works even if the entry does not yet exist.
            ' Really it should be called "AddOrUpdateFile"
            zipMethod.UpdateFile(fi.FullName, "")
            zipMethod.Save(ZipFilename)
            zipMethod = Nothing

            Return True
        End If

    Catch ex As Exception
        Dim logEntry As New Logger("ERROR", ex.ToString, fi.FullName, LOBID)
        logEntry = Nothing

        If Not IsNothing(zipMethod) Then
            zipMethod = Nothing
        End If

        Return False

    End Try

End Function

Unfortunately, the out-of-memory exception does not get trapped, and the program aborts.
This is running on Windows Server 2003 (which is also a SQL Server machine) with 4 CPUs and 4 GB RAM.
Any thoughts? Thanks!

Apr 25, 2011 at 6:31 PM

Chris,

Try setting the zip.ParallelDeflateThreshold = -1 before calling the UpdateFile method and see what happens.

--Lenard

Coordinator
Aug 6, 2011 at 2:02 PM

Also: v1.9.1.7 (available now) includes some changes that limit its memory use on larger machines, such as those with 4 CPUs or more. It may work better for you without needing to set ParallelDeflateThreshold.