Jan 3, 2013 at 3:34 PM
Edited Jan 7, 2013 at 4:22 PM
I am attempting to use DotNetZip to zip about 120,000 files with a total uncompressed size of approximately 300 GB. Most of the files are very small (less than 5 KB). The naive approach (below) works, but takes an extremely long time.
Is attempting to compress everything into one zip file a losing proposition? Do you have any pointers when using DotNetZip with this number/size of files?
using (var zip = new ZipFile("output.zip"))
{
    zip.UseZip64WhenSaving = Zip64Option.Always;
    zip.AddDirectory(sourceDirectory); // sourceDirectory: root folder of the ~120,000 files
    zip.Save();
}
I assume that there is a temp file being generated somewhere.
I have configured the temp folder so that it is on a different spindle, which makes some difference. I am in the process of trying different combinations of segment size (via segmented output files) and compression level. However, it still takes an extremely long time, so any advice on how to minimize the compression time is welcome.
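Roughly, the combinations I am testing look like this (the paths and the 1 GB segment size are placeholders; TempFileFolder, CompressionLevel, and MaxOutputSegmentSize are the DotNetZip properties for temp-file location, compression effort, and segmented output):

```csharp
using Ionic.Zip;
using Ionic.Zlib;

using (var zip = new ZipFile("output.zip"))
{
    zip.UseZip64WhenSaving = Zip64Option.Always;

    // Put the temp file on a different spindle than the output
    zip.TempFileFolder = @"D:\ziptemp";              // placeholder path

    // Trade compression ratio for speed on these small files
    zip.CompressionLevel = CompressionLevel.BestSpeed;

    // Split the archive into segments, e.g. 1 GB each
    zip.MaxOutputSegmentSize = 1024 * 1024 * 1024;   // placeholder size

    zip.AddDirectory(@"C:\data");                    // placeholder source folder
    zip.Save();
}
```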