
Problem - running "Too Fast" in 64 bit mode

Sep 28, 2010 at 10:05 PM
Edited Sep 28, 2010 at 10:08 PM

I have a C# .NET application that reads multiple files in a folder and, one by one, extracts the date from each filename and zips it into a daily file.  Immediately after the AddFile and Save, I run a CRC check and then delete the original input file.  If the daily .zip file already exists, I list the files inside the .zip to see if the input file name is already in it.  If it is, I make a copy of the .zip with Tmp_ in front of it, do a File.Remove on the original, and then proceed with the AddFile, CRC check, and input-file delete.  I ran it on a folder that had about 400 files, which ended up in about 45 daily .zip files.  I kept getting "Another process has control of the file" messages in several different places throughout the process.  (Nothing else was running that would touch these files, and they are on a local machine, so the only explanation is that the process is running so fast that it has not had time to "release" the file before it tries to do something else.)
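For reference, the workflow described above looks roughly like this. This is a minimal sketch assuming the DotNetZip `ZipFile` API (which the AddFile/Save calls suggest); the paths and the `VerifyCrc` helper are hypothetical stand-ins, not the poster's actual code:

```csharp
using System;
using System.IO;
using Ionic.Zip;  // DotNetZip -- assumed from the AddFile/Save calls above

class DailyZipper
{
    // Adds one input file to its daily archive, verifies it, then deletes the input.
    static void ArchiveOne(string inputFile, string dailyZipPath)
    {
        string entryName = Path.GetFileName(inputFile);

        if (File.Exists(dailyZipPath))
        {
            using (ZipFile existing = ZipFile.Read(dailyZipPath))
            {
                if (existing.ContainsEntry(entryName))
                {
                    // keep a Tmp_ copy, then drop the stale entry
                    string tmp = Path.Combine(
                        Path.GetDirectoryName(dailyZipPath),
                        "Tmp_" + Path.GetFileName(dailyZipPath));
                    File.Copy(dailyZipPath, tmp, true);
                    existing.RemoveEntry(entryName);
                    existing.Save();
                }
            }
        }

        using (ZipFile zip = File.Exists(dailyZipPath)
                ? ZipFile.Read(dailyZipPath)
                : new ZipFile())
        {
            zip.AddFile(inputFile, "");   // "" = store at the archive root
            zip.Save(dailyZipPath);
        }

        if (VerifyCrc(dailyZipPath, entryName))
            File.Delete(inputFile);
    }

    // Hypothetical stand-in for the CRC check mentioned above:
    // re-open the archive and confirm the entry made it in.
    static bool VerifyCrc(string zipPath, string entryName)
    {
        using (ZipFile zip = ZipFile.Read(zipPath))
            return zip.ContainsEntry(entryName);
    }
}
```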

When I re-compiled it in 32-bit mode, all the issues went away immediately.    Is this a known issue?   And if so, is there a "best practices" way around it?

I thought of putting a separate Try/Catch around every step that would have it wait for a few milliseconds before trying again, but that seems a pretty sloppy way of doing things.
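For what it's worth, the retry idea doesn't have to be sloppy if it's factored into a single helper rather than scattered Try/Catch blocks. A minimal sketch (the attempt count and delays are arbitrary assumptions):

```csharp
using System;
using System.IO;
using System.Threading;

static class FileRetry
{
    // Runs an IO action, retrying on IOException (the usual "file in use"
    // exception) with an exponential back-off between attempts.
    public static void WithRetry(Action action, int maxAttempts = 5,
                                 int initialDelayMs = 50)
    {
        int delay = initialDelayMs;
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                action();
                return;
            }
            catch (IOException) when (attempt < maxAttempts)
            {
                Thread.Sleep(delay);
                delay *= 2;   // back off a little more each time
            }
        }
    }
}
```

Then each risky step becomes one line, e.g. `FileRetry.WithRetry(() => File.Delete(inputFile));` -- the last failure still propagates, so real errors aren't swallowed.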





Sep 29, 2010 at 5:05 AM
Seems like you might want to use an async delete strategy to avoid the OS locking issue. Add the files to a queue of to-be-deleted files. Use one or more worker threads to delete files, and only remove them from the queue if no exception occurs during the delete operation. Maybe add a back-off delay in there for files that cannot be deleted, and of course you'll need to give up after so many retries. You could set this to run at the exit of your application, or during idle time, etc. If you don't like that idea or think it's too complicated, maybe research the problem of locked files. Oh, I forgot to ask: you are employing the using() clause, right? Failure to do so will result in file locking problems.
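A minimal sketch of that kind of deferred-delete queue (the class name, retry limit, and delays are my own assumptions, and it uses modern C# tuples for brevity -- a sketch, not a tested implementation):

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

class PendingDeleteQueue
{
    private readonly BlockingCollection<(string Path, int Attempts)> _queue
        = new BlockingCollection<(string, int)>();

    public PendingDeleteQueue()
    {
        // one worker thread draining the queue
        Task.Run(() =>
        {
            foreach (var (path, attempts) in _queue.GetConsumingEnumerable())
            {
                try
                {
                    File.Delete(path);   // success: item simply drops out of the queue
                }
                catch (IOException)
                {
                    // back off, then re-queue; give up after so many retries
                    if (attempts < 5 && !_queue.IsAddingCompleted)
                    {
                        Thread.Sleep(100 * (attempts + 1));
                        _queue.Add((path, attempts + 1));
                    }
                }
            }
        });
    }

    public void Enqueue(string path) => _queue.Add((path, 0));

    // call at application exit so the worker can drain and stop
    public void Complete() => _queue.CompleteAdding();
}
```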
Sep 29, 2010 at 5:55 AM

Thank you for the reply.


Yes, I am employing the using().   Also, it is not only "locking" during the delete: I am getting them on the .Save and sometimes on the .AddFile.   Sorry that I didn't explain that well enough; when I wrote "several places," that is what I meant.   I will give better detail on any questions in the future.



Sep 30, 2010 at 2:27 PM
Ok, can you provide a small test case that reliably reproduces the problem you described?
Oct 1, 2010 at 2:14 PM

Thanks for the reply,


I can set up a test program; to duplicate the problem, though, you will need a couple of hundred files that are "to be zipped."   I can write a .bat file that will copy a .txt file a couple of hundred times to set it up.

I had read somewhere that the latest version does multi-threading, and wondered if the problem was that one thread was still working on a particular file while the next one takes off.   Does the multi-threading get turned off when compiled in 32-bit mode?   Because as soon as I changed the build to the "x86" CPU target, the problem went away.
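If the library in question is DotNetZip, one way to test that theory directly, rather than by changing the CPU target, might be to turn off its parallel compression explicitly. This assumes a DotNetZip version that exposes the `ParallelDeflateThreshold` property (where -1 disables the parallel deflate path); `inputFile` and `dailyZipPath` are hypothetical:

```csharp
using Ionic.Zip;

// ...
using (ZipFile zip = new ZipFile())
{
    // -1 turns parallel compression off entirely; if the "file in use"
    // errors disappear with this set, the multi-threaded deflate path
    // would be the likely culprit.
    zip.ParallelDeflateThreshold = -1;
    zip.AddFile(inputFile);
    zip.Save(dailyZipPath);
}
```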