C# file handles released too late when opening files on a remote host (via symlink)

My program basically looks like this:

using (XmlReader reader = XmlReader.Create(File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read)))
{
    //a lot of reader.Read() to read through the file
}

The program reads a large number of small files this way (typically 5k per second), and on local machines this works properly. However, I'm now trying to feed it input data from a remote machine. I've created an NTFS symlink on my local drive that points to a folder on the remote host, so when my machine accesses C:\MyFolder it transparently accesses \\NetworkComputer\MySharedFolder. So far so good!

However, my program crashes with a "The system cannot open the file." IOException, which seems to map to ERROR_TOO_MANY_OPEN_FILES. As far as I know I only have one file open at a given time, so the exception doesn't make sense.

After some more debugging I found out the following (using Process Explorer).

When running locally, the process starts with 400 handles (probably because of .NET). When I press the start button, the number of handles quickly increases to 600, then decreases to 400, then rises to 600 again, and so on. This is probably the garbage collector kicking in, keeping the number of file handles down. When the process is done processing the files, the number of file handles returns nicely to 400.

When reading files from the remote machine I see the exact same behavior in the local process monitor. However, in the process monitor on the remote machine I see the number of file handles rise by about 1,800, never decreasing, until the program crashes. The increase in file handles on the remote machine corresponds directly to the number of processed files.

TL;DR: So this is my question: how can I avoid dangling file handles on a remote machine when accessing it through a symlink?

Edit:

Sometimes another exception shows up (IOException): "Not enough server storage is available to process this command". I'm not sure how to counter this, but I found some extra info, although I'm still not sure what it means: http://support.microsoft.com/kb/285089 http://support.microsoft.com/kb/106167

Edit2:

One thing that seems to help is slowing down the reads from the remote machine: throttling to 2,000 files per minute resolves the problem, but it's not really a working solution since the job then takes about 10x longer than it has to.
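
For reference, a minimal sketch of what such throttling could look like; files and ProcessFile are placeholders for the actual file enumeration and the XmlReader loop from above:

// Pace the loop to roughly 2,000 files per minute.
var perFile = TimeSpan.FromTicks(TimeSpan.FromMinutes(1).Ticks / 2000);
var sw = System.Diagnostics.Stopwatch.StartNew();
int processed = 0;

foreach (string path in files)      // placeholder: the list of files to read
{
    ProcessFile(path);              // placeholder: the XmlReader loop
    processed++;

    // Sleep just long enough to stay on the 2,000-files-per-minute schedule.
    var due = TimeSpan.FromTicks(perFile.Ticks * processed);
    var sleep = due - sw.Elapsed;
    if (sleep > TimeSpan.Zero)
        System.Threading.Thread.Sleep(sleep);
}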

I've also found some articles indicating that increasing IRPStackSize could do the trick; however, since the remote computer is not directly under my control, this might not be possible.
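
In case anyone with admin rights on the server wants to try it: the KB articles above discuss IRPStackSize under the LanmanServer Parameters key. A sketch of that change (the value 20 is only an example; the Server service must be restarted afterwards):

using Microsoft.Win32;

// Run on the file server, with administrative rights.
using (var key = Registry.LocalMachine.OpenSubKey(
    @"SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters", writable: true))
{
    key.SetValue("IRPStackSize", 20, RegistryValueKind.DWord);
}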


Does XmlReader automatically close/dispose the passed stream? (i.e., the FileStream returned from File.Open.)

I don't think it does, which would explain why you're seeing the number of handles climb and then drop on your local machine: the handles aren't being released until the finalisers are triggered by the GC, rather than when each stream is disposed.

Try this and see if this makes any difference:

using (var fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read))
using (var reader = XmlReader.Create(fs))
{
    // a lot of reader.Read() to read through the file
}
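
For what it's worth, there is also XmlReaderSettings.CloseInput (false by default), which tells the reader to dispose the stream it was given when the reader itself is disposed. A sketch keeping your original one-liner shape:

var settings = new XmlReaderSettings { CloseInput = true };

using (var reader = XmlReader.Create(
    File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read), settings))
{
    // a lot of reader.Read() to read through the file
}

Either way, the FileStream is closed deterministically instead of waiting for the GC.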


Sounds like the finalizer is releasing the inner stream rather than you disposing it explicitly. Try this:

    using (FileStream stream = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read))
    using (XmlReader reader = XmlReader.Create(stream))
    {
        //a lot of reader.Read() to read through the file
    }


I am not sure whether this behaviour is due to accessing a remote machine or due to accessing it via a symlink...

Perhaps you can just check the behaviour when accessing the remote machine without the symlink, i.e. through a UNC path and/or a drive letter (SMB mount)...

I can't be 100% sure, but I think the symlink is part of the problem...
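
A quick way to test that: run the same read loop against both paths and watch the remote handle count (the file name here is just a placeholder):

// A/B test: symlink path vs. direct UNC path.
string symlinkPath = @"C:\MyFolder\somefile.xml";                  // via the symlink
string uncPath = @"\\NetworkComputer\MySharedFolder\somefile.xml"; // direct UNC

foreach (string path in new[] { symlinkPath, uncPath })
{
    using (var fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read))
    using (var reader = XmlReader.Create(fs))
    {
        while (reader.Read()) { } // drain the file
    }
    // Compare the handle count on the remote machine after each run.
}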
