
.NET application memory usage grows over time: memory leak or not?

I wrote a .NET application whose memory usage grows over time. As a test, I set it up to download 800 files (about 5K each), run an XML transform on them, and use DTComcontroller to generate a manifest, all within one hour at 15-minute intervals. Over a period of about 5 hours, memory usage grew from 35 MB to about 70 MB.

I'm not sure whether this is normal. I'm already forcing a collection by calling

GC.Collect();
GC.WaitForPendingFinalizers();

at the end of each working cycle, just to make sure memory is released in a timely manner.

I already make sure to close file handles and release streams and other resources after using them. Some objects are only referenced locally inside a function, so I assume they become unreachable when the function returns and are reclaimed on the next GC.Collect().

Also, the ANTS Memory Profiler shows that no objects from my application's namespace remain at the end of each working cycle, and no new objects with source code available are left either. The new instances created during each cycle are mostly RuntimeModule, RuntimeAssembly, and DeviceContext. It doesn't look like I can do anything about those.

This is basically my first application that is supposed to run 24/7, and I don't know much about memory leaks. More likely than not there are still problems with memory usage in my application; I just don't know where to look at this stage.

Also, I don't know whether it's related, but I am using .NET Trace/Debug to keep a log. I guess it's worth turning logging off to see whether memory usage still grows.

Update:


Does it look like there is a circular reference between SslState and SslStream (and other classes) in the class reference explorer? Also, if ANTS draws a cycle in its reference graph, does that mean there is actually a circular reference somewhere in my code?

I made a few changes: I now reuse the same objects for the XML transform and manifest creation, and I manually set each object to null after I finish using it at the end of a working iteration. Memory usage is now greatly reduced; private bytes still increase while the application is working, but much more slowly.

I think in my particular case (lots of objects created inside each working cycle, with all the data kept on a working thread), if I don't explicitly release object references, relying on the GC alone leads to a lot of memory being held while the application is busy, since the GC may never get a good chance to do its housekeeping unless the application is closed. So I have to release object references more manually.
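A minimal sketch of the "release per-cycle references" pattern described above. The field names (cachedDoc, downloadBuffer) are hypothetical, not from the question; the point is that long-lived fields keep objects reachable across cycles, and it is that reachability, not GC laziness, that pins the memory:

```csharp
using System;
using System.Xml;

class Worker
{
    // Fields survive across cycles, so anything they reference
    // stays reachable and cannot be collected.
    private XmlDocument cachedDoc;
    private byte[] downloadBuffer;

    public void RunCycle()
    {
        cachedDoc = new XmlDocument();
        downloadBuffer = new byte[5 * 1024];
        // ... download, transform, build manifest ...

        // Drop the references when the cycle is done. Locals go out
        // of scope on their own; only fields need this treatment.
        cachedDoc = null;
        downloadBuffer = null;
    }
}
```

Nulling locals at the end of a method is usually pointless (the JIT already knows when a local is dead); it only matters for fields and other long-lived roots.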

@WeNeedAnswers:

    XmlDocument doc = new XmlDocument();
    // using ensures the reader (and its file handle) is disposed;
    // the explicit Read() before Load() is not needed.
    using (XmlTextReader textReader = new XmlTextReader(dataFile))
    {
        doc.Load(textReader);
    }


Task Manager shows you the amount of memory belonging to the application that happens to be paged into real memory at the time (the working set). This is probably what you're seeing.

If you look instead at the "Private Bytes" used by your process, this should give you a better indication of the amount of memory used. This value doesn't change when the process working set is trimmed. There is more information in this Microsoft KB article.
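Rather than watching Task Manager, you can also log both counters from inside the process. A small sketch using System.Diagnostics (property names are the standard Process API):

```csharp
using System;
using System.Diagnostics;

class MemoryLogger
{
    static void Main()
    {
        Process p = Process.GetCurrentProcess();
        p.Refresh(); // refresh the cached counters before reading

        // PrivateMemorySize64: private bytes. Not shared with other
        // processes and not affected by working-set trimming, so it
        // is usually the better leak indicator.
        long privateBytes = p.PrivateMemorySize64;

        // WorkingSet64: what Task Manager's default memory column shows.
        long workingSet = p.WorkingSet64;

        Console.WriteLine($"Private bytes: {privateBytes / 1024 / 1024} MB");
        Console.WriteLine($"Working set:   {workingSet / 1024 / 1024} MB");
    }
}
```

Logging private bytes once per working cycle would show directly whether the growth the question describes ever plateaus.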

You can manually reduce your app's working set by using the Win32 API call SetProcessWorkingSetSize(GetCurrentProcess(), -1, -1).

This is what Windows will do anyway when the system runs low on memory, but controlling when this happens yourself lets you strip your .NET application's working set to its minimum size for investigation purposes.
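A hedged sketch of that Win32 call from C# via P/Invoke (Windows-only, and intended for investigation rather than production tuning):

```csharp
using System;
using System.Runtime.InteropServices;

static class WorkingSetTrimmer
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(
        IntPtr hProcess,
        IntPtr dwMinimumWorkingSetSize,
        IntPtr dwMaximumWorkingSetSize);

    [DllImport("kernel32.dll")]
    static extern IntPtr GetCurrentProcess();

    public static void Trim()
    {
        // Passing -1 for both sizes asks Windows to trim the working
        // set as far as it can; pages come back on demand via page faults.
        SetProcessWorkingSetSize(GetCurrentProcess(), (IntPtr)(-1), (IntPtr)(-1));
    }
}
```

After calling Trim(), the working set in Task Manager drops to its minimum, which makes it easier to see how much of the apparent growth was just untrimmed pages.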

BTW, be very careful about calling GC.Collect().


The idiomatic way to deal with disposing of objects is the IDisposable interface. That is what allows you to use the using construct to ensure that objects are freed. @Joe is right; in most situations, you should not call the garbage collector directly.
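A minimal sketch of that idiom applied to the question's XML loading step (the method name and file path are illustrative, not from the original code):

```csharp
using System;
using System.IO;
using System.Xml;

class ManifestStep
{
    public static string ReadRootName(string dataFile)
    {
        // using guarantees Dispose runs, releasing the underlying
        // file handle even if an exception is thrown mid-read.
        using (XmlReader reader = XmlReader.Create(dataFile))
        {
            XmlDocument doc = new XmlDocument();
            doc.Load(reader);
            return doc.DocumentElement.Name;
        }
    }
}
```

With deterministic disposal like this, there is normally no reason to call GC.Collect() at the end of a cycle at all.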


I wrote a .NET application whose memory usage grows over time. As a test, I set it up to download 800 files (about 5K each), run an XML transform on them, and use DTComcontroller to generate a manifest, all within one hour at 15-minute intervals. Over a period of about 5 hours, memory usage grew from 35 MB to about 70 MB.

Is that a problem? If not (i.e. you have enough memory), that may be the explanation: the GC only works hard when memory starts getting scarce. 70 MB is also quite low in general, IMHO.


Nah, not a leak; it's by design. Don't call GC.Collect(); that is nasty. I would only worry about a leak if my test box were falling over from lack of resources, not because the system was using more memory than I expected. The garbage collector doesn't run until it really has to, and I don't think the pressure you're exerting is heavy enough for it to bother.

If you're really concerned, push a big pile of data into your system at faster intervals and see what happens. If it can handle an extreme case in burst mode, it will usually handle the slow trickle even more gracefully. Remember that GC.Collect() is expensive to call, which is why the framework only invokes it when its heuristics decide it's necessary.

Are you reading the data in via a stream, or loading it all into memory? In my experience the former gives a consistent memory footprint; with the latter, depending on the size of the XML, it may make the application keel over.
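The two approaches can be sketched side by side. Streaming with XmlReader holds roughly one node in memory at a time, while XmlDocument.Load materialises the whole DOM (fine for 5K files, risky for very large ones):

```csharp
using System;
using System.Xml;

class XmlCounter
{
    // Streaming: constant footprint, one node at a time.
    public static int CountElementsStreaming(string path)
    {
        int count = 0;
        using (XmlReader reader = XmlReader.Create(path))
        {
            while (reader.Read())
                if (reader.NodeType == XmlNodeType.Element)
                    count++;
        }
        return count;
    }

    // In-memory: the entire document tree is built before any work starts.
    public static int CountElementsDom(string path)
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(path);
        return doc.SelectNodes("//*").Count;
    }
}
```

Both return the same answer; the difference only shows up in the memory profile as the input grows.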
