I know it is an implementation detail, and some people think one shouldn't take an interest in such things. But I nevertheless want to find references for, and confirmation of, the following:
I have a C# 4.0 application (single producer/single consumer) which transfers huge amounts of data in chunks. Although there's no new memory allocation, I run out of memory after a while.
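A common cause of this pattern is Large Object Heap fragmentation: each chunk buffer over ~85,000 bytes lands on the LOH, which is never compacted. A minimal sketch of a remedy, assuming the chunks can be passed through recycled buffers (all names here are illustrative, not from the question):

```csharp
using System;
using System.Collections.Concurrent;

// A minimal buffer pool: large buffers are allocated once and recycled,
// so the LOH sees a handful of long-lived blocks instead of a constant
// stream of short-lived ones that leave holes behind.
class BufferPool
{
    private readonly ConcurrentBag<byte[]> _pool = new ConcurrentBag<byte[]>();
    private readonly int _bufferSize;

    public BufferPool(int bufferSize) { _bufferSize = bufferSize; }

    public byte[] Rent()
    {
        byte[] buffer;
        return _pool.TryTake(out buffer) ? buffer : new byte[_bufferSize];
    }

    public void Return(byte[] buffer) { _pool.Add(buffer); }
}
```

With a single producer and consumer, the producer rents a buffer before filling each chunk and the consumer returns it afterwards, so steady state needs only as many buffers as there are chunks in flight.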
If your application has to do a lot of allocation/de-allocation of large objects (>85,000 bytes), it will eventually cause fragmentation of the Large Object Heap, and your application will throw an OutOfMemoryException.
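The failure is easy to reproduce: fill the LOH with blocks, free every other one, and a request larger than any single hole fails even though plenty of memory is free in total. A sketch, assuming a 32-bit process where address space is the real constraint:

```csharp
using System;
using System.Collections.Generic;

class FragmentationDemo
{
    static void Main()
    {
        const int Mb = 1024 * 1024;
        var blocks = new List<byte[]>();

        // Fill the LOH with 1 MB blocks until memory is tight.
        try { while (true) blocks.Add(new byte[Mb]); }
        catch (OutOfMemoryException) { }

        // Free every other block: half the memory is now free, but only
        // as scattered 1 MB holes, because the LOH is not compacted.
        for (int i = 0; i < blocks.Count; i += 2) blocks[i] = null;
        GC.Collect();

        try
        {
            var big = new byte[2 * Mb]; // needs one contiguous 2 MB hole
            Console.WriteLine("2 MB succeeded ({0} bytes)", big.Length);
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("2 MB failed despite ~{0} MB free", blocks.Count / 2);
        }
    }
}
```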
I'm having System.OutOfMemoryException exceptions in my .NET Windows Service. I'm not sure what's causing it. I suspect fragmentation in the large object heap, but I'm not sure. How can I confirm whether that is the case?
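One low-effort way to check is the ".NET CLR Memory" performance counters, which expose the LOH size of a running process. A sketch that polls the service's own counter (the instance name is the process name; it gets a #N suffix when several instances of the same executable run):

```csharp
using System;
using System.Diagnostics;

class LohMonitor
{
    static void Main()
    {
        string instance = Process.GetCurrentProcess().ProcessName;
        var lohSize = new PerformanceCounter(
            ".NET CLR Memory", "Large Object Heap size", instance);

        // A LOH size that keeps climbing while the service's live data set
        // stays flat points at fragmentation rather than a plain leak.
        Console.WriteLine("LOH size: {0:N0} bytes", lohSize.NextValue());
    }
}
```

For a definitive answer, a WinDbg session with SOS (`!dumpheap -stat` and `!eeheap -gc`) shows the free blocks inside the LOH segments directly.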
In my application I need to load large files (about ~250 MB) into memory. I'm doing it in a lazy way: when the user asks to see a file, I load it. After that, every time the user tries to access it, I serve it from memory.
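At 250 MB per file, every cached copy is a giant LOH allocation, and evicted copies leave correspondingly giant holes. One alternative sketch, assuming read-only access, is a memory-mapped file, so the OS pages data in on demand and the managed heap never holds the content at all:

```csharp
using System;
using System.IO.MemoryMappedFiles;

class LazyFileView
{
    static void Main()
    {
        // Hypothetical path; the file is mapped, not copied onto the heap.
        using (var mmf = MemoryMappedFile.CreateFromFile(@"C:\data\big.bin"))
        using (var accessor = mmf.CreateViewAccessor())
        {
            // Pages are faulted in lazily as offsets are touched.
            byte first = accessor.ReadByte(0);
            Console.WriteLine("First byte: {0}", first);
        }
    }
}
```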
Since it is recommended to use the IDisposable pattern for large objects, I am wondering why there seems to be no reliable way to determine the limit above which an object is to be considered "large".
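The limit is not exposed by any API, but it can be probed: an array that went to the LOH reports generation 2 immediately after allocation, while an ordinary array starts in generation 0. A sketch that binary-searches the cutover (the documented figure is 85,000 bytes; the array length comes out slightly lower because the object header counts toward the size, and a badly timed GC can perturb a single probe):

```csharp
using System;

class LohLimitFinder
{
    static void Main()
    {
        // Find the smallest byte[] length allocated on the LOH, using the
        // fact that fresh LOH objects report generation 2.
        int lo = 1, hi = 100000;
        while (lo < hi)
        {
            int mid = (lo + hi) / 2;
            if (GC.GetGeneration(new byte[mid]) == 2)
                hi = mid;
            else
                lo = mid + 1;
        }
        Console.WriteLine("Smallest LOH byte[] length: {0:N0}", lo);
    }
}
```

Note that the threshold is not purely the size rule: on the 32-bit runtime, double[] is special-cased onto the LOH at only 1,000 elements, for alignment reasons.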
I have another active question HERE regarding some hopeless memory issues that possibly involve LOH fragmentation, among other possible unknowns.
I know that all arrays in .NET are limited to 2 GB. Under this premise, I try not to allocate more than n = ((2^31) - 1) / 8 doubles in an array. Nevertheless, even that number of elements still doesn't seem to be possible to allocate.
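The likely explanation is that the 2 GB cap covers the whole array object, header included, not just the element payload, so ((2^31) - 1) / 8 elements of 8 bytes each already overshoots once the overhead is added. Assuming a 64-bit process with enough free memory that the per-object cap rather than RAM is what binds, a probe can narrow down the real maximum:

```csharp
using System;

class MaxDoubleArrayProbe
{
    static bool CanAllocate(int length)
    {
        try { var a = new double[length]; GC.KeepAlive(a); return true; }
        catch (OutOfMemoryException) { return false; }
    }

    static void Main()
    {
        // Binary-search the largest double[] the runtime will hand out.
        int lo = 1, hi = int.MaxValue / 8; // 268,435,455: the naive bound
        while (lo < hi)
        {
            int mid = lo + (hi - lo + 1) / 2;
            if (CanAllocate(mid)) lo = mid; else hi = mid - 1;
        }
        // The answer lands a few elements below int.MaxValue / 8 because
        // the 2 GB limit includes the array's own header.
        Console.WriteLine("Max double[] length: {0:N0}", lo);
    }
}
```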