
Can memory be cleaned up?

I am working in Delphi 5 (with FastMM installed) on a Win32 project, and have recently been trying to drastically reduce the memory usage in this application. So far, I have cut the usage nearly in half, but noticed something when working on a separate task. When I minimized the application, the memory usage shrank from 45 megs down to 1 meg, which I attributed to it paging out to disk. When I restored it and resumed working, the memory went up only to 15 megs. As I continued working, the memory usage slowly climbed again, and a minimize-and-restore flushed it back down to 15 megs. So to my thinking, when my code tells the system to release the memory, it is still being held on to according to Windows, and the actual garbage collection doesn't kick in until much later.

Can anyone confirm/deny this sort of behavior? Is it possible to get the memory cleaned up programmatically? If I keep using the program without this manual flush, I eventually get an out of memory error, and I would like to eliminate that. Thanks.

Edit: I found an article on about.com that covers much of this, along with links and details on other areas of memory management.


Task Manager doesn't show the total that the application has allocated from Windows. What it shows (by default) is the working set. The working set is a concept that's designed to try to minimize page file thrashing in memory-constrained conditions. It's basically all the pages in memory that the application touches on a regular basis, so to keep this application running with decent responsiveness, the OS will endeavour to keep the working set in physical memory.

On the theory that the user does not care much about the responsiveness of minimized applications, the OS trims their working set. This means that, under physical memory pressure, pages of virtual memory owned by that process are more likely to be paged out to disk (to the page file) to make room.

Most modern systems don't have paging issues for most applications for most of the time. A severely page-thrashing machine can be almost indistinguishable from a crashed machine, with many seconds or even minutes elapsing before applications respond to user input.

So the behaviour that you are seeing is Windows trimming the working set on minimization, and then increasing it back up over time as the application, restored, touches more and more pages. It's nothing like garbage collection.

If you're interested in memory usage by an application under Windows, there is no single most important number, but rather a range of relevant numbers:

  • Virtual size - this is the total amount of address space reserved by the application. Address space (i.e. what pointers point to) may be unreserved, reserved, or committed. Unreserved memory may be allocated in the future, either by a memory manager, or by loading DLLs (the DLLs have to go somewhere in memory), etc.

  • Private working set - this is the pages that are private to this application (i.e. are not shared across multiple running applications, such that a change to one is seen by all), and are part of the working set (i.e. are touched frequently by the app).

  • Shareable working set - this is the pages in the working set that are shareable, but may or may not actually be shared. For example, DLLs or packages (BPLs) may be loaded into the application's memory space. The code for these DLLs could potentially be shared across multiple processes, but if the DLL is loaded only once into a single application, then it is not actually shared. If the DLL is highly specific to this application, it is functionally equivalent to private working set.

  • Shared working set - this is the pages from the working set that are actually shared. One could imagine attributing the "cost" of these pages for any one application as the amount shared divided by the number of applications sharing the page.

  • Private bytes - this is the pages from the virtual address space which are committed by this application, and that aren't shared (or shareable) between applications. Pretty much every memory allocation by an application's memory manager ends up in this pool. Only pages that get used with some frequency need to become part of the working set, so this number is usually larger than the private working set. A steadily increasing private bytes count indicates either a memory leak or a long-running algorithm with large space requirements.

These numbers don't represent disjoint sets. They are different ways of summarizing the states of different kinds of pages. For example, working set = private working set + shareable working set.

Which one of these numbers is most important depends on what you are constrained by. If you are doing I/O using memory-mapped files, the virtual size limits how much memory you can devote to the mappings. If you are in a physical-memory constrained environment, you want to minimize the working set. If you have many different instances of your application running simultaneously, you want to minimize private bytes and maximize shared bytes. If you are producing a bunch of different DLLs and BPLs, you want to be sure that they are actually shared, by making sure their load addresses don't cause them to clash and prevent sharing.
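Several of these counters can be read at runtime through the psapi API. The sketch below assumes the PsAPI unit is available (on Delphi 5 you may need to declare GetProcessMemoryInfo and TProcessMemoryCounters yourself); it prints the working set and the approximate private bytes for the current process:

uses Windows, PsAPI;

procedure ReportMemoryCounters;
var
  pmc: TProcessMemoryCounters;
begin
  pmc.cb := SizeOf(pmc);
  // GetProcessMemoryInfo fills in the working set and page file usage counters
  if GetProcessMemoryInfo(GetCurrentProcess, @pmc, SizeOf(pmc)) then
  begin
    WriteLn('Working set:   ', pmc.WorkingSetSize div 1024, ' KB');
    // PagefileUsage corresponds roughly to the "private bytes" figure above
    WriteLn('Private bytes: ', pmc.PagefileUsage div 1024, ' KB');
  end;
end;

If the second figure keeps climbing while the application is idle, that is the number to investigate.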

About SetProcessWorkingSetSize:

Windows usually handles the working set automatically, depending on memory pressure. The working set does not determine whether or not you're going to hit an out of memory (OOM) error. The working set is used to make decisions about paging, i.e. what to keep in memory and what to leave on disk (in the case of DLLs) or page out to disk (other committed memory). Changing it won't have any effect unless there is more virtual memory allocated than physical memory in the system.

As to its effects: if the lower bound is set high, it means the process will be hostile to other applications, and try to hog memory, in situations of physical memory pressure. This is one of the reasons why it requires a security right, PROCESS_SET_QUOTA.

If the upper bound is set low, it means that Windows won't try hard to keep pages in physical memory for this application, and that Windows may page most of it out to disk when physical memory pressure gets high.

In most situations, you don't want to change the working set details. Usually it's best to let the OS handle it. It won't prevent OOM situations. Those are usually caused by address space exhaustion, because the memory manager couldn't commit any more memory; or in systems with insufficient page file space to back committed virtual memory, when space in the page file runs out.


This is what we use in DSiWin32:

procedure DSiTrimWorkingSet;
var
  hProcess: THandle;
begin
  // SetProcessWorkingSetSize requires a handle with PROCESS_SET_QUOTA access
  hProcess := OpenProcess(PROCESS_SET_QUOTA, false, GetCurrentProcessId);
  try
    // Passing -1 ($FFFFFFFF on Win32) for both bounds asks Windows
    // to trim the working set for this process
    SetProcessWorkingSetSize(hProcess, $FFFFFFFF, $FFFFFFFF);
  finally CloseHandle(hProcess); end;
end; { DSiTrimWorkingSet }
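If you only ever trim your own process, the OpenProcess/CloseHandle pair can be skipped: the pseudo-handle returned by GetCurrentProcess already carries full access to the process, including PROCESS_SET_QUOTA. A minimal equivalent sketch:

procedure TrimOwnWorkingSet;
begin
  // The GetCurrentProcess pseudo-handle needs no OpenProcess/CloseHandle;
  // -1 for both bounds tells Windows to trim the working set
  SetProcessWorkingSetSize(GetCurrentProcess, $FFFFFFFF, $FFFFFFFF);
end;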


Let's get this straight: FastMM4 does not leak memory, your code might.

To know for sure, execute this instruction somewhere in your application (with FastMM4 in the uses clause and ManualLeakReportingControl defined, in FastMM4Options.inc for instance):

ReportMemoryLeaksOnShutdown := True;

FastMM4 will then report at the end if you forgot to free some memory.
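To see the report in action, this minimal sketch deliberately leaks one object; on shutdown FastMM4 then lists the leaked class (assuming FastMM4 is the first unit in the project file's uses clause, so it installs itself before any allocation):

program LeakDemo;

uses
  FastMM4; // must be first so its memory manager handles every allocation

begin
  ReportMemoryLeaksOnShutdown := True;
  TObject.Create; // never freed - reported as a leaked TObject on shutdown
end.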

If you wish to know a bit more, you can watch this video from CodeRage 2: Fighting Memory Leaks for Dummies


After learning from Barry Kelly's excellent answer, try analysing your process using VMMap from Sysinternals. This analyses the memory usage of a single process in even more detail than Process Explorer: "VMMap is the ideal tool for developers wanting to understand and optimize their application's memory resource usage." It has a useful helpfile, too.


Task Manager doesn't show what your program is actually using. It shows the total that the memory manager has allocated from Windows. When you free an object or otherwise deallocate dynamically allocated memory, it gets returned to the memory manager (FastMM) immediately. Whether or not that's passed back to Windows is another matter. The memory manager likes to keep some extra memory sitting around so it doesn't need to grab more from the OS every time you need to create a new object. (This is a good thing, and you don't want to change it.)

If your program's memory usage is continually increasing, instead of hitting a steady state at some point, you might want to look around and see if you're leaking memory somewhere. And as Mark mentioned, Delphi doesn't use automatic garbage collection. Just in case you weren't aware of this, make sure you're either freeing your objects or handing their ownership off to something that will free them when they're no longer needed.
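The standard idiom for that last piece of advice is a try..finally block, which guarantees the Free call runs even if an exception is raised while the object is in use. A minimal sketch:

var
  sl: TStringList;
begin
  sl := TStringList.Create;
  try
    sl.Add('work with the object here');
  finally
    sl.Free; // always runs, even if an exception was raised above
  end;
end;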


I have read about this before but have no direct experience. Calling the Win32 API function SetProcessWorkingSetSize() is supposed to "fix" the problem. Again, I have no direct experience with this.


I recently had a very similar problem with my program. See my question: Why does my Delphi program’s memory continue to grow?

Despite my being convinced it was something else, it turned out to be major memory leaks caused by just a few mistakes in my code to free the memory.

Before you do anything else, be absolutely certain you are releasing all your memory properly.
