
Why can't I get more than 11 GB of allocated memory in an x64 .NET process?

I thought that the maximum user address space for a 64-bit process was 8 TB, but I did a little test and the maximum I could get is 10-11 GB.

Note: I don't need that much memory in a process, I just want to understand why out of curiosity.

Here is my test program:

static void Main(string[] args)
{
    List<byte[]> list = new List<byte[]>();

    while (true)
    {
        Console.WriteLine("Press any key to allocate 1 more GB");
        Console.ReadKey(true);
        list.Add(new byte[1024 * 1024 * 1024]);

        Console.WriteLine("Memory size:");
        double memoryUsage = Process.GetCurrentProcess().PeakVirtualMemorySize64 / (double)(1024 * 1024 * 1024);
        Console.WriteLine(memoryUsage.ToString("0.00") + " GB");
        Console.WriteLine();
    }
}

EDIT:

Updated the test program to be more deterministic.

To accept an answer, I would like to know how the real maximum of allocatable memory is determined, if 8 TB is only theoretical.


It's your machine.

I have an x64 machine with 8 GB RAM and a 12 GB page file; I ran your program and it topped out at 16.23 GB.

EPILOG: Then my Win7 install gradually slid into a coma as critical processes were apparently memory starved.

EDIT: If you want to understand how Windows allocates (i.e. reserves and commits) memory, read Pushing the Limits of Windows: Physical Memory and Pushing the Limits of Windows: Virtual Memory.

Since .NET relies on Windows to manage the memory it uses to build the GC heap, the mechanics of how Windows reserves and commits memory are reflected in how .NET allocates at a low level.
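You can see the gap between address space the OS is tracking and pages actually backed by physical RAM from inside a process. A minimal sketch using the `System.Diagnostics.Process` counters (these are standard .NET APIs; the exact numbers printed will of course vary by machine):

```csharp
using System;
using System.Diagnostics;

class ReserveVsCommit
{
    static void Main()
    {
        using var proc = Process.GetCurrentProcess();

        // VirtualMemorySize64: address space the process has reserved/committed.
        // WorkingSet64: pages actually resident in physical RAM.
        // The difference is memory Windows is tracking for the process but has
        // not (or no longer) backed with physical pages.
        Console.WriteLine($"Virtual: {proc.VirtualMemorySize64 / (1024.0 * 1024 * 1024):0.00} GB");
        Console.WriteLine($"Working set: {proc.WorkingSet64 / (1024.0 * 1024 * 1024):0.00} GB");
    }
}
```

Even for a trivial process the virtual size is typically far larger than the working set, which is the reserve-vs-commit distinction the Russinovich articles explain in detail.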


It's *up to* 8 TB, not a guaranteed 8 TB. You can potentially have up to 8 TB, but you need the matching RAM/page file to back it.


I'm guessing it's because you're using a List, which I believe has an internal limit.

See what you can get if you try something like creating your own old school list:

public class ListItem<T>
{
    public ListItem<T> Parent;
    public T Value;

    public ListItem(ListItem<T> parent, T item)
    {
        this.Parent = parent;
        this.Value = item;
    }
}

I have written almost exactly that code before (only my item was an int) and ran it on a machine with 32 processors and 128 GB of RAM. It always crapped out at the same size no matter what, and the limit was always something related to Int32.MaxValue. Hope that helps.


Try allocating one chunk (as opposed to a list of 1 GB chunks):

Dim p As IntPtr = System.Runtime.InteropServices.Marshal.AllocHGlobal(New IntPtr(24L * 1024 * 1024 * 1024))

Edit - Given your comment that you only have 4 GB of physical RAM, you really have no business allocating more than ~8 GB, and even that is pushing it.

Edit -

To accept an answer, I would like to know how the real maximum of allocatable memory is determined, if 8 TB is only theoretical.

The maximum amount of memory you can allocate is roughly (page file size - size of everything in the page file that cannot or will not be paged) + (physical RAM size - size of everything that cannot or will not be paged, i.e. whatever is needed to keep your system going: kernel, drivers, the .NET runtime, etc.).
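As a back-of-the-envelope sketch of that formula, using figures loosely matching the question's machine (4 GB RAM; the 6 GB page file and the "pinned"/resident sizes are assumptions for illustration, not measured values):

```csharp
using System;

class CommitLimitEstimate
{
    static void Main()
    {
        // Hypothetical figures: 4 GB physical RAM, 6 GB page file,
        // ~1 GB unpageable (kernel, drivers, non-paged pool) and
        // ~1 GB held by everything else that stays resident. All assumed.
        long physicalRamGb = 4;
        long pageFileGb = 6;
        long unpageableGb = 1;
        long otherResidentGb = 1;

        // Rough ceiling on what one process can allocate before the
        // system runs out of commit (ignoring page file growth).
        long estimateGb = (physicalRamGb - unpageableGb) + (pageFileGb - otherResidentGb);
        Console.WriteLine(estimateGb + " GB");
    }
}
```

With these assumed numbers the estimate lands at 8 GB, which is the same order as the "~8 GB and even that is pushing it" figure above.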

Of course the page file can grow...

Sooner or later paging to/from disk becomes too much, and your system slows to a crawl and becomes unusable.

Read Mark Russinovich's blog:

  • Pushing the Limits of Windows: Physical Memory
  • Pushing the Limits of Windows: Virtual Memory
  • Pushing the Limits of Windows: Paged and Nonpaged Pool
