Why does a 32-bit application use less RAM than the 64-bit version, even though the 32-bit executable is larger?

I have created an application in .NET. When I compile a 64-bit version and a 32-bit version of the same software, the 64-bit executable is smaller.

However, when I run both, the 64-bit version uses more RAM.

I'm sure something is happening "under the hood", and I was just curious why. (It's not a worry either way.)

Thanks.

EDIT: C#.NET 4.0 if it matters.


In 32-bit applications, pointers are 32 bits (4 bytes), whereas in 64-bit applications they are 64 bits (8 bytes). So pointers (e.g. object references) take up twice as much memory.
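You can see the pointer width directly at runtime; `IntPtr.Size` and `Environment.Is64BitProcess` are both available in .NET 4.0. A minimal check:

```csharp
using System;

class PointerSizeCheck
{
    static void Main()
    {
        // IntPtr.Size is the native pointer width of the current process:
        // 4 bytes in a 32-bit process, 8 bytes in a 64-bit one.
        Console.WriteLine("Pointer size: {0} bytes", IntPtr.Size);
        Console.WriteLine("Running as 64-bit: {0}", Environment.Is64BitProcess);
    }
}
```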

Also, objects themselves are bigger: the minimum size of any object on the managed heap is 12 bytes in a 32-bit process and 24 bytes in a 64-bit process. Double again.
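As a rough way to observe this per-object cost, here's a sketch that allocates a million empty objects and divides the measured heap growth by the count. `GC.GetTotalMemory` is not an exact instrument, so treat the numbers as approximate:

```csharp
using System;

class ObjectOverheadCheck
{
    static void Main()
    {
        const int Count = 1000000;
        var objects = new object[Count];

        long before = GC.GetTotalMemory(true);  // force a full collect first
        for (int i = 0; i < Count; i++)
            objects[i] = new object();
        long after = GC.GetTotalMemory(true);

        // Empty objects cost roughly 12 bytes each in a 32-bit process
        // and roughly 24 bytes each in a 64-bit process.
        Console.WriteLine("~{0} bytes per empty object",
                          (after - before) / Count);

        GC.KeepAlive(objects);  // keep the array reachable through measurement
    }
}
```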

These effects show up at runtime, not in the DLL size.


Pointers are twice as big in 64-bit mode. That could explain some (sometimes much) of the difference in RAM usage.
