
Whole program optimization failing in VC2008

I have a reasonably large C++ program (~11MB exe) compiled under VS2008, and I was interested to see whether whole program optimization would significantly affect its performance. However, turning on whole program optimization and link time code generation causes the link to fail as follows:

1>c:\cpp\Win32\Atlas\tin\TINDoc.Cpp : fatal error C1083: Cannot open compiler intermediate file: '.\releaseopt\TINDoc.obj': Not enough space
1>LINK : fatal error LNK1257: code generation failed

Looking at Task Manager, I can see the linker using more and more memory until it runs out and bombs out. The compiler is running on 32-bit XP with 2GB of RAM and a 2GB page file. Is WPO limited to smaller applications and/or bigger environments, or is there any way to get the linker to be a bit more frugal in its memory usage?

N.B. I've already turned off precompiled headers, which were causing the compilation to fail before linking, and turned off output of debug info and anything else that might take extra resources. The help for C1083 suggests missing header files or inadequate file handles rather than lack of space.
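
For reference, "whole program optimization" and "link time code generation" in the project settings map to the /GL compiler switch and the /LTCG linker switch, so a rough command-line equivalent (file and output names here are just placeholders) is:

    cl /c /O2 /GL TINDoc.Cpp ...
    link /LTCG /OUT:Atlas.exe TINDoc.obj ...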

Edit: Got it working under VS2010, albeit without precompiled headers, but the performance gains aren't that significant. I'll leave this option alone until I move on to a beefier 64-bit platform with a more robust version of VS2010.


VC2008 is a fragile beast. The optimiser just doesn't work in some cases, and it looks like you might have one such case.

Examples of "not working" include

  • De-optimised code (slow!)
  • Compiler or (more frequently) linker crashes
  • Obscure compile/link error messages
  • Incorrect code execution (this one is rare, but not unknown).

Actually, if you look at some of the tricks used by modern optimisers, it is pretty amazing that they work as often as they do. The complexity is quite astounding.

Addressing Shane's problem specifically, some things that might cause your error:

  1. Running out of file handles used to be a big problem in DOS and early Windows versions. You hardly ever see it in modern Windows versions. I'm not even certain there is still a limit on file handles.

  2. A compiler bug could be causing an infinite loop, in which case no amount of resources will be enough.

  3. A recursive include file could also cause something similar. Do all your include files have "#pragma once" or "#if !defined(FOO_INCLUDED)" guards? (See the sketch after this list.)

  4. Is it possible you've included the file TINDoc.obj twice in the project? The 2008 compiler is aggressively multi-threaded and there might be contention between two threads trying to access the file. Actually this could happen via a compiler bug, even if you haven't included the file twice.

  5. If the program really is just too big, consider breaking it up into one or more DLLs and building it piecemeal, or splitting the contentious source file into multiple files to compile in multiple phases.
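
Regarding point 3, a minimal sketch of the two guard styles (Foo.h and FOO_INCLUDED are just placeholder names):

    // Foo.h -- either style stops the same header being expanded over and over

    // Style A: compiler-specific pragma (supported by VC2008)
    #pragma once

    // Style B: classic include guard
    #if !defined(FOO_INCLUDED)
    #define FOO_INCLUDED
    // ... declarations ...
    #endif // FOO_INCLUDED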

Don't assume that, just because it says "not enough space", that is actually the reason. Some programs (including some compilers) will guess at a reason for an error instead of checking.

You can monitor the use of memory, handles, etc. using Task Manager or perfmon, or (my preference) Process Explorer.

I already posted the first part of this as a comment (above) but I am making it an answer, as suggested by Ben (http://stackoverflow.com/users/587803/ben), and have expanded it somewhat.


You can try booting XP with the /3GB switch to give the linker a 3GB address space. In Vista/7, use BCDEDIT /Set IncreaseUserVa 3072 on the command line to get the same effect.

This has fixed problems creating .ilk-files for me.
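
For reference, a sketch of both forms (adapt the boot.ini entry to your own OS line; run the bcdedit command from an elevated prompt):

    boot.ini (32-bit XP), [operating systems] section:
      multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB

    Vista/7, elevated command prompt:
      bcdedit /set IncreaseUserVa 3072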
