Smart makefile for C++ at function level [closed]

It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center. Closed 11 years ago.

Copy/pasting from:

http://forums.cgsociety.org/showthread.php?f=89&t=1010489

Don't you find it quite annoying, when working on a large .cpp that uses heavy template functions, that when you change one character in a function you sometimes have to wait a whole minute for the whole file to recompile? Some of you might have moved parts of the large .cpp into new files, but I was considering a more automated process.

So I was wondering whether there is a makefile and a compiler that support building a .cpp at function level, meaning that if you change one function, only that function is recompiled instead of the whole file. If there isn't such a thing, here is a basic idea for a Python script:

Preliminaries:

There would be a directory for each file.cpp, containing the copy of the .cpp from the last build (we will call it the old copy) and one func*.cpp file for each function in file.cpp.

Now for the build process:

  1. If file.cpp is newer than its old copy (compare by file date), then it needs rebuilding.
  2. Diff the old and new file.cpp and conclusively locate the changed functions. If something changed besides functions, recreate the whole file.cpp directory.
  3. For each changed function, create a new func.cpp that contains all the headers and the prototypes of all the functions that came before it in file.cpp (see the sketch after this list).
  4. Run the regular build process for all func*.cpp files.
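
As a sketch of step 3, a generated excerpt for one changed function might look like this (all file, function, and variable names here are made up, purely to illustrate the layout):

    // func_bar.cpp -- auto-generated excerpt of file.cpp (hypothetical names)
    #include "file_headers.h"   // every header that file.cpp itself included

    // Prototypes of all functions that appear before bar() in file.cpp,
    // so that bar() can still call them.
    int  foo(int x);
    void init_tables();

    // The body of the one changed function, copied verbatim from file.cpp.
    int bar(int x)
    {
        init_tables();
        return foo(x) + 1;
    }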

Some details:

  1. There would also be a file holding all the global variables, and each func*.cpp would contain, besides the prototypes of the previous functions, extern declarations for the previous global variables (see the sketch after this list).
  2. Note that debugging would naturally be done on the func*.cpp files.
  3. The connection between a function symbol and its .cpp can be made through a symbol db file.
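
To make detail 1 concrete, the generated pieces could look roughly like this (again with hypothetical names): globals.cpp is compiled once, and every func*.cpp only sees extern declarations for the variables it uses.

    // globals.cpp -- holds the definitions of file.cpp's file-scope variables
    int   g_counter = 0;
    float g_scale   = 1.0f;

    // func_baz.cpp -- a per-function excerpt refers to them via extern
    extern int   g_counter;
    extern float g_scale;

    void baz()
    {
        g_counter += static_cast<int>(g_scale);  // links against globals.cpp
    }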

Comments:

  1. It would be a smart compilation step, using any current compiler tools, that would fit smoothly into the build pipeline.
  2. Think about the implications: it would be as if every function had its own separate .cpp file. Can you imagine the boost in average compilation time?
  3. For those of you who delegate compilation to link time (the Visual Studio optimization options /GL and /LTCG), note that you wait much longer and it doesn't have much effect, at least for daily work with the code, so I personally disable them.
  4. It seems that someone has already given the idea a name: http://en.wikipedia.org/wiki/Incremental_compiler Has anyone tried the IBM VisualAge C++ compiler for Windows?


There are some problems with the idea of compiling only excerpts of the .cpp file.

  1. The compilation unit has more than just functions. You also have variables at file scope. Your excerpt builder would have to detect these variables and declare them as extern, and you would have to create a separate excerpt for the variables too. Static variables at file scope must be kept static, so they can't be declared extern for the other excerpts (see the sketch after this list).

  2. Even if it's considered evil, the C++ preprocessor supports #define, which affects the whole compilation unit. Your excerpt builder would have to detect all #defines that are not suppressed by #if clauses.

  3. You get different output files when you compile all the excerpts compared to compiling the complete .cpp file. This affects the linking process.
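
A minimal example of problems 1 and 2, with made-up names; the #define and the static variable below are visible to every function in file.cpp, and neither can simply be moved out of the way:

    // file.cpp (hypothetical)
    #define BUFFER_SIZE 512           // affects every function below it
    static int s_cache[BUFFER_SIZE];  // internal linkage: not visible to
                                      // other translation units via extern

    int read_slot(int i)
    {
        return s_cache[i % BUFFER_SIZE];
    }

    void clear_cache()
    {
        for (int i = 0; i < BUFFER_SIZE; ++i)
            s_cache[i] = 0;
    }

    // Splitting read_slot() and clear_cache() into separate excerpts would
    // either duplicate s_cache (two distinct arrays) or force it to external
    // linkage, and each excerpt would need its own copy of the #define.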

Altogether this would make the compilation process much more complicated. It would be much more worthwhile to factor the compilation unit into smaller files.


While this idea could indeed help and reduce compilation time, there are a number of problems that would make it impractical. In addition to the problems pointed out by harper, I can think of these:

  • compilers have no memory; they don't keep track of the changes you make to source files.

  • optimizing compilers don't work on functions independently; they look at the code as a whole to decide how to generate efficient code. To use a smart rebuild technology like this you would need to disable optimizations.

  • inline functions would clearly not work with this.

  • methods of class templates would be difficult to support, as a change in one function requires recompiling every instantiation of the template (see the sketch after this list).
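
For instance, with a class template (hypothetical code), editing one member function changes the code generated for every instantiation, so there is no single piece of object code that could be rebuilt in isolation:

    // Somewhere in the large .cpp
    template <typename T>
    struct Vec {
        T x, y;
        Vec scale(T s) const { return {x * s, y * s}; }  // implicitly inline
    };

    // Editing the body of scale() invalidates the code generated for both
    // instantiations below, and for every caller that inlined it.
    Vec<float>  vf{1.0f, 2.0f};
    Vec<double> vd{3.0, 4.0};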

My opinion is that the standard solutions to shorten compilation time will improve your situation enough that something like this would not be necessary. I recommend the following:

  • Do not #include header files inside other header files whenever possible. Use forward declarations instead.

  • Do not write function/method implementations in header files, except for inlines.

  • Keep the size of your source files short. I always write one class per file.

  • Do not expose class implementation details in header files. Techniques such as the Pimpl idiom are great for this (see the sketch after this list).
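
A minimal sketch of the first and last points combined (class and file names are made up): widget.h exposes no implementation details, so a change inside widget.cpp recompiles only widget.cpp.

    // widget.h -- forward declarations and an opaque Impl pointer only
    #include <memory>

    class Renderer;                    // forward declaration instead of #include

    class Widget {
    public:
        Widget();
        ~Widget();                     // defined where Impl is complete
        void draw(Renderer& r);
    private:
        struct Impl;                   // Pimpl: defined only in widget.cpp
        std::unique_ptr<Impl> pimpl_;
    };

    // widget.cpp -- the only translation unit that needs the heavy headers
    #include "widget.h"
    #include "renderer.h"

    struct Widget::Impl { /* caches, template-heavy members, etc. */ };

    Widget::Widget() : pimpl_(new Impl) {}
    Widget::~Widget() = default;
    void Widget::draw(Renderer& r) { /* work with *pimpl_ and r */ }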

With the above suggestions you can get very close to having to recompile just the file(s) that changed and nothing more.


Don't you find it quite annoying, when working on a large .cpp that uses heavy template functions, that when you change one character in a function you sometimes have to wait a whole minute for the whole file to recompile?

A whole minute? No, I don't, actually. You've already worried about this problem for much longer than a minute. You're better off waiting, or perhaps not making such tiny changes to your source file.
