MSBuild / Visual Studio distributed builds
I develop and maintain an application that takes a long time to build (as in, a full build takes in excess of six hours!). Having spent most of the day building our application, I've started looking into ways of improving build time. The suggestions on this Stack Overflow question were:
- Fixing compile warnings
- Unity builds (for developers)
- Distributed builds
I'd like to know more about how to do the third option (distributed builds) for an MSBuild / Visual Studio build system.
Go have a look at http://www.xoreax.com/ for Incredibuild. It is not free, but we use it and it's pretty impressive. It is well integrated into Visual Studio and extremely simple to use. You can run into a problem every now and then, but it's definitely worth a look.
Once it's installed, the principle is to use the "Build Solution with IncrediBuild" menu entry in Visual Studio instead of the regular "Build Solution" one. All needed files are transparently transferred to remote machines, and the output files are copied back.
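For scripted or unattended builds there is also a console tool, BuildConsole. The invocation below is a sketch from memory, so treat the exact flags as an assumption; the solution name and configuration are placeholders and may vary between IncrediBuild versions:

```
rem build the whole solution across the IncrediBuild agent pool
BuildConsole MySolution.sln /build /cfg="Release|Win32"
```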
I have spent way too much time over the last ten years making C++ builds go much faster, and parallel builds can do wonders, but simple things sometimes make a surprising difference.
You didn't state many specifics, so I will ask...
What is the magnitude of the project? How many source files, etc.? A metric I have used in the past is to target an average of one second per source module (.cpp); with sources ranging from 200 to over 32,000 (not a typo!) lines, this has served me well. By that measure, for example, a 3,600-file project should full-build in roughly an hour, so a six-hour build means either a very large project or a problem worth hunting down.
What are the specifications of your build machine, and is it only a build machine, or is it being used for other work concurrently? Silly-slow hard drives, too little RAM, and competing processes can all have disastrous effects on build time.
Are you building monolithic static libraries and application(s)? If so, converting to DLLs can sometimes lower overall build times by shrinking the individual link units. This is particularly true of optimized builds with link-time code generation.
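As a rough illustration, assuming the Visual Studio 2010 .vcxproj format (older .vcproj files spell this differently), the static-library-to-DLL switch is a one-line configuration change; the condition below is a placeholder:

```xml
<PropertyGroup Label="Configuration" Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
  <!-- was StaticLibrary; DynamicLibrary links a DLL, a much smaller link unit -->
  <ConfigurationType>DynamicLibrary</ConfigurationType>
  <!-- enables /GL (and /LTCG at link time), which is where monolithic libs hurt most -->
  <WholeProgramOptimization>true</WholeProgramOptimization>
</PropertyGroup>
```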
Are your projects using precompiled headers efficiently? If a project is set to automatically generate precompiled headers, my assertion is that you should test a build with all precompiled header usage turned OFF. In Visual Studio 2003 and Visual Studio 2005 I found that to actually be faster and more reliable than the automatic generation. Proper precompiled headers (generated by one file that does only that, and used by all other files) seem to be the best course for build speed. It is, sadly, a truly black art, and somewhat trial-and-error, to find the best PCH content for a given project; that content is NOT necessarily stable over the lifetime of the project.
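A minimal sketch of the "one file creates, everyone else uses" pattern, in the VS2010 .vcxproj format (VS2003/2005 .vcproj files express the same /Yc and /Yu compiler switches differently); the file names are the conventional placeholders:

```xml
<!-- every source file uses the precompiled header (cl /Yu"stdafx.h")... -->
<ItemDefinitionGroup>
  <ClCompile>
    <PrecompiledHeader>Use</PrecompiledHeader>
    <PrecompiledHeaderFile>stdafx.h</PrecompiledHeaderFile>
  </ClCompile>
</ItemDefinitionGroup>
<ItemGroup>
  <!-- ...except the one file whose only job is to create it (cl /Yc"stdafx.h") -->
  <ClCompile Include="stdafx.cpp">
    <PrecompiledHeader>Create</PrecompiledHeader>
  </ClCompile>
</ItemGroup>
```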
The above are things which will only help, in addition to your parallelism quest.
Investigate the feasibility of splitting your project into multiple projects that can be built independently. You could have a project for the UI, and DLL projects for a data access layer, business logic, etc. This should make debugging quite a bit easier and structure your codebase in a way that lends itself to distributed builds.
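In the VS2010 project format, such a split shows up as ProjectReference items, which also hand MSBuild the dependency graph it needs to build independent projects in parallel; the path and GUID below are placeholders:

```xml
<ItemGroup>
  <!-- the UI project depends on the business-logic DLL project -->
  <ProjectReference Include="..\BusinessLogic\BusinessLogic.vcxproj">
    <Project>{11111111-2222-3333-4444-555555555555}</Project>
  </ProjectReference>
</ItemGroup>
```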
In theory, with Visual Studio 2010 you could write your own MSBuild tasks that schedule work and stream code to other machines for compilation.
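Nothing like that exists out of the box; the fragment below only sketches the MSBuild plumbing you would hang such a task on. The RemoteCompile task and its assembly are entirely hypothetical and would have to be written by you:

```xml
<!-- hypothetical custom task; the DLL implementing it does not exist -->
<UsingTask TaskName="RemoteCompile" AssemblyFile="RemoteCompile.dll" />
<Target Name="DistributedCompile">
  <!-- ship the translation units to remote machines and collect the .obj files -->
  <RemoteCompile Sources="@(ClCompile)" />
</Target>
```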
With previous versions of Visual Studio you could override cl.exe, lib.exe, and link.exe with wrappers of your own. This technique has been used successfully to add compilation and linking for unsupported platform types.
TeamBuild will give you a distributed build system that integrates nicely with Visual Studio.
Scott Hanselman has a post on parallel builds here. It is not fully distributed, but it might help some.
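The two knobs that approach revolves around, both of which I believe apply here, are project-level parallelism via MSBuild's /m switch (MSBuild 3.5 and later) and file-level parallelism via the compiler's /MP switch; the solution and file names are placeholders:

```
rem build up to four projects concurrently, following the dependency graph
msbuild MySolution.sln /m:4

rem inside a single project, let cl.exe compile source files in parallel
cl /MP /c a.cpp b.cpp c.cpp
```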
Also, note that MSBuild does not build C++ projects itself (prior to Visual Studio 2010, anyway); it actually calls vcbuild to do the dirty work for it.
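For reference, a vcbuild invocation looks roughly like the following (from memory, so treat the exact syntax as an assumption; the project and configuration names are placeholders):

```
rem build one .vcproj in a given configuration, as MSBuild does under the hood
vcbuild MyProject.vcproj "Release|Win32"
```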