
Speeding up HipHop compile and link time

After some work, I managed to get HipHop up and running on my server. But as I learn and debug the process of reshaping my code into something that works well with the HipHop transformer, I find that I have to recompile it often... yet it takes 10+ minutes to build my 130-file codebase. That seems unusually long in light of recent posts on the Facebook blog about improved compile times.

Has anyone had luck improving their compile time? It could simply be that I am missing an argument out of ignorance... I am more knowledgeable about PHP than C++.

Information from Facebook, as well as my command and logs, is included below.

Facebook

In a Facebook blog post, they say they can compile a huge binary quite quickly.

Besides optimizing the compiled code, a lot of effort has been spent on improving the compiler itself. Several phases in the compiler, including parsing, optimization, and code generation, are now parallelized. Hyves contributed changes to the generated C++ code to make it compile faster without losing any run-time efficiency. We can build a binary that is more than 1GB (after stripping debug information) in about 15 min, with the help of distcc. Although faster compilation does not directly contribute to run-time efficiency, it helps make the deployment process better.

http://www.facebook.com/notes/facebook-engineering/hiphop-for-php-more-optimizations-for-efficient-servers/10150121348198920

Compile Logging

$HPHP_HOME/src/hphp/hphp --input-list=files.list -k 1 --log=3 --include-path="." -v "AllDynamic=true" -o /root/stocks_hphp/

running hphp...
creating temporary directory /tmp/hphp_h3vCKc ...
parsing inputs...

#parsing inputs took 0'00" (330 ms) wall time
pre-optimizing...
pre-optimizing took 0'00" (150 ms) wall time
inferring types...
inferring types took 0'00" (160 ms) wall time
post-optimizing...
post-optimizing took 0'00" (100 ms) wall time
creating CPP files...
creating CPP files took 0'00" (590 ms) wall time
saving code errors...
compiling and linking CPP files...

compiling and linking CPP files took 11'50" (710315 ms) wall time


The answer's right there in the excerpt you quoted:

We can build a binary that is more than 1GB (after stripping debug information) in about 15 min, with the help of distcc.

distcc is a distributed compiler. They have a large set of machines that they use to compile in parallel. You probably won't get the same sort of performance on a single computer.
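
For anyone who does have spare machines available, below is a minimal sketch of how distcc is usually wired into a make-driven C++ build. The helper host names, the core counts, and the assumption that the hphp-generated build honors a CXX override are all illustrative; none of this is confirmed by the HipHop docs or the log above.

# On each helper machine, install distcc and start the daemon,
# allowing connections from the build box's subnet (example range).
distccd --daemon --allow 192.168.1.0/24

# On the build box, list the machines that may receive compile jobs.
# "hostname/N" caps that host at N parallel jobs; localhost still helps.
export DISTCC_HOSTS="localhost/4 buildhelper1/8 buildhelper2/8"

# Drive the generated build with distcc wrapping g++, and enough -j slots
# to keep all listed cores busy (assumes the build respects CXX and -j).
make -j20 CXX="distcc g++"

Even without extra machines, two cheaper wins on a single box are worth trying, assuming the generated makefile can be driven by hand: run the compile step with -j$(nproc) so every local core is used, and put ccache in front of g++ so generated .cpp files that did not change are not recompiled on every iteration. Neither changes the generated code, only how it is compiled.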

