
How big of an impact does including PHP files have on performance?

The question pretty much states it all: I am working on a large project where most requests include() between 100 and 150 PHP files. On average, the time PHP takes is between 150 and 300 ms. I'm wondering how much of this is due to including PHP scripts? I've been thinking about running a script that checks the most accessed files for particular calls and merges them into one file to speed things up, but for all I know this has zero impact.

I should note that I use APC. I'm not fully aware of what APC does in the background, but I would imagine it already caches my files somehow, so the number of files doesn't really make a big difference?

Would appreciate any input on the subject.

Of course, 300 ms isn't much, but if I can bring it down to, say, 100 or even 50 ms, that's a significant boost.

Edit:

To clarify, I am talking about file loading by PHP include / require.


File loading is a tricky thing. As others have said, the only surefire way to tell is to do some benchmarks. However, here are some general rules that apply only to PHP script loading, not to files read with fopen:

  • APC will store its opcode cache in shared memory, so you will take a hit on the first load but not on subsequent loads.
  • include and include_once (and their require cousins) are actually quite heavy. Here are some tips to improve their speed:
    • Use absolute paths to your files (avoid relative paths like ../foo.php).
    • Both the _once functions need to check that the file wasn't also included via a symbolic link, since a symbolic link can produce multiple paths to the same file. This check is extremely expensive. (See the next point.)
  • It is much cheaper to load only the files you need than to include everything up front. Make use of autoloaders to load classes only when they are needed (see the sketch after this list).
  • Local disks will almost always be a better bet than networked storage. When possible, if you have multiple servers, keep copies of the source code on each server. It means you need to update multiple places during a release, but it is worth the effort in performance.
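
To illustrate the last two points, here is a minimal autoloader sketch; the src/ directory layout and the Foo class are hypothetical, and a real project would use its own mapping:

<?php
// Load a class file only when the class is first used, instead of
// including every file up front. The path is absolute (built from
// __DIR__), which avoids the include_path search and relative-path
// resolution mentioned above.
spl_autoload_register(function ($class) {
    // Hypothetical layout: each class lives in src/<ClassName>.php
    $file = __DIR__ . '/src/' . str_replace('\\', '/', $class) . '.php';
    if (is_file($file)) {
        require $file;
    }
});

$foo = new Foo(); // triggers exactly one require, for src/Foo.php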

Overall it depends on your hard disk speed. But compared to not loading a file at all, or loading it from RAM, loading a file from disk is incredibly slow.

I hope that helped.


That is quite a few files, but not unexpected if you are using a framework (Zend, by any chance?). The impact of including that many files depends mainly on your server's hard drive speed. Regardless, file access is extremely slow, so if you can, reduce the number of includes.

APC does/can cache the opcodes for all those files in memory, though, meaning no more disk seeks until the cache is invalidated/destroyed.
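
For reference, a minimal php.ini sketch for APC's opcode cache; the values here are illustrative, not tuned recommendations:

; php.ini -- illustrative APC opcode cache settings
apc.enabled = 1     ; turn the opcode cache on
apc.shm_size = 64M  ; shared memory reserved for cached opcodes
apc.stat = 0        ; skip the per-request stat() of each source file;
                    ; with this off, a deploy requires a cache reset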

  • Try turning APC off and see how much of a difference it makes. There should be a noticeable spike in execution time.

  • Try profiling the script with Xdebug (a minimal configuration sketch follows below). You'll most likely find that there are other issues (code issues) that affect performance more than the file access does.
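
A minimal sketch of enabling Xdebug's profiler via php.ini; these are Xdebug 2 setting names (Xdebug 3 replaces them with xdebug.mode = profile):

; php.ini -- Xdebug 2 profiler settings
xdebug.profiler_enable = 1          ; write a cachegrind file per request
xdebug.profiler_output_dir = /tmp   ; inspect with KCachegrind/QCacheGrind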


I know this is a super old question, but I was wondering the same thing and Google brought me here. I wound up doing some benchmarks using PHP 7.4 (now end-of-life) on a small, general-purpose Digital Ocean server with an SSD drive.

Methodology:

I created 260 classes, each in its own file. The files are named A0.php, A1.php, A2.php, ..., Z8.php, Z9.php. At 608 bytes each, they are small, so keep that in mind when reading the results.

I also created a single file, LotsOfClasses.php, with all 260 classes in it. The file was approximately 155k in size.

The scripts were run from the command line with no opcode caching by PHP. One should assume disk caching by the underlying CentOS operating system was working as it normally would.

Here is what the test scripts basically looked like (microtime was chosen over hrtime because it is easier for humans to read):

<?php
$start = microtime(true);
// Include all 260 class files, A0.php through Z9.php, one by one.
foreach (range('A', 'Z') as $letter) {
    foreach (range(0, 9) as $i) {
        include "{$letter}{$i}.php";
    }
}
echo (microtime(true) - $start) . "\n";

and

<?php
$start = microtime(true);
// Include the single file that contains all 260 classes.
include "LotsOfClasses.php";
echo (microtime(true) - $start) . "\n";

I then ran each script in a batch of 100, 3 times:

for i in {1..100}; do php benchmark.php; done > time.txt

I also wrote a script to calculate the average of the times. Nothing fancy, but I'm including it here for completeness:

<?php
// Average the per-run timings collected in time.txt (one value per line).
$file = file_get_contents("time.txt");
$values = array_filter(explode("\n", $file));
echo (array_sum($values) / sizeof($values)) . "\n";

Command Line Results:

The average time it took to include 1 file with 260 classes was 11.79 ms.

The average time it took to include 260 files, each with one class, was 21.42 ms.

Web Server Results:

Astute readers may ask how including a single file of approximately 155k can take 11.79 ms. Does that mean that if your application loaded 100 such files, it would take over a second just to include files? From the command line, the answer is yes. However, PHP is generally run via a web server. When running on the command line, PHP has to parse each file every time. When loading files via the server, PHP uses its opcode cache (OPcache), which is significantly faster.
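
As a rough way to approximate the web-server behaviour from the command line, OPcache can be enabled for CLI; this is a sketch using standard OPcache settings, with benchmark.php standing in for either test script. Each CLI process starts with an empty in-memory cache, so a file cache is what lets compiled scripts survive across runs:

# Enable OPcache for CLI and back it with a file cache so compiled
# scripts persist across separate php invocations (the directory
# must already exist).
php -d opcache.enable_cli=1 -d opcache.file_cache=/tmp/opcache benchmark.php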

The average time it took to include 1 file with 260 classes was 0.056 ms.

The average time it took to include 260 files from the web server was 0.65 ms.

Conclusion:

While the one large file was almost instantaneous, there was only about a 0.6 ms difference between the single file and the 260 smaller files when loading via a web server. The increase is negligible, and combining files would largely be a micro-optimization. Even in a larger app, I can't imagine it being more than a couple of milliseconds' difference. In addition, a single file would likely be harder to maintain. I'd say the tradeoff of loading multiple maintainable files for a negligible increase in time is worth it.

For what it's worth, I also tried include_once with the 260-file version on a web server, and it only increased the time by 0.08 ms.

