Caching includes in PHP for iterated reuse

Is there a way to effectively cache a PHP include for reuse, without APC or the like?

Simple (albeit stupid) example:

// rand.php
return rand(0, 999);

// index.php
$file = 'rand.php';
$i = 0;
while($i++ < 1000){
    echo include($file);
}

Again, while ridiculous, this pair of scripts dumps 1000 random numbers. However, on every iteration PHP has to hit the filesystem (correct? There's no inherent caching functionality I've missed, is there?)

Basically, how can I prevent the previous scenario from resulting in 1000 hits to the filesystem?

The only approach I've come up with so far is a goofy one, and it may not prove effective at all (untested, written off the cuff, probably error-prone, but you get the idea):

// rand.php
return rand(0, 999);

// index.php
$file = 'rand.php';
$cache = array();
$i = 0;
while($i++ < 1000){
    if(isset($cache[$file])){
        // re-evaluate the cached source without another filesystem hit
        echo eval('?>' . $cache[$file]);
    }else{
        $cache[$file] = file_get_contents($file);
        echo include($file);
    }
}

A more realistic and less silly example:

When including files for view generation, given that a view file is used a number of times in a given request (a widget or something), is there a realistic way to capture and re-evaluate the view script without a filesystem hit?


This would only make sense if the include file were being accessed across a network.

There is no inherent caching functionality I've missed, is there?

All operating systems are highly optimized to reduce physical I/O and to speed up file operations, so on a properly configured system PHP will rarely have to go back to disk to fetch your code. Sit down with a spreadsheet and think about how long it would take to process PHP code if every file really had to be fetched from disk; it would be ridiculous. For example, suppose your script is /var/www/htdocs/index.php and it includes /usr/local/php/resource.inc.php: that's 8 seek operations just to locate the files (one per path component), and at roughly 8 ms each that's 64 ms before a single byte of code has been read. Run some timings on your test case and you'll see it runs much, much faster than that.
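
If you want to check this yourself, a minimal timing sketch (reusing the file name and iteration count from the question) looks something like this:

// time 1000 includes of the same file; with the file already in the
// OS page cache this typically finishes far faster than a naive
// "every include is a disk seek" estimate would predict
$file  = 'rand.php';
$start = microtime(true);
$i = 0;
while ($i++ < 1000) {
    include $file;   // return value ignored; we only care about the I/O cost
}
printf("1000 includes took %.4f seconds\n", microtime(true) - $start);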


As in Sabeen Malik's answer, you could capture the output of each include with output buffering, concatenate it all together, save that to a file, and then include that one file each time.

This one collective include could be kept for an hour by checking the file's modification time and rebuilding it from the individual includes only once an hour.
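
A minimal sketch of that idea (the view file names, cache file name, and one-hour TTL below are placeholders, not part of the original answer):

// build the collective cache file once per hour, then reuse it
$views     = array('widget.php', 'sidebar.php'); // hypothetical views rendered repeatedly per request
$cacheFile = 'views.cache.html';                 // the one collective file
$ttl       = 3600;                               // rebuild at most once an hour

if (!file_exists($cacheFile) || filemtime($cacheFile) < time() - $ttl) {
    ob_start();
    foreach ($views as $view) {
        include $view;                           // render each view once
    }
    file_put_contents($cacheFile, ob_get_clean());
}

// every subsequent use hits only this one (almost certainly OS-cached) file
include $cacheFile;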


I think a better design would be something like this:

// rand.php
function get_rand() {
    return rand(0, 999);
}

// index.php
$file = 'rand.php';
include($file);

$i = 0;
while($i++ < 1000){
    echo get_rand();
}


Another option:

while($i++ < 1000) echo rand(0, 999);
