
5-minute file cache in PHP

I have a very simple question: what is the best way to download a file in PHP, but only if a local version has been downloaded more than 5 minutes ago?

In my actual case I would like to get data from a remotely hosted CSV file, for which I currently use

$file = file_get_contents($url);

without any local copy or caching. What is the simplest way to convert this into a cached version, where the end result doesn't change ($file stays the same), but it uses a local copy if it's been fetched recently (say, within the last 5 minutes)?


Use a local cache file, and just check the existence and modification time on the file before you use it. For example, if $cache_file is a local cache filename:

if (file_exists($cache_file) && (filemtime($cache_file) > (time() - 60 * 5 ))) {
   // Cache file is less than five minutes old. 
   // Don't bother refreshing, just use the file as-is.
   $file = file_get_contents($cache_file);
} else {
   // Our cache is out-of-date, so load the data from our remote server,
   // and also save it over our cache for next time.
   $file = file_get_contents($url);
   file_put_contents($cache_file, $file, LOCK_EX);
}

(Untested, but based on code I use at the moment.)

Either way through this code, $file ends up as the data you need, and it'll either use the cache if it's fresh, or grab the data from the remote server and refresh the cache if not.

EDIT: I understand a bit more about file locking since I wrote the above. It might be worth having a read of this answer if you're concerned about the file locking here.

If you're concerned about locking and concurrent access, I'd say the cleanest solution would be to file_put_contents to a temporary file, then rename() it over $cache_file. That should be an atomic operation, i.e. $cache_file will contain either the old contents or the full new contents, never a half-written file.
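
As a rough sketch of that approach (the refreshCache helper name is made up here for illustration; rename() is atomic as long as the temporary file sits on the same filesystem as the cache file):

function refreshCache($url, $cache_file) {
    // Download the new data first.
    $data = file_get_contents($url);

    // Write it to a temporary file in the same directory, then rename it
    // over the cache file, so readers only ever see the old file or the
    // complete new one.
    $tmp = tempnam(dirname($cache_file), 'cache_');
    file_put_contents($tmp, $data);
    rename($tmp, $cache_file);

    return $data;
}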


Try phpFastCache. It supports file caching, so you don't need to write your own cache class, and it's easy to use on shared hosting and a VPS.

Here is an example:

<?php

// swap "files" for memcached, wincache, xcache, apc, or sqlite
$cache = phpFastCache("files");

$content = $cache->get($url);

if($content == null) {
     $content = file_get_contents($url);
     // 300 = 5 minutes 
     $cache->set($url, $content, 300);
}

// use your $content here
echo $content;


Here is a simple version that also passes a Windows User-Agent string to the remote host, so you don't look like a trouble-maker sending requests without proper headers.

<?php

function getCacheContent($cachefile, $remotepath, $cachetime = 120){

    // Generate the cache version if it doesn't exist or it's too old!
    if( ! file_exists($cachefile) OR (filemtime($cachefile) < (time() - $cachetime))) {

        $options = array(
            'method' => "GET",
            'header' => "Accept-language: en\r\n" .
            "User-Agent: Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US)\r\n"
        );

        $context = stream_context_create(array('http' => $options));
        $contents = file_get_contents($remotepath, false, $context);

        file_put_contents($cachefile, $contents, LOCK_EX);
        return $contents;

    }

    return file_get_contents($cachefile);
}
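
For example, to cache a remote CSV for 5 minutes (the path and URL below are placeholders):

// Refresh the local copy at most once every 300 seconds (5 minutes).
$file = getCacheContent('/tmp/remote.csv.cache', 'http://example.com/data.csv', 300);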


If you are using a database system of any type, you could cache this file there. Create a table for cached information, and give it at minimum the following fields:

  • An identifier; something you can use to retrieve the file the next time you need it. Probably something like a file name.
  • A timestamp from the last time you downloaded the file from the URL.
  • Either a path to the file, where it's stored in your local file system, or use a BLOB type field to just store the contents of the file itself in the database. I would recommend just storing the path, personally. If the file was very large, you definitely wouldn't want to put it in the database.

Now, when you run the script above next time, first check the database for the identifier and pull the timestamp. If the difference between the current time and the stored timestamp is greater than 5 minutes, pull from the URL and update the database. Otherwise, load the file from the database.
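
As a rough sketch of that flow, assuming a PDO connection and a table named file_cache with the columns described above (all names here are illustrative, and REPLACE INTO is MySQL-specific):

<?php

// Assumed schema (illustrative):
//   CREATE TABLE file_cache (
//       id         VARCHAR(255) PRIMARY KEY,  -- identifier, e.g. a file name
//       fetched_at INT,                       -- UNIX timestamp of last download
//       path       VARCHAR(255)               -- where the file lives on disk
//   );
function getCachedFile(PDO $pdo, $id, $url, $maxAge = 300) {
    $stmt = $pdo->prepare('SELECT fetched_at, path FROM file_cache WHERE id = ?');
    $stmt->execute(array($id));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row && (time() - $row['fetched_at']) < $maxAge) {
        // Cache row exists and is fresh: read the stored local file.
        return file_get_contents($row['path']);
    }

    // Missing or stale: download again, save locally, record the new timestamp.
    $data = file_get_contents($url);
    $path = '/tmp/cache_' . md5($id);
    file_put_contents($path, $data, LOCK_EX);

    $stmt = $pdo->prepare('REPLACE INTO file_cache (id, fetched_at, path) VALUES (?, ?, ?)');
    $stmt->execute(array($id, time(), $path));

    return $data;
}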

If you don't have a database setup, you could do the same thing just using files, wherein one file, or field in a file, would contain the timestamp from when you last downloaded the file.


First, you might want to check the design pattern: Lazy loading.

The implementation should change to always load the file from the local cache. If the local cache doesn't exist, or its modification time is more than 5 minutes old, fetch the file from the server first.

The pseudo code looks like the following:

$time = filemtime($local_cache)
if ($time === false || (time() - $time) > 300)  # 300 seconds = 5 minutes
     fetch_localcache($url, $local_cache)  # you have to write this yourself
$file = file_get_contents($local_cache)
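
A minimal fetch_localcache for that sketch could look like this (the body below is just a guess at what it would do, not part of the original answer):

function fetch_localcache($url, $local_cache) {
    // Download the remote file and overwrite the local cache copy.
    $data = file_get_contents($url);
    if ($data !== false) {
        file_put_contents($local_cache, $data, LOCK_EX);
    }
    return $data;
}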


A good practice for generating the cache key:


$cacheKey = md5_file('file.php');


You can save a copy of the file on the first hit, then on subsequent hits use filemtime() to check the local file's last modification time.


You would wrap it into a cache-like method:

function getFile($name) {
    // code stolen from @Peter M
    if ($file exists) {
      if ($file time stamp older than 5 minutes) {
         $file = file_get_contents($url)
         save $file to the cache file
      } else {
         $file = file_get_contents($cache file)
      }
    } else {
         $file = file_get_contents($url)
         save $file to the cache file
    }
    return $file;
}


I think you want some (pseudo code) logic like:

if ($file exists) {
  if ($file time stamp older than 5 minutes) {
     $file = file_get_contents($url)
     save $file to the cache file
  } else {
     $file = file_get_contents($cache file)
  }
} else {
     $file = file_get_contents($url)
     save $file to the cache file
}

use $file
