PHP curl: curl_setopt() fopencookie failed
I'm trying to write a script that caches images, but I am stuck with the following error message:
Nov 4 12:55:19 centos httpd: PHP Fatal error: curl_setopt() [<a href='function.curl-setopt'>function.curl-setopt</a>]: fopencookie failed in /var/www/html/proxy3.php on line 6
I have prepared a simpler script which still has this problem:
<?php
#phpinfo();
$fh = fopen('/tmp/yahoo.html', 'xb');
if ($fh) {
    $ch = curl_init('http://www.yahoo.com/');
    curl_setopt($ch, CURLOPT_FILE, $fh); # XXX the line 6
    curl_setopt($ch, CURLOPT_HEADER, FALSE);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
    curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
    #curl_setopt($ch, CURLOPT_COOKIEJAR, '/dev/null');
    #curl_setopt($ch, CURLOPT_COOKIEFILE, '/dev/null');
    #curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/cookies.txt');
    #curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/cookies.txt');
    curl_exec($ch);
    if (!curl_errno($ch)) {
        $info = curl_getinfo($ch);
        echo 'Took ' . $info['total_time'] .
             's to send a request to ' . $info['url'];
    }
    curl_close($ch);
    fclose($fh);
} else {
    echo 'Can not open /tmp/yahoo.html';
}
?>
In the /tmp dir I then see a zero-sized file:
afarber@centos:html> ls -alZ /tmp/yahoo.html
-rw-r--r-- apache apache user_u:object_r:httpd_tmp_t /tmp/yahoo.html
Does anybody have an idea what is going wrong here, please?
I've tried setting and not setting CURLOPT_COOKIEJAR and CURLOPT_COOKIEFILE, pointing them to /dev/null and/or /tmp/cookies.txt. I've also tried sudo touch /tmp/cookies.txt; sudo chown apache.apache /tmp/cookies.txt. It just doesn't work.
Actually I don't need cookies in my script; I'd be happy to disable them in cURL.
I'm using fopen(..., 'xb') on purpose, so that only one script instance will write to the cached file in my real script.
I'm using CentOS 5.5 with php-5.1.6-27.el5 and an unmodified php.ini.
Thank you, Alex
P.S. And here is my real image proxy script, which fails with the same fopencookie error message. I can't use fopen(..., 'wb') there; I must use fopen(..., 'xb'):
<?php
define('MIN_SIZE', 1024);
define('MAX_SIZE', 1024 * 1024);
define('CACHE_DIR', '/var/www/cached_avatars/');

$img = urldecode($_GET['img']);
# URL sanity checks omitted here for brevity
$cached = CACHE_DIR . md5($img);

$writefh = @fopen($cached, 'xb');
# the file is not cached yet, download it!
if ($writefh) {
    $ch = curl_init($img);
    curl_setopt($ch, CURLOPT_FILE, $writefh);
    curl_setopt($ch, CURLOPT_HEADER, FALSE);
    #curl_setopt($ch, CURLOPT_REFERER, $matches[1]);
    curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
    #curl_setopt($ch, CURLOPT_COOKIEJAR, '/dev/null');
    #curl_setopt($ch, CURLOPT_COOKIEFILE, '/dev/null');
    #curl_setopt($ch, CURLOPT_COOKIEJAR, CACHE_DIR . 'cookies.txt');
    #curl_setopt($ch, CURLOPT_COOKIEFILE, CACHE_DIR . 'cookies.txt');
    curl_exec($ch);

    $error = curl_errno($ch);
    $length = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
    $mime = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
    $is_image = ($mime != NULL &&
        (stripos($mime, 'image/gif') !== FALSE ||
         stripos($mime, 'image/png') !== FALSE ||
         stripos($mime, 'image/jpg') !== FALSE ||
         stripos($mime, 'image/jpeg') !== FALSE));
    curl_close($ch);
    fclose($writefh);

    if ($error || $length < MIN_SIZE || $length > MAX_SIZE || !$is_image) {
        unlink($cached);
        exit('Download failed: ' . $img);
    }
} else {
    $finfo = finfo_open(FILEINFO_MIME);
    $mime = finfo_file($finfo, $cached);
    $length = filesize($cached);
    finfo_close($finfo);
}

$readfh = fopen($cached, 'rb');
if ($readfh) {
    header('Content-Type: ' . $mime);
    header('Content-Length: ' . $length);
    while (!feof($readfh)) {
        $buf = fread($readfh, 8192);
        echo $buf;
    }
    fclose($readfh);
}
?>
I think the problem is because of the x mode. fopen, when used with x mode, returns false if the file already exists. In your case the file already exists, so $fh will be false, and when this is passed to curl_setopt you get this error.
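A minimal sketch of that failure mode (assuming /tmp/yahoo.html is left over from an earlier run) would be:
<?php
# 'x' mode refuses to open a file that already exists, so fopen() returns false
$fh = @fopen('/tmp/yahoo.html', 'xb');
var_dump($fh); # bool(false) when /tmp/yahoo.html is already there

# checking the handle before handing it to curl_setopt() avoids passing false on
if ($fh === false) {
    exit('/tmp/yahoo.html already exists (or is not writable)');
}
?>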
To fix this, try changing xb to wb.
If what you want is for only one script to access the file at a time, you should use the cb mode plus flock():
$fh = fopen('/tmp/yahoo.html', 'cb');
if (flock($fh, LOCK_EX | LOCK_NB)) {
    //ftruncate($fh, 0); -- truncate the file if that's what you want
    //continue as usual
} else {
    //could not obtain lock (without waiting)
}
Thanks for all the responses. I've ended up with this PHP/cURL script for caching images (needed by Flash apps to circumvent a missing crossdomain.xml); it seems to work OK with CentOS 5 Linux and php-5.1.6-27.el5:
<?php
define('MIN_SIZE', 512);
define('MAX_SIZE', 1024 * 1024);
define('CACHE_DIR', '/var/www/cached_avatars/');

$img = urldecode($_GET['img']);
# img sanity checks omitted here
$cached = CACHE_DIR . md5($img);

if (is_readable($cached)) {
    $finfo = finfo_open(FILEINFO_MIME);
    $mime = finfo_file($finfo, $cached);
    $length = filesize($cached);
    finfo_close($finfo);
} else {
    $writefh = fopen($cached, 'wb');
    if ($writefh) {
        flock($writefh, LOCK_EX);
        $ch = curl_init($img);
        curl_setopt($ch, CURLOPT_FILE, $writefh);
        curl_setopt($ch, CURLOPT_HEADER, FALSE);
        curl_setopt($ch, CURLOPT_REFERER, $matches[1]);
        curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
        curl_setopt($ch, CURLOPT_AUTOREFERER, TRUE);
        curl_exec($ch);

        $error = curl_errno($ch);
        $length = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
        $mime = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
        $is_image = ($mime != NULL &&
            (stripos($mime, 'image/gif') !== FALSE ||
             stripos($mime, 'image/png') !== FALSE ||
             stripos($mime, 'image/jpg') !== FALSE ||
             stripos($mime, 'image/jpeg') !== FALSE));
        curl_close($ch);
        fclose($writefh);

        if ($error || $length < MIN_SIZE || $length > MAX_SIZE || !$is_image) {
            unlink($cached);
            exit('Download failed: ' . $img);
        }
    }
}

$readfh = fopen($cached, 'rb');
if ($readfh) {
    header('Content-Type: ' . $mime);
    header('Content-Length: ' . $length);
    flock($readfh, LOCK_SH);
    while (!feof($readfh)) {
        $buf = fread($readfh, 8192);
        echo $buf;
    }
    fclose($readfh);
}
?>
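For completeness, here is a hypothetical caller sketch showing how the Flash app (or a plain <img> tag) might request the proxy; the proxy.php file name and the host names are assumptions for illustration only, not part of the original setup:
<?php
# hypothetical caller: build the proxy URL for an avatar hosted on another domain
# ('proxy.php' and the example hosts below are assumptions, not the real deployment)
$avatar = 'http://img.example.com/avatars/12345.jpg';
$url = 'http://www.example.com/proxy.php?img=' . urlencode($avatar);
echo '<img src="' . htmlspecialchars($url) . '" alt="avatar">';
?>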