
How to download/copy file from server A to server B?

I am using the code below to download a zip from server A and save it on server B (a copy). It does not always work: sometimes the transfer never completes and the file arrives truncated, while other times it goes through fine. Can I improve this code in any way, or should I use cURL to do the same thing?

This is my code:

// from server A to server B
$filename = 'http://domain.com/file.zip';
$dest_folder = TEMPPATH.'/';

// open both files in binary mode; fgets() is line-oriented and unsafe for zip data
$out_file = fopen(basename($filename), 'wb');
$in_file = fopen($filename, 'rb');

if ($in_file && $out_file) {

    // copy in fixed-size chunks rather than "lines"
    while (!feof($in_file)) {
        $chunk = fread($in_file, 8192);
        fwrite($out_file, $chunk);
    }
    fclose($in_file);
    fclose($out_file);

    $zip = new ZipArchive();
    $result = $zip->open(basename($filename));
    if ($result === true) {
        $zip->extractTo($dest_folder);
        $zip->close();
    }

}

The problem is that it is not consistent: the file does not transfer completely every time, it often arrives incomplete, and then the script fails.


$filename = 'http://domain.com/file.zip';
$local = basename($filename); // wget saves the file under its remote name
echo `wget $filename`;
echo `unzip $local`;

or

  $ch = curl_init();
  $timeout = 5;
  curl_setopt($ch, CURLOPT_URL, $url);
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
  curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
  $data = curl_exec($ch);
  curl_close($ch);
  $fp = fopen($destfile, 'w');
  fwrite($fp, $data);
  fclose($fp); // close the handle so the write is flushed to disk

Really though, you need to figure out why it is failing. Is the zip operation killing it? Is the php script timing out because it took too long to execute? Is it running out of memory? Is the server on the other end timing out? Get some error reporting and debug data and try to figure out why it's not working. The code you have should be fine, and reliable.
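To act on that advice, here is a minimal debugging sketch of the cURL approach that surfaces errors instead of suppressing them. The function name, URL, and timeout values are placeholders, not part of the original answer:

```php
// Hypothetical debugging wrapper around the cURL download above.
// $url and $destfile are examples only -- substitute your own values.
error_reporting(E_ALL);            // surface every warning and notice
ini_set('display_errors', '1');

function debug_download(string $url, string $destfile): bool
{
    $fp = fopen($destfile, 'w');
    if ($fp === false) {
        echo "cannot open $destfile for writing\n";
        return false;
    }

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);          // stream straight to disk
    curl_setopt($ch, CURLOPT_FAILONERROR, true);  // treat HTTP >= 400 as failure
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_TIMEOUT, 300);       // give a slow transfer time to finish

    $ok = curl_exec($ch);
    if ($ok === false) {
        echo 'cURL error: ' . curl_error($ch) . "\n";
    }
    $info = curl_getinfo($ch);                    // status code, bytes transferred, ...
    curl_close($ch);
    fclose($fp);

    return $ok !== false && $info['http_code'] === 200;
}
```

Logging `curl_error()` and `curl_getinfo()` on each failed run should tell you whether the remote server, a timeout, or the local write is the culprit.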


  1. Have you checked the timeout settings on your server? They may be causing your script to time out before the transfer completes.
  2. Make sure your server's settings allow opening external URLs with fopen, and that you have the access rights needed to fetch this file.
  3. Make sure the firewall on server A is allowing server B and not simply blocking its IP.
  4. Try curl, or file_get_contents and file_put_contents. They are likely to work too and avoid the manual read loop.
  5. Check whether the problem is in the ZipArchive step or in fetching the file itself.
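Point 4 above can be sketched roughly as follows. This assumes `allow_url_fopen` is enabled; the helper name and paths are illustrative, and note that the whole file is held in memory, so this suits small archives:

```php
// Sketch of point 4: replace the chunked read loop with a single
// file_get_contents() / file_put_contents() pair.
function copy_remote(string $url, string $dest): bool
{
    $data = file_get_contents($url);   // requires allow_url_fopen = On for http URLs
    if ($data === false) {
        return false;                  // download failed
    }
    // file_put_contents returns the number of bytes written, or false
    $written = file_put_contents($dest, $data);
    return $written === strlen($data); // reject partial writes
}

// Example usage (URL is a placeholder):
// copy_remote('http://domain.com/file.zip', basename('file.zip'));
```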


The fact that you are having temperamental issues suggests you may be having the same issue I've encountered - which has nothing to do with the code.

I'm pulling my zip from a remote server using cURL and then extracting the locally saved zip. Sometimes it works, sometimes not... this caused some serious hair pulling to begin with.

I'm uploading my zip via filezilla and what I have found is it frequently crashes out, retries a few times and eventually works. The uploaded file has the correct file size and looks like it's successfully uploaded but if I download it again sometimes it's simply corrupted and can't be unzipped.

So long as I make sure the uploaded zip is intact, my script works fine. Here it is:

$zip_url = "http://www.mydomain.com.au/";
$version = "1.0.1.zip"; // zip name

$ch = curl_init();
$tmp_zip = fopen($version, 'w'); // open local file for writing
curl_setopt($ch, CURLOPT_URL, "$zip_url$version"); // pull remote file
curl_setopt($ch, CURLOPT_FILE, $tmp_zip); // save to local file
$data = curl_exec($ch); // execute (returns true/false when CURLOPT_FILE is set)
curl_close($ch);
fclose($tmp_zip); // close local file

// extract latest build
$zip = new ZipArchive;
$zip->open($version);
$result = $zip->extractTo("."); // extract to this directory
$zip->close();

if ($result) @unlink($version); // delete local zip if extracted
else echo "failed to unzip";

One big difference in my code from the previous answer is I'm using CURLOPT_FILE rather than CURLOPT_RETURNTRANSFER. You can read why CURLOPT_FILE is better for large transfers at: www.phpriot.com/articles/download-with-curl-and-php
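Given the corrupted-upload problem described above, it can also help to verify the zip before extracting it. A hedged sketch (the function name is invented, and the expected checksum would come from your own server; `ZipArchive::CHECKCONS` asks for a consistency check on open):

```php
// Sketch: validate a downloaded zip before extracting it.
function zip_is_valid(string $path, ?string $expected_md5 = null): bool
{
    if (!is_file($path)) {
        return false;
    }
    // If the server publishes a checksum, compare it first:
    // a mismatch means the transfer was truncated or corrupted.
    if ($expected_md5 !== null && md5_file($path) !== $expected_md5) {
        return false;
    }
    $zip = new ZipArchive();
    $ok  = $zip->open($path, ZipArchive::CHECKCONS); // verify archive consistency
    if ($ok === true) {
        $zip->close();
        return true;
    }
    return false; // open() returned an error code instead of true
}
```

Calling this right after the cURL download, and re-downloading on failure, would turn the intermittent corruption into a retry instead of a broken extraction.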
