
Uploading PHP scripts to a live environment

I have a PHP app that requires semi-frequent code updates. What I do now is that I bring down the app for maintenance whenever I have to upload new scripts, effectively turning the app off for all users except myself.

If I don't do this, I always see a lot of "unexpected $end" error messages in the logs as PHP tries to interpret half-uploaded scripts, which I of course want to avoid.

My question is: Is there a safe way of doing this without bringing the app down for maintenance? In an environment with a lot of simultaneous users, would uploading to a temp directory and then moving the files locally on the server be fast enough to avoid those errors? Can it be automated somehow for a convenient workflow?

Thanks!


I think a good practice is to create a checkout for the new release (you should use a version control system), and after everything is in its place, just symlink to the new directory, like this:

Before update:

/live => Symlink to /release-2011-02-01

After update:

/live => Symlink to /release-2011-02-02

Then you can clean up old releases after some time.
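The switch itself can be made atomic by creating the new symlink under a temporary name and renaming it over the old one. A minimal sketch, assuming release directories named as above (the `live.tmp` name is just an arbitrary temporary):

```shell
#!/bin/sh
set -e

# Hypothetical new release directory, as in the example above.
RELEASE="release-2011-02-02"
mkdir -p "$RELEASE"

# Create the new link under a temporary name, then rename it over
# the old one. rename() is atomic on the same filesystem, so no
# request ever sees a half-switched docroot. (-T is GNU mv.)
ln -sfn "$RELEASE" live.tmp
mv -T live.tmp live

readlink live   # -> release-2011-02-02
```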


One very simplistic approach that works with FTP is to have two directories:

/site_live
/site_shadow

When a new version comes up, upload it to site_shadow. When the upload is done, swap the two directories by renaming (via a temporary name, since you can't rename both at once).

This works fine and without interruptions; if you store any user data in the app, you'd have to move those directories too.
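The swap boils down to three renames. A sketch, assuming the new version has already been uploaded into site_shadow (the VERSION marker files are only there to illustrate which copy ends up where):

```shell
#!/bin/sh
set -e

# Set up the two directories with marker files for illustration.
mkdir -p site_live site_shadow
echo old > site_live/VERSION
echo new > site_shadow/VERSION

# Swap via a temporary name. Renames are near-instant, but note
# there is a brief window between the first two mv calls in which
# site_live does not exist at all.
mv site_live site_old
mv site_shadow site_live
mv site_old site_shadow
```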

The extended version of this works with version numbers from your version control software, storing each revision in a specific directory.

/site_live 
/site_101
/site_102
/site_106

Instead of renaming directories, the symlink approach shown by @schneck is the best by far if it's available in your environment!


If you frequently need to update scripts, then you have a design flaw in your app. You would only change a script if there is a bug in it; a major update should only be required if you want a complete overhaul of your app.

If the changes are everyday changes (say, the price of an item in an e-shop), then you should maintain some sort of database table and have those values updated there, removing any need to update a script.

With this in mind, you may want to consider redesigning your app.


I think the safest way to accomplish this is to create a shadow copy of your site for environment testing.

You would have a separate sub domain like so:

http://dev.mysite.com

which requires HTTP authentication and IP validation to prevent anyone else from having access to it. You would then build a migration script to handle the process of moving the files from dev to the main live directory.

The way I would do this is like so:

  • Initiate maintenance mode on the live site
  • Fire the migration script
  • Check through the site that the new code is effective
  • Remove the maintenance mode

The migration script is not too hard to make; it's basically a script that would do the following:

  • Back your live site up to a secret location
  • Recursively go through your directory structure for the live site several times
    • First iteration: if a file exists on dev but not on live, copy the file there
    • Second iteration: if the file exists on both dev and live, check MD5 hashes and last-modified timestamps to see if the file should be overwritten
    • Third iteration: create an MD5 sum file that will be used for the next update
  • Then you would modify your revisions accordingly.

Building a process like this is simple but does require some work; creating an administration panel to help with it will really make your life easier.


After posting my answer, I would advise you to go with @schneck's answer as it's much more manageable.


There are lots of ways to reduce the problem - most of the other answers boil down to making a local mirror of the site, applying the changes there, then switching the mirror in place of the live code. In effect these just shrink the window during which the new files are being written - they don't actually eliminate the problem. E.g. suppose you've got 20 files to update, and the file at the top of the list requires one from the bottom of the list?

I'm guessing that this is a managed service - a VPS, or a dedicated/shared host - which rules out transferring the accessible IP address to another cluster node while the file update takes place. That approach can completely eliminate downtime, assuming you don't have dependencies on a common storage substrate (e.g. sticky session handling, database structure).

It is possible to suspend activity to a certain extent while the operation completes, using an auto-prepend script (PHP's auto_prepend_file directive), something like......

<?php

if (file_exists('/somewhere/uploading_new_content')) {
    // the following should be implemented as a separate file
    // which is only included when necessary....
    $targettime = 30; // assuming the update takes 30 seconds....
    $sleeptime = $targettime - time() + filemtime('/somewhere/uploading_new_content');
    if ($sleeptime > 0) {
        sleep($sleeptime);
    }
    print "Sorry - your request was paused due to essential maintenance<br />\n";
    $dest = htmlentities($_SERVER['REQUEST_URI']);
    if (count($_POST)) {
        print "<form method='POST' action='$dest'>";
        foreach ($_POST as $key => $val) {
            $val = htmlentities($val);
            $key = htmlentities($key);
            print "<input type='hidden' name='$key' value='$val'>\n";
        }
        print "<input type='submit' name='uploading_pause' value='continue'>\n";
        print "</form>\n";
    } else {
        print "<a href='$dest'>continue</a>";
    }
    exit;
}
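On the deployment side, the flag file the prepend script checks would be created just before the upload starts and removed when it finishes, so paused requests resume on their own. A hypothetical wrapper (the flag path is shown relative here; in practice it would match whatever path the prepend script tests):

```shell
#!/bin/sh
set -e

# Hypothetical flag file; the auto-prepend script sleeps while
# this exists and its mtime marks when the upload began.
FLAG=./uploading_new_content

touch "$FLAG"
# ... upload / sync the new scripts here ...
rm -f "$FLAG"
```

The prepend script itself is enabled via auto_prepend_file in php.ini (or a per-directory .htaccess with mod_php), so it runs before every request without touching the application code.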
