How to do version control via ftp?
I have a web dev. client using a shared host that doesn't allow shell access, and thus no access to SVN, Git, etc. I've tried to convince him to move to one of the many cheap options that allow it, but he won't do it. If I use version control on my staging server, are there any tools that will allow me to replicate the changes to production via ftp? Locally I have both mac & windows, the staging server is linux, so something that works on any of those platforms....
Using your Linux staging server you could keep a separate checked out copy that you use specifically for that host and then use a utility to mirror that directory with the host server.
LFTP is useful for this kind of thing. It's available for most Linux distributions and includes a 'mirror' function:
Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
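As a minimal sketch of such a deploy step, you could generate an lftp script and run it; the host name, user, and directory paths below are placeholders for your own setup:

```shell
#!/bin/sh
# Generate an lftp script that mirrors the staging checkout out to the
# shared host. Host, user, and paths are placeholders for your setup.
cat > deploy.lftp <<'EOF'
open -u deployuser ftp.example.com
mirror -R --only-newer /var/www/staging-checkout /public_html
quit
EOF

# Run it once the placeholders are filled in (requires lftp):
# lftp -f deploy.lftp
```

The `-R` flag reverses the usual direction, so the local checkout is pushed up to the remote directory instead of being pulled down.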
Some kind of ftp mirror software is what you need. I haven't tested it, but a quick search turned up this Java application. You could run that over your up-to-date checked-out repository.
A good tool for keeping an SVN repo and an FTP copy in sync is svn2web. May I suggest creating a separate branch for the production copy and doing merges to that branch for uploading to the production server.
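Sketched as commands, that branching workflow might look like the following; the repository URL and paths are hypothetical, and each command is echoed rather than executed so nothing touches a real repo until you swap the echo out:

```shell
#!/bin/sh
# Sketch of the branch-for-production workflow. The repository URL is a
# placeholder; 'run' echoes each command so nothing touches a real repo.
REPO='http://svn.example.com/myproject'
run() { echo "$@"; }   # replace 'echo "$@"' with "$@" to execute for real

# One-time: branch trunk into a production branch.
run svn copy "$REPO/trunk" "$REPO/branches/production" -m "create production branch"

# Per release: merge trunk into a production working copy and commit,
# then upload that working copy (e.g. with svn2web or an FTP mirror).
run svn checkout "$REPO/branches/production" production-wc
run svn merge "$REPO/trunk" production-wc
run svn commit production-wc -m "merge trunk for release"
```

Keeping production on its own branch means only deliberately merged changes ever reach the host.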
You probably need to write a batch file that is able to
- Export the SVN repository
- Upload the exported files to your Linux server via FTP
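Those two steps could be sketched as a small shell script; the repository URL, credentials, and paths are placeholders, and the lines that need network access are left commented out:

```shell
#!/bin/sh
# Step 1: export a clean copy of the repo (no .svn metadata).
# Step 2: push the exported tree to the host over FTP with lftp.
# URL, user, host, and paths are placeholders.
REPO='http://svn.example.com/myproject/trunk'

# svn export "$REPO" ./export        # step 1 (needs network access)

cat > upload.lftp <<'EOF'
open -u deployuser ftp.example.com
mirror -R ./export /public_html
quit
EOF
# lftp -f upload.lftp                # step 2 (needs lftp and credentials)
```

Using `svn export` rather than `svn checkout` keeps the `.svn` metadata directories off the production server.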
Short of finding / implementing some FUSE-based CoW file system that supports immutable versions, I'd just find another (more developer-friendly) host. As far as I know, no FTP server supports this natively, nor can I think of any elegant means of putting it in place with script hackery.
I could be wrong.
This question (and answer) really helped me just now as I implemented version control via gitolite on a separate server and lftp.
Here's what I did:
- Set up gitolite on my ubuntu staging server
- Created the base repo (i.e. foo.git) on the staging server
- Cloned foo.git into a working directory on the staging server
- Cloned foo.git into a working directory on my local development machine
- Developed locally
- Pushed changes to the foo.git repo on the staging server
- On the staging server, logged into the working directory and pulled in the changes from foo.git
- lftp-ed into the shared host (like you mention above)
- Once in the shared host, ran:
  mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory
Notes on the mirror command options:
- -R : pushes the source/directory to the target/directory. (Without it, mirror pulls in from target to source; think "reverse".)
- --only-newer : without this option, even if you only changed one file, the mirror command will send all the files in the source directory over to the target directory. With this option, only the changed (newer) files are transferred over the wire.
- --delete : deletes files that are no longer in the source directory but still in the target directory. One of my pushes involved deleting expired assets; without this option, those files would have stayed put on my shared host after executing the mirror command.
- --parallel=10 : transfers 10 files at once (instead of 1 by default). This made the process much faster.
While this is what worked for me, I'm sure there are ways to improve on this. I was grateful for this question and thought I'd share my experience.
Rsync can also do this kind of mirroring, though note that it needs shell or rsync-daemon access on the remote end rather than a plain FTP connection. You probably already have it installed if you're on a Unix-like system.