Tools and Methods [closed]
What tools and procedures would you recommend or use yourself to help streamline the following scenario? (I know it's a long one, but any help is appreciated.)
I work in a team that develops an ecommerce app at our company. It's a reasonably standard LAMP application that we have been developing on and off for about 3 years. We develop the application on a testing domain, where we add new features and fix bugs. Our bug tracking and feature development is all managed within a hosted Subversion solution (unfuddle.com). As bugs are reported, we make the fixes on the testing domain and then commit the changes to SVN once we are happy the bug has been fixed. We follow the same procedure when adding new features.
It is worth pointing out the general architecture of our system and application across our servers. Each time a new feature is developed, we roll the update out to all sites using our application (always on a server we control). Each site using our system essentially shares the same files for 95% of the codebase. We have a couple of folders within each site that contain files bespoke to that site (CSS files, images, etc.). Other than that, the differences between each site are defined by various configuration settings within each site's database.
This brings us to the actual deployment itself. As and when we are ready to roll out an update of some kind, we run a command on the server that hosts the testing site. This performs a copy (cp -fru /testsite/ /othersite/) and goes through each vhost, force-updating files based on modification date. Each additional server we host on has a vhost that we rsync the production codebase to, and we then repeat the copy procedure for all sites on that server. During this process we move out the files we don't want to be overwritten, moving them back when the copy has completed. Our rollout script performs a number of other functions, such as applying SQL commands to alter each database, adding fields, new tables, etc.
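To make this concrete, the per-server copy step is roughly equivalent to the following (the paths and excluded folder names are simplified placeholders, and our real script moves the bespoke folders out and back rather than using rsync excludes):

```bash
#!/bin/bash
# Simplified, hypothetical sketch of our per-server rollout step.
TESTSITE=/var/www/testsite               # testing vhost (assumed path)
SITES="/var/www/site1 /var/www/site2"    # production vhosts on this server (assumed)

for SITE in $SITES; do
    # --exclude keeps the bespoke per-site folders untouched, approximating
    # the "move out, cp -fru, move back" dance we currently do by hand
    rsync -a \
        --exclude='css/' \
        --exclude='images/' \
        "$TESTSITE/" "$SITE/"
done
```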
We have become increasingly concerned that our process is not stable enough, not fault-tolerant, and a bit of a brute-force method. We're also aware that we are not making the best use of Subversion: we are in a position where working on a new feature can prevent us from rolling out an important bug fix, because we are not using branches or tags. It also seems wrong that we have so much replication of files across our servers, and we are not able to easily roll back what we have just rolled out. We do perform a diff before each rollout to get a list of the files that will be changed, so we know what has changed afterwards, but the process of rolling back would still be problematic.

In terms of the database, I've started looking into dbdeploy as a potential solution. What we really want, though, is some general guidance on how we can improve our file management and deployment. Ideally we want the file management to be more closely linked to our repository, so that a rollout or rollback would be more connected to SVN - something like using the export command to make sure the site files are the same as the repo files. It would also be good if the solution stopped the file replication around our servers.
Ignoring our current methods, it would be really good to hear how other people approach the same problem.
To summarise:
What is the best way for making files across multiple servers stay in sync with svn?
How should we prevent file replication? Symlinks, or something else?
How should we structure our repo so we can develop new features and fix old ones?
How should we trigger rollouts/rollbacks?
Thanks in advance.
For rollback and testing out new features, the standard Subversion concepts of branches and tags should be sufficient:
- always create a tag before each rollout, and roll out that tag. Rollback then means returning to the previous tag (see the sketch after this list).
- develop new features in branches and merge to the trunk when completed; alternatively, develop new features in trunk, and have a maintenance branch that receives only bug fixes.
- keep the per-site files in separate directories in subversion, and use a configuration file on each site, or a symbolic link, to have sites refer to their specific files.
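A minimal sketch of the tag-and-export approach, assuming a standard trunk/tags layout and made-up repository URL, tag name, and paths:

```bash
#!/bin/bash
REPO=https://svn.example.com/ecommerce    # assumed repository URL
RELEASE=release-42                        # assumed tag name

# create an immutable tag of what is about to go live
svn copy "$REPO/trunk" "$REPO/tags/$RELEASE" -m "Tag $RELEASE for rollout"

# export the tag (no .svn metadata) into the deployment area
svn export "$REPO/tags/$RELEASE" "/var/www/releases/$RELEASE"

# rolling back is just deploying the previous tag again, e.g.:
# svn export "$REPO/tags/release-41" /var/www/releases/release-41
```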
To reduce file duplication, I recommend using NFS (in particular when all sites are virtual machines on the same host: make the host the NFS server and the sites NFS clients; alternatively, make a dedicated VM the NFS server). To deploy an update, install the new files only on the NFS server; the clients will pick up the changes automatically.
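A rough sketch, assuming Linux hosts and a made-up export path, client subnet, and server name:

```bash
# On the NFS server: share the codebase (entry added to /etc/exports)
#   /srv/ecommerce  192.168.1.0/24(ro,sync,no_subtree_check)
sudo exportfs -ra                      # re-export after editing /etc/exports

# On each site VM: mount the shared codebase read-only
sudo mount -t nfs nfsserver:/srv/ecommerce /var/www/shared
# (add an equivalent entry to /etc/fstab so the mount survives reboots)
```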
If you need a multi-step update (e.g. first update the database on each client, then update the code), you should still use NFS, but add symlinks on top of it. Check out the new code into a separate directory on the NFS server, then go to each VM, update its database, and change the symlink on that VM to point to the new code. When done, remove the old code on the NFS server.
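A sketch of the symlink switch, again with hypothetical paths; `ln -sfn` replaces the link in a single step, so the vhost never serves a half-updated tree:

```bash
# On the NFS server: put the new release next to the old one
svn export https://svn.example.com/ecommerce/tags/release-42 \
    /srv/ecommerce/releases/release-42

# On each VM, after its database has been migrated:
# DocumentRoot points at /var/www/current, a local symlink into the NFS mount
ln -sfn /var/www/shared/releases/release-42 /var/www/current

# Once every VM has been switched, delete the old release on the NFS server
rm -rf /srv/ecommerce/releases/release-41
```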
You may want to look at this article, which covers deployment of PHP apps: http://blog.digitalstruct.com/2009/10/07/deployments-php-applications/
Specifically it mentions a few tools which might help:
- Phing
- Ant
- Liquibase
- DbDeploy
I have also heard a few people mention using Capistrano, so you might want to look at that too.
EDIT:
From looking at this poll (http://twtpoll.com/3zwfox), it seems that SVN export is a common method in the community for deploying PHP apps. The poll appears to have been used in this SlideShare presentation: http://www.slideshare.net/ccornutt/taming-the-deployment-beast
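For what it's worth, the plain svn export approach from the poll can be as simple as this (hypothetical URL and path):

```bash
# export straight into the document root; --force allows writing into a
# non-empty directory, and no .svn metadata ends up on the live site
svn export --force https://svn.example.com/shop/trunk /var/www/site
```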