
How do I run my application code (PHP) across my various Amazon EC2 instances?

I've been trying to get to grips with Amazon's AWS services for a client. As is evidenced by the very n00bish question(s) I'm about to ask, I'm having a little trouble wrapping my head around some very basic things:

a) I've played around with a few instances and managed to get LAMP working just fine. The problem I'm having is that the code I place in /var/www doesn't seem to be shared across those machines. What do I have to do to achieve this? I was thinking of a shared EBS volume and changing Apache's document root?

b) Furthermore, what is the best way to upload code and assets to an EBS/S3 volume? Should I set up an instance to handle FTP to the aforementioned shared volume?

c) Finally, I have a basic plan for the setup that I wanted to run by someone who actually knows what they are talking about:

  • DNS pointing to a Load Balancer (AWS Elastic Load Balancing).
  • Load Balancer managing multiple AWS EC2 instances.
  • EC2 instances sharing code from a single EBS store.
  • An RDS instance to handle database queries.
  • CloudFront to serve assets directly to the user.

Thanks, Rich.

Edit: my solution, for anyone who comes across this on Google.

Please note that my setup is not finished yet, and the bash scripts I'm providing in this explanation are probably not very good; even though I'm very comfortable with the command line, I have no experience of scripting in bash. However, it should at least show you how my setup works in theory.

All AMIs are Ubuntu Maverick i386 from Alestic.

I have two AMI Snapshots:

  • Master
    • Users
      • git - very limited access; runs git-shell, so it has no interactive SSH login, but it hosts a Git repository which can be pushed to and pulled from.
      • ubuntu - default SSH account, used to administer the server and deploy code.
    • Services
      • Simple Git repository hosting via SSH.
      • Apache and PHP; databases are hosted on Amazon RDS.
  • Slave
    • Services
      • Apache and PHP; databases are hosted on Amazon RDS.

Right now (this will change) this is how I deploy code to my servers:

  1. Merge changes to the master branch on my local machine.
  2. Stop all slave instances.
  3. Use Git to push the master branch to the master server.
  4. Log in to the ubuntu user via SSH on the master server and run a script which does the following (a rough sketch of both scripts follows this list):
    1. Exports (git-archive) the code from the local repository to a folder.
    2. Compresses the folder and uploads a backup of the code to S3 with a timestamp attached to the file name.
    3. Replaces the code in /var/www/ with the exported folder and gives appropriate permissions.
    4. Removes the exported folder from the home directory but leaves the compressed file containing the latest code intact.
  5. Start all slave instances. On startup they run a script which:
    1. Does not start Apache straight away; Apache is only started at the end of the script.
    2. Uses scp (secure copy) to copy the latest compressed code from the master to /tmp/www.
    3. Extracts the code, replaces /var/www/, and gives appropriate permissions.
    4. Starts Apache.
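
To make that flow more concrete, here is a rough sketch of what the two scripts could look like. These are simplified placeholders rather than my real scripts: the repository path, archive name, bucket name, and the master's hostname are made up, and the S3 upload uses the AWS CLI purely as an example.

```bash
#!/bin/bash
# Master-side deploy sketch (illustrative names and paths only).
set -e

REPO=/home/git/myapp.git            # assumed bare repository path
EXPORT_DIR=$(mktemp -d)             # temporary export location
ARCHIVE=/home/ubuntu/code-latest.tar.gz
BUCKET=my-deploy-bucket             # assumed S3 bucket name
STAMP=$(date +%Y%m%d-%H%M%S)

# 1. Export the master branch from the bare repository.
git --git-dir="$REPO" archive --format=tar master | tar -x -C "$EXPORT_DIR"

# 2. Compress the export and keep a timestamped backup on S3.
tar -czf "$ARCHIVE" -C "$EXPORT_DIR" .
aws s3 cp "$ARCHIVE" "s3://$BUCKET/backups/code-$STAMP.tar.gz"

# 3. Replace the document root and fix ownership.
sudo rm -rf /var/www/*
sudo cp -R "$EXPORT_DIR"/. /var/www/
sudo chown -R www-data:www-data /var/www

# 4. Remove the export folder; the archive stays for the slaves to fetch.
rm -rf "$EXPORT_DIR"
```

The slave start-up script is the mirror image (again, the master's hostname and paths are placeholders):

```bash
#!/bin/bash
# Slave start-up sketch: fetch the latest archive from the master over scp,
# unpack it, then start Apache.
set -e

MASTER=master.internal              # assumed private hostname of the master
mkdir -p /tmp/www

# Copy the latest compressed code from the master.
scp ubuntu@"$MASTER":/home/ubuntu/code-latest.tar.gz /tmp/www/

# Replace the document root and fix ownership.
sudo rm -rf /var/www/*
sudo tar -xzf /tmp/www/code-latest.tar.gz -C /var/www
sudo chown -R www-data:www-data /var/www

# Only now start Apache.
sudo service apache2 start
```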

My full scripts are still very incomplete and I need more time before posting them properly. I also want all my assets (css/js/img) to be pushed to S3 automatically so they can be distributed to clients via CloudFront.
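
The asset push will probably end up as a simple sync with long cache headers, along these lines (again only a sketch; the bucket name and local path are placeholders):

```bash
# Hypothetical: push static assets to S3 with a long cache lifetime so
# CloudFront can serve them aggressively (bucket name and path are assumptions).
aws s3 sync ./public/assets "s3://my-assets-bucket/assets" \
    --cache-control "max-age=31536000"
```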


EBS is like a hard drive you can attach to one instance, basically a 1:1 mapping. S3 is the only shared storage offering in AWS; otherwise you will need to set up an NFS server or similar.

What you can do is put all your PHP files on S3 and then sync them down to a new instance when you start it.

I would recommend bundling a custom AMI with everything you need installed (Apache, PHP, etc.) and setting up a cron job to sync the PHP files from S3 to your document root. Your workflow would then be: upload files to S3, and let the server's cron job sync them down.
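
For example, a crontab entry along these lines would do it (just a sketch; the bucket name is a placeholder, and s3cmd or any similar tool would work equally well):

```bash
# Hypothetical crontab entry: every 5 minutes, mirror the PHP code from S3
# into Apache's document root (bucket name is an assumption; cron may need
# the full path to the sync tool).
*/5 * * * * aws s3 sync s3://my-code-bucket/www /var/www --delete
```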

The rest of your setup seems pretty standard.
