
How do I display protected Amazon S3 images on my secure site using PHP?

I am trying to move images for my site from my host to Amazon S3 cloud hosting. These images are of client work sites and cannot be publicly available. I would like them to be displayed on my site preferably by using the PHP SDK available from Amazon.


So far I have been able to script the conversion so that I look up records in my database, grab the file path, name the file appropriately, and send it to Amazon.

    //upload to S3
    $s3->create_object($bucket, $folder.$file_name_new, array(
        'fileUpload' => $file_temp,
        'acl' => AmazonS3::ACL_PRIVATE, //access denied, grantee only owner
        //'acl' => AmazonS3::ACL_PUBLIC, //image displayed
        //'acl' => AmazonS3::ACL_OPEN, //image displayed, grantee everyone has open permission
        //'acl' => AmazonS3::ACL_AUTH_READ, //image not displayed, grantee auth users have open permissions
        //'acl' => AmazonS3::ACL_OWNER_READ, //image not displayed, grantee only ryan
        //'acl' => AmazonS3::ACL_OWNER_FULL_CONTROL, //image not displayed, grantee only ryan
        'storage' => AmazonS3::STORAGE_REDUCED
    ));

Before I copy everything over, I have created a simple form to test uploading and displaying an image. If I upload an image with ACL_PRIVATE, I can either grab the public URL (and get access denied) or grab the URL with a temporary key and display the image.

<?php
//display the image link
$temp_link = $s3->get_object_url($bucket, $folder.$file_name_new, '1 minute');
?>
<a href='<?php echo $temp_link; ?>'><?php echo $temp_link; ?></a><br />
<img src='<?php echo $temp_link; ?>' alt='finding image' /><br />

Using this method, how will caching work? I'm guessing that every time I refresh the page, or modify one of my records, I will be pulling that image again, increasing my GET requests.

I have also considered using bucket policies to only allow image retrieval from certain referrers. Do I understand correctly that Amazon is supposed to only serve requests that originate from pages or domains I specify?

I referenced https://forums.aws.amazon.com/thread.jspa?messageID=188183&#188183 to set that up, but I am confused as to which permissions I need on my objects. It seemed that if I made them private they still would not display, unless I used the temporary link as mentioned above. If I made them public, I could navigate to them directly, regardless of referrer.

Am I way off base with what I'm trying to do here? Is this not really supported by S3, or am I missing something simple? I have gone through the SDK documentation and done a lot of searching, and I feel this should be a little more clearly documented, so hopefully any input here can help others in the same situation. I've read about others who name the file with a unique ID, creating security through obscurity, but that won't cut it in my situation, and it's probably not best practice for anyone trying to be secure.


The best way to serve your images is to generate a URL using the PHP SDK. That way the downloads go directly from S3 to your users.

You don't need to download via your servers as @mfonda suggested; you can set whatever caching headers you like on the S3 objects themselves, and proxying the downloads yourself would lose you some of the major benefits of using S3.
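For example, if I remember the 1.x SDK correctly, create_object() takes a 'headers' option for standard HTTP headers, so you can set Cache-Control at upload time (a sketch, so verify the option name against your SDK version):

    //upload to S3 with a long-lived Cache-Control header
    $s3->create_object($bucket, $folder.$file_name_new, array(
        'fileUpload' => $file_temp,
        'acl'        => AmazonS3::ACL_PRIVATE,
        'storage'    => AmazonS3::STORAGE_REDUCED,
        'headers'    => array(
            'Cache-Control' => 'max-age=2592000, private', // ~30 days
        ),
    ));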

However, as you pointed out in your question, the URL will always be changing (actually the query string), so browsers won't cache the file. The easy workaround is simply to always use the same expiry date so that the same query string is always generated. Better still, 'cache' the URL yourself (e.g. in the database) and reuse it every time.

You'll obviously have to set the expiry time somewhere far in the future, but you can regenerate these URLs every so often if you prefer. For example, in your database you would store the generated URL and the expiry date (you could parse that from the URL too). Then you either reuse the existing URL or, if the expiry date has passed, generate a new one.
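A minimal sketch of that approach, using the same 1.x SDK call as in your question (the image_urls table, its columns, and the PDO handle $db are made up for illustration):

    // Reuse a cached pre-signed URL; regenerate it only once it has expired.
    // Table/column names (image_urls, object_key, url, expires_at) are hypothetical.
    function get_image_url($s3, $db, $bucket, $object_key)
    {
        $stmt = $db->prepare('SELECT url, expires_at FROM image_urls WHERE object_key = ?');
        $stmt->execute(array($object_key));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);

        if ($row && strtotime($row['expires_at']) > time()) {
            return $row['url']; // same query string as before, so the browser cache still applies
        }

        // Generate a fresh pre-signed URL, valid for roughly 30 days
        $expires_at = time() + 30 * 24 * 60 * 60;
        $url = $s3->get_object_url($bucket, $object_key, '30 days');

        $stmt = $db->prepare('REPLACE INTO image_urls (object_key, url, expires_at) VALUES (?, ?, ?)');
        $stmt->execute(array($object_key, $url, date('Y-m-d H:i:s', $expires_at)));

        return $url;
    }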


You can use bucket policies in your Amazon bucket to allow your application's domain to access the files. In fact, you can even add your local dev domain (e.g. mylocaldomain.local) to the access list and you will be able to retrieve your images. Amazon provides sample bucket policies here: http://docs.aws.amazon.com/AmazonS3/latest/dev/AccessPolicyLanguage_UseCases_s3_a.html. This was very helpful in getting my images served.

The policy below solved the problem that brought me to this SO topic:

    {
       "Version":"2008-10-17",
       "Id":"http referer policy example",
       "Statement":[
          {
             "Sid":"Allow get requests originated from www.example.com and example.com",
             "Effect":"Allow",
             "Principal":"*",
             "Action":"s3:GetObject",
             "Resource":"arn:aws:s3:::examplebucket/*",
             "Condition":{
                "StringLike":{
                   "aws:Referer":[
                      "http://www.example.com/*",
                      "http://example.com/*"
                   ]
                }
             }
          }
       ]
    }


When you talk about security and protecting data from unauthorized users, one thing is clear: you have to check, every time that resource is accessed, that the requester is entitled to it.

That means that generating a URL which can be accessed by anyone (it might be difficult to obtain, but still...) is not enough. The only real solution is an image proxy. You can do that with a PHP script.

There is a fine article on Amazon's blog that suggests using readfile: http://blogs.aws.amazon.com/php/post/Tx2C4WJBMSMW68A/Streaming-Amazon-S3-Objects-From-a-Web-Server

readfile('s3://my-bucket/my-images/php.gif');
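Fleshed out a little (a sketch based on that article, which uses the s3:// stream wrapper from version 2 of the AWS SDK for PHP; the credentials, bucket and key below are placeholders):

    // image.php: proxy a private S3 object through your own server
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $client = S3Client::factory(array(
        'key'    => 'YOUR-ACCESS-KEY',   // or omit and rely on env/instance credentials
        'secret' => 'YOUR-SECRET-KEY',
    ));
    $client->registerStreamWrapper();    // makes s3:// paths usable by readfile() etc.

    // ...your own authorization check for the logged-in user goes here...

    header('Content-Type: image/gif');
    readfile('s3://my-bucket/my-images/php.gif');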


You can download the contents from S3 (in a PHP script), then serve them using the correct headers.

As a rough example, say you had the following in image.php:

// AWS SDK for PHP 1.x (adjust the path to wherever the SDK lives)
require_once 'sdk.class.php';

$s3 = new AmazonS3();

// $bucket and $image_name identify the protected object to stream
$response = $s3->get_object($bucket, $image_name);
if (!$response->isOK()) {
    throw new Exception('Error downloading file from S3');
}

// Serve the bytes directly, so the S3 URL is never exposed to the browser
header("Content-Type: image/jpeg");
header("Content-Length: " . strlen($response->body));
die($response->body);

Then in your HTML code, you can do

<img src="image.php">
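In practice you would probably pass the object key to image.php and authorize the request before the get_object() call above. A rough sketch (the file query parameter and the can_view_image() helper are hypothetical):

    <img src="image.php?file=<?php echo urlencode($file_name_new); ?>">

and at the top of image.php:

    // Hypothetical authorization step before calling get_object()
    $image_name = basename($_GET['file']);                    // strip any path components
    if (!can_view_image($_SESSION['user_id'], $image_name)) { // your own access check
        header('HTTP/1.1 403 Forbidden');
        exit;
    }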
