Prevent Safari from Caching Top Sites
I have found what I think is a significant security flaw in Safari's Top Sites image cache. For those who don't know, Safari takes a snapshot of every page you visit, which is supposed to be somehow useful to the user. My problem is that it even takes snapshots of parts of my site that are password protected. So if someone got hold of one of my users' computers and knew where to look, they'd have a whole repository of content that should, in theory, be for that user's eyes only.
I tested this "feature" against other sites and found that sites such as Gmail and Hotmail don't get cached. Well, only the Hotmail login page gets cached, but nothing beyond that, and none of Gmail gets cached at all. So my question is this: what can I do on my end to prevent this from happening? I've already prevented normal caching in Firefox, IE, Opera, etc. by using
header('Cache-Control: no-store, no-cache');
along with Pragma: no-cache and every other trick in the book to stop a page from being cached. What gives?
UPDATE: For anyone reading this: This is still an unresolved issue. I even contacted the Safari dev team and they just gave me the run around.
@Kieran Allen
This is the result of the code you asked me to include:
Headers sent
array(7) {
[0]=>
string(23) "X-Powered-By: PHP/5.2.2"
[1]=>
string(38) "Expires: Mon, 26 Jul 1997 05:00:00 GMT"
[2]=>
string(50) "Cache-Control: no-store, no-cache, must-revalidate"
[3]=>
string(16) "Pragma: no-cache"
[4]=>
string(40) "Cache-Control: post-check=0, pre-check=0"
[5]=>
string(44) "Last-Modified: Wed, 14 Jul 2010 09:32:56 GMT"
[6]=>
string(23) "Content-type: text/html"
}
My current theory is that Safari disables the snapshot on secure websites using https.
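If that theory holds, forcing the protected area onto HTTPS is one possible mitigation. A minimal PHP sketch (the `isSecure()` helper is my own name, not a built-in):

```php
<?php
// Hypothetical helper: report whether the current request arrived over HTTPS.
function isSecure(array $server): bool
{
    return (!empty($server['HTTPS']) && $server['HTTPS'] !== 'off')
        || (isset($server['SERVER_PORT']) && (int) $server['SERVER_PORT'] === 443);
}

// Redirect plain-HTTP requests to HTTPS before any output is sent.
// (Guarded so the sketch is harmless when run from the command line.)
if (PHP_SAPI !== 'cli' && !isSecure($_SERVER)) {
    header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}
```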
(I know it's not the correct protocol to post this as an answer to my own question, but when I asked it I was not a registered member. I registered within the last hour and thought I would be able to claim this question as my own, but alas, Stack Overflow does not allow it.)
Detecting Safari's preview request is all you need:
if (isset($_SERVER['HTTP_X_PURPOSE']) && $_SERVER['HTTP_X_PURPOSE'] === 'preview') {
    // Do something for Safari Top Sites
} else {
    // Do something for all other browsers
}
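Wrapped as a small helper (a sketch; Safari sends an `X-Purpose: preview` request header when fetching Top Sites snapshots), protected pages can serve a placeholder instead of real content:

```php
<?php
// Return true when the request looks like a Safari Top Sites preview fetch
// (Safari sends an "X-Purpose: preview" request header for these snapshots).
function isTopSitesPreview(array $server): bool
{
    return isset($server['HTTP_X_PURPOSE'])
        && strtolower($server['HTTP_X_PURPOSE']) === 'preview';
}

if (isTopSitesPreview($_SERVER)) {
    // Serve a harmless placeholder so the snapshot never shows private data.
    echo '<h1>Preview not available</h1>';
    exit;
}
```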
Have you tried appending a random query string?
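For completeness, that usually means tacking a unique value onto each URL so every request looks new to the cache; a sketch (the `cacheBust` name and `nocache` parameter are my own choices):

```php
<?php
// Append a unique cache-busting parameter to a URL.
function cacheBust(string $url, string $token): string
{
    $sep = (strpos($url, '?') === false) ? '?' : '&';
    return $url . $sep . 'nocache=' . rawurlencode($token);
}

// e.g. cacheBust('/account/inbox.php', uniqid())
```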
Headers which should work (all together):
// A date far in the past
header('Expires: Mon, 26 Jul 1997 05:00:00 GMT');
// Dynamic Last-Modified date
header('Last-Modified: ' . gmdate('D, d M Y H:i:s') . ' GMT');
// HTTP/1.0
header('Pragma: no-cache');
// HTTP/1.1
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Cache-Control: post-check=0, pre-check=0', false);
EDIT:
For debugging, put this AFTER you send the above headers:
if (headers_sent()) {
    echo '<h1>Headers sent</h1>';
} else {
    echo '<h1>Headers not sent</h1>';
}
echo '<pre>';
var_dump(headers_list());
echo '</pre>';
Can you edit your post with the output of the above?
Thanks!
Sending two Cache-Control headers probably isn't helping; try merging them into one. I'd also recommend a 'Vary: Cookie' header as good practice for authenticated pages.
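Concretely, the merged set might look like this (a sketch; `post-check`/`pre-check` are IE-specific extensions and can live in the same Cache-Control line as the standard directives):

```php
<?php
// One merged Cache-Control line plus Vary: Cookie for authenticated pages.
function noCacheHeaders(): array
{
    return [
        'Expires: Mon, 26 Jul 1997 05:00:00 GMT',
        'Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0',
        'Pragma: no-cache',
        'Vary: Cookie',
    ];
}

foreach (noCacheHeaders() as $h) {
    if (PHP_SAPI !== 'cli') { // header() only matters under a web SAPI
        header($h);
    }
}
```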
Have you looked at the headers you get from the Hotmail and Google pages that you say are not cached?
C.