Can I add sites/default/files in robots.txt?
Customers on my site sometimes upload files to nodes, and these files are stored in sites/default/files. Ideally those files would stay hidden from the world, so I was shocked one day to find Google listing one private file in its search results.
So can I add this directory to robots.txt? Will it affect anything else? At least then search engines would not expose those files to the world!
If you add the uploads directory to robots.txt, Google will skip it. Keep in mind that robots.txt only asks well-behaved crawlers not to fetch those paths; it does not actually block access to the files, so anyone with the direct URL can still download them.
This is how you add entries to robots.txt:
User-agent: *
Disallow: /folder/
Disallow: /folder/file.htm
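
For your case, assuming the files live under the standard Drupal path sites/default/files, a sketch of the entry would look like this:

User-agent: *
# Ask crawlers to stay out of the public files directory
# (path assumed to be the Drupal default; adjust if your site uses a different files directory)
Disallow: /sites/default/files/

After changing robots.txt you can check it with Google Search Console's robots.txt tester; already-indexed files may take a while to drop out of the results.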