Robots.txt: Disallow subdirectory but allow directory
I want to allow crawling of files in:
/directory/
but not crawling of files in:
/directory/subdirectory/
Is the correct robots.txt instruction:
User-agent: *
Disallow: /subdirectory/
I'm afraid that if I disallowed /directory/subdirectory/, I would be disallowing crawling of all files in /directory/, which I do not want to do. So am I correct in using:
User-agent: *
Disallow: /subdirectory/
You're overthinking it:
User-agent: *
Disallow: /directory/subdirectory/
is correct.
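If you want to sanity-check the rule before deploying it, here is a minimal sketch using Python's standard urllib.robotparser; the example.com host and page names are placeholders for illustration, not paths from the question.

from urllib import robotparser

# The proposed robots.txt rules, parsed directly from a string.
rules = """
User-agent: *
Disallow: /directory/subdirectory/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Files directly under /directory/ remain crawlable...
print(rp.can_fetch("*", "https://example.com/directory/page.html"))               # True
# ...while anything under /directory/subdirectory/ is blocked.
print(rp.can_fetch("*", "https://example.com/directory/subdirectory/page.html"))  # False

Running this prints True for the page in /directory/ and False for the page in /directory/subdirectory/, which is exactly the behavior you want.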
User-agent: *
Disallow: /directory/subdirectory/
Spiders aren't stupid, they can parse a path :)