
Robots.txt: Disallow subdirectory but allow directory

I want to allow crawling of files in:

/directory/

but not crawling of files in:

/directory/subdirectory/

Is the correct robots.txt instruction:

User-agent: *
Disallow: /subdirectory/

I'm afraid that if I disallowed /directory/subdirectory/, I would also be disallowing crawling of all files in /directory/, which I do not want to do. So am I correct in using:

User-agent: *
Disallow: /subdirectory/


You're overthinking it:

User-agent: *
Disallow: /directory/subdirectory/

is correct.


User-agent: *
Disallow: /directory/subdirectory/

Spiders aren't stupid; they can parse a path :)
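
If you want to confirm the behavior yourself, here's a minimal sketch using Python's standard-library urllib.robotparser to check how the rule is interpreted; the example.com URLs are placeholders, not part of the original question.

import urllib.robotparser

# The exact rules from the accepted answer, fed to the parser as lines.
rules = """\
User-agent: *
Disallow: /directory/subdirectory/
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Files directly in /directory/ remain crawlable.
print(rp.can_fetch("*", "https://example.com/directory/page.html"))       # True
# Files under /directory/subdirectory/ are blocked.
print(rp.can_fetch("*", "https://example.com/directory/subdirectory/x"))  # False

Because Disallow matches by path prefix, only URLs whose path starts with /directory/subdirectory/ are excluded; everything else under /directory/ stays crawlable.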
