Not indexing parts of HTML
Is there any method for limiting the indexing of HTML to increase the page content relevancy, e.g. excluding menus and other boilerplate from robots? I remember seeing special tags for this a long time ago, but I can't find the information anymore.
How do search engines such as Google and Bing support such methods?
The only hint I've found was: "Russian search engine Yandex introduced a new tag which only prevents indexing of the content between the tags, not a whole Web page."
http://en.wikipedia.org/wiki/Noindex#Russian_version
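Based on the Wikipedia description above, a minimal sketch of how that Yandex-specific tag is used; since `<noindex>` is not part of standard HTML, there is also a comment-based form that keeps the page valid, and other search engines simply ignore both:

```html
<p>This paragraph is indexed normally.</p>

<!-- Yandex-specific element; not valid HTML, ignored by other engines -->
<noindex>
  <div class="sidebar-menu">Menu links that should not affect relevancy</div>
</noindex>

<!-- Comment-based form: same effect for Yandex, but the page stays valid -->
<!--noindex-->
<div class="ads">Advertising block</div>
<!--/noindex-->
```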
Put the junk at the bottom of the page. Beyond that, there's not much you can do. Serve the same content to search spiders and browsers, and they'll work it out. There's no magic markup (that search engines honor) which does what you want, though by all means use the `role` attribute and `<nav>` if you like. Menus can often go after the main content in the source order, and that anecdotally helps somewhat.