
Max number of links per page

Had a conversation about sitemaps with someone from marketing. They stated that a single page shouldn't have more than 100 links because Google will not follow more than 100 links when crawling a page. I had not heard of this limit before.

I did some searching and found that Google's Webmaster Guidelines used to state "keep the links on a given page to a reasonable number (fewer than 100)" (circa 2008). The guidelines now just state "keep the links on a given page to a reasonable number."

When engineering a sitemap architecture for a site of, say, 1,000 pages (or a link list on any page, for that matter), would it be acceptable to place all 1,000 links on a single sitemap page, or should multiple sitemaps be used?

Also, does submitting an XML sitemap nullify the importance of an HTML sitemap to Google's spider? If so, I would imagine placing only important links on the HTML sitemap, rather than a link to every page, to tailor the page to end-user usability.


Depends on whether you're referring to sitemaps targeted towards users (which is what @adrian-k answered about) or sitemaps targeted towards robots (i.e. search engines).

If it's the second kind, then the answer is: you can (and probably should) have several thousand links per sitemap page. It also pays to make life easier for crawlers by including 'lastmod' values for your pages and by gzipping the sitemap file itself.
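For illustration, here's a minimal sitemap in the sitemaps.org format with a 'lastmod' value per URL (the example.com URLs and dates are placeholders); you'd typically serve a file like this gzipped, e.g. as sitemap.xml.gz:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/page-1.html</loc>
        <lastmod>2011-06-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/page-2.html</loc>
        <lastmod>2011-05-15</lastmod>
      </url>
      <!-- one <url> entry per page; the protocol allows up to 50,000 URLs per file -->
    </urlset>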

For information on valid formats for such sitemaps, see http://www.sitemaps.org/protocol.php

Just to validate, take a look at what the big boys are doing. In most cases you'll find a reference to the sitemap page at the bottom of /robots.txt. For instance, http://www.linkedin.com/robots.txt or https://profiles.google.com/robots.txt
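The reference itself is just a 'Sitemap:' line in robots.txt, per the sitemaps.org autodiscovery convention (the URL below is a placeholder):

    Sitemap: http://www.example.com/sitemap_index.xml.gz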

LinkedIn's sitemap index, at http://partner.linkedin.com/sitemaps/smindex.xml.gz, lists 2630 gzipped mini-sitemaps:

    curl http://partner.linkedin.com/sitemaps/smindex.xml.gz | gunzip | wc -l

Google's Profiles sitemap (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml) lists 7104 such pages:

    curl http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml | grep '<loc>' | wc -l
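What both of those files are is a sitemap index: a sitemap of sitemaps, again in the sitemaps.org format. A sketch of what such an index looks like (placeholder URLs):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemaps/sitemap-001.xml.gz</loc>
        <lastmod>2011-06-01</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemaps/sitemap-002.xml.gz</loc>
        <lastmod>2011-06-01</lastmod>
      </sitemap>
    </sitemapindex>

For the 1,000-page site in the question, a single sitemap file is well under the protocol's 50,000-URL-per-file limit, so an index is optional; it becomes useful once you need to split across multiple files.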

Poke around the websites of SEO-aware players in your industry and you should find more examples (or discover that you can beat them hands down with this knowledge).


I would say a site map is there for users, not search engines, so yes, it's acceptable (though it might still present usability challenges).

A site map lays out a site in such a way that a person can quickly understand its structure and content, and helps them get to what they want.

Saying that a search engine needs to be able to 'digest' an entire site map suggests that some content is only accessible via the site map, which should not be the case.
