Are there any good tools to generate a Google Sitemap? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.

Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.

Closed 8 years ago.

Can you recommend any tools? Should we build our own? Should we create the sitemap manually?


The Google Sitemap Generator for IIS generates a sitemap based on actual HTTP requests to your server. Unlike other sitemap generators that rely on a crawlable path from the homepage, Google's tool doesn't crawl your site at all.

It is uniquely suited to dynamic applications, particularly those with a deep store of data that is surfaced only through user queries.


I have personally used Google's sitemapgen, a Python script that automatically generates a sitemap from an XML configuration file and a URL list.

There also seems to be a newer tool called googlesitemapgenerator, which according to its website supports more formats:

Google previously released sitemapgen, a Python-based tool, to Sourceforge. In comparison to sitemapgen, Google Sitemap Generator is a next-generation tool that relies on web server filtering rather than crawling, provides enhanced features, and supports more formats.


I have always used XML-Sitemap. It's an online service, though, not a standalone application.


I would recommend building your own if you have the ability. A sitemap should include all the files you want crawled, and that is not always every file on the site. An automated downloadable script will likely require a fair amount of configuration to exclude content you don't want listed in the sitemap. Unless you want every file on the site spidered, in which case one of the scripts listed here may be a good option. I typically put more effort into SEO, and details like controlling which pages are submitted, and how, matter to me.
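If you do build your own, the output format is simple. Here is a minimal sketch of a hand-rolled generator; the `build_sitemap` function and the example URLs are hypothetical, and in a real site you would pull the URL list from your routing table or database rather than hard-code it:

```python
# Minimal "build your own" sitemap sketch: emit sitemap-protocol XML
# for a list of (loc, lastmod) pairs. URL data here is illustrative.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap XML for an iterable of (loc, lastmod) pairs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod in urls:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(loc))   # escape &, <, >
        lines.append('    <lastmod>%s</lastmod>' % lastmod)
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

if __name__ == '__main__':
    print(build_sitemap([('https://example.com/', '2010-01-01')]))
```

Because you control which URLs go into the list, excluding pages you don't want indexed is just a filter on your own data, with no crawler configuration needed.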


The big question is: how big is your site?

- Under 3,000 pages: you probably don't need a sitemap at all, as long as every page is linked from within your site.
- Under 50,000 pages: you can use one of the many scripts available on the internet.
- Over 50,000 pages: you should build your own sitemap.xml, because at that scale you are in the distribution side of SEO, where you need absolute control over your site and over what you communicate to Google (and when). That control lets you reason like: "I submitted 25,000 pages to Google, it crawled 99% of them and indexed 30% (according to Google Webmaster Tools), and I get X visits from them; now let's add another 25,000," and so on.
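The 50,000 figure isn't arbitrary: the sitemap protocol caps each sitemap file at 50,000 URLs, so larger sites must split their URLs across multiple files tied together by a sitemap index. A rough sketch of that split (the file names and base URL are hypothetical):

```python
# Sketch of splitting a large URL list into sitemap-protocol-sized
# chunks plus a sitemap index file referencing each chunk.
MAX_URLS = 50000  # per-file limit in the sitemap protocol

def chunk(urls, size=MAX_URLS):
    """Yield successive size-limited slices of a URL list."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def build_index(base_url, n_files):
    """Return sitemap-index XML pointing at n_files chunk sitemaps."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for i in range(n_files):
        lines.append('  <sitemap><loc>%s/sitemap-%d.xml</loc></sitemap>'
                     % (base_url, i))
    lines.append('</sitemapindex>')
    return '\n'.join(lines)
```

Keeping each chunk in its own file also makes it easier to track, per file, how much Google has crawled and indexed, which is exactly the kind of control described above.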


GSiteCrawler is one I've used in the past, and it has served me well.
