
Web crawlers and non-ASCII characters in sitemap.xml

One of our sites has non-ASCII (non-English) characters in its URLs:

http://example.com/kb/начало-работы/оплата

I wonder how web crawlers (particularly Googlebot) handle these URLs. Do they have to be encoded or otherwise processed before going into sitemap.xml?


I think it is best to URL-encode them, i.e. percent-encode the UTF-8 bytes of the non-ASCII characters. That is the standard form for URLs.
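As a minimal sketch of what that looks like in practice (the helper name encode_url is just for illustration, not something from the question), you could percent-encode the path before writing it into the sitemap:

```python
# Sketch: percent-encode the non-ASCII parts of a URL as UTF-8
# before placing it in sitemap.xml.
from urllib.parse import quote, urlsplit, urlunsplit

def encode_url(url: str) -> str:
    """Percent-encode the path and query of a URL, keeping '/' and
    query separators intact."""
    parts = urlsplit(url)
    path = quote(parts.path, safe="/")        # UTF-8 bytes -> %XX escapes
    query = quote(parts.query, safe="=&")
    return urlunsplit((parts.scheme, parts.netloc, path, query, parts.fragment))

if __name__ == "__main__":
    print(encode_url("http://example.com/kb/начало-работы/оплата"))
    # http://example.com/kb/%D0%BD%D0%B0%D1%87%D0%B0%D0%BB%D0%BE-... /%D0%BE%D0%BF%D0%BB%D0%B0%D1%82%D0%B0
```

Both the readable (IRI) form and the percent-encoded form refer to the same resource; the encoded form is simply the ASCII-safe representation.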
