Googlebot is accessing .aspx pages, it should access SEO-friendly URLs only
Googlebot is accessing .aspx pages on my website, e.g. http://mysite.com/thepage.aspx?id=32.
I have used the Intelligencia URL-rewriting module for SEO-friendly URLs, and my website's links also use the friendly (extensionless) URLs, e.g. http://mysite.com/thepage/32.
However, Googlebot is still accessing the original URLs (http://mysite.com/thepage.aspx?id=32).
In the robots.txt file I have placed the following rule:
disallow: *.aspx
My question is whether this will hide all pages with the .aspx extension as well as the friendly URLs (which point to the same .aspx pages), or only the URLs with the .aspx extension.
Summary:
Googlebot is accessing the same page via two URLs:
- the original path, e.g. somesite/thepage.aspx?id=xx
- the friendly URL, e.g. somesite/somepage/xx
I want Googlebot to access only the friendly URLs. I am using ASP.NET 2.0, and the friendly URLs are mapped in web.config using the Intelligencia DLL.
Code in web.config with the Intelligencia module:
<rewriter>
  <rewrite url="/category/(.+)" to="/categoryPage.aspx?Id=$1" />
</rewriter>
I think your URL mapping is incomplete. Your .aspx URLs are supposed to be mapped too, so that requests for the old URLs are sent to the friendly ones.
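One way to do that, assuming the Intelligencia UrlRewriter.NET module's `<redirect>` rule (an assumption; check the module's documentation for the exact attribute names), is to permanently redirect the old query-string URLs to their friendly equivalents, so crawlers and visitors converge on a single URL. A minimal sketch, reusing the category example from the question:

```xml
<rewriter>
  <!-- Assumed: <redirect> with permanent="true" issues an HTTP 301
       for requests that still arrive on the old .aspx URL. -->
  <redirect url="~/categoryPage\.aspx\?Id=(\d+)" to="~/category/$1" permanent="true" />
  <!-- The friendly URL is then rewritten internally to the real page.
       The internal rewrite does not go back through the module, so it
       should not re-trigger the redirect above. -->
  <rewrite url="~/category/(\d+)" to="~/categoryPage.aspx?Id=$1" />
</rewriter>
```

With a 301 in place, Google should drop the .aspx variants from its index over time in favor of the friendly URLs.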
I think there are typos in your Disallow statement - it should read Disallow: /*.aspx$ - see this article http://www.google.com/support/webmasters/bin/answer.py?answer=156449 on Google Webmaster Tools and scroll down to the section on pattern matching.
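Per that pattern-matching section, Googlebot treats `*` as a wildcard and `$` as an end-of-URL anchor. A sketch of a robots.txt that blocks the .aspx URLs while leaving the friendly URLs crawlable:

```
User-agent: Googlebot
Disallow: /*.aspx$
Disallow: /*.aspx?
```

The `$`-anchored rule only matches URLs that end in .aspx, so it would not catch /thepage.aspx?id=32; the second rule covers the .aspx URLs that carry a query string. Note that robots.txt only stops crawling - already-indexed .aspx URLs may linger until Google recrawls and drops them.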