How should I handle pages that move to a new URL, with regard to search engines?
I have done some refactoring on an ASP.NET MVC application already deployed to a live web site. Among the refactorings was moving functionality to a new controller, causing some URLs to change. Shortly after, the various search engine robots started hammering the old URLs.
What is the right way to handle this in general?
- Ignore it? In time the search engines should find out that they get nothing but 404s from the old URLs.
- Block old urls with robots.txt?
- Continue to catch the old URLs and redirect to the new ones? Users navigating the site would never hit the redirects, since the URLs are updated throughout the new version of the site. It feels like dead code to me - unless it could be handled by some fancy routing?
- Other?
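To make the third option concrete, the "fancy routing" I have in mind would be something like mapping the old URL pattern onto a single legacy controller that does nothing but redirect (the controller and route names below are just placeholders, and the permanent-redirect helper assumes MVC 3 or later):

```csharp
using System.Web.Mvc;
using System.Web.Routing;

// In Global.asax.cs / RegisterRoutes: send every old /Account/* URL to one
// legacy controller instead of scattering redirects through the codebase.
// routes.MapRoute(
//     "LegacyAccount",
//     "Account/{action}/{id}",
//     new { controller = "Legacy", id = UrlParameter.Optional }
// );

public class LegacyController : Controller
{
    // Old URL: /Account/Edit/5  ->  new URL: /Profile/Edit/5 (names hypothetical)
    public ActionResult Edit(int id)
    {
        return RedirectToActionPermanent("Edit", "Profile", new { id });
    }
}
```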
As always, all comments welcome...
Thanks, Anders, Denmark
Here is a very good article on this exact subject: http://www.codeproject.com/KB/aspnet/webformmvcharmony.aspx. There is a section called "Handling Legacy URLs". The beauty of the approach is that existing users who have bookmarked the old URL can still use their old links: a redirect is sent to their browser with a "301 Moved Permanently" status code, which tells the browser (and search engines) that the page has moved for good. Whatever the client does with that code, the user will end up on the new MVC version of your page.
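For reference, ASP.NET MVC 3 ships permanent-redirect helpers (`RedirectPermanent`, `RedirectToActionPermanent`). On earlier versions you can get the same 301 with a small custom `ActionResult`; this is a minimal sketch, and the class name is my own invention:

```csharp
using System.Web.Mvc;

// Minimal ActionResult that answers "301 Moved Permanently" with a
// Location header. Only needed before MVC 3, which added an overload
// RedirectResult(url, permanent: true) that does the same thing.
public class PermanentRedirectResult : ActionResult
{
    private readonly string _url;

    public PermanentRedirectResult(string url)
    {
        _url = url;
    }

    public override void ExecuteResult(ControllerContext context)
    {
        var response = context.HttpContext.Response;
        response.StatusCode = 301;        // Moved Permanently
        response.RedirectLocation = _url; // becomes the Location header
        response.End();
    }
}
```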
The answer depends on how important usability and SEO are to you and your site. I have handled this by adding the old action methods back in and having them issue 301 redirects to the new URLs. You might also consider re-submitting your sitemap to the search engines if you are concerned with SEO.
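A sketch of what "adding the old action methods back in" can look like - the controller and action names here are hypothetical, and `RedirectToActionPermanent` assumes MVC 3 or later:

```csharp
using System.Web.Mvc;

// The controller the functionality moved out of, kept only as a shim:
// each old action issues a 301 pointing at the new location.
public class CatalogController : Controller
{
    // Old URL: /Catalog/Details/5  ->  new URL: /Product/Details/5
    public ActionResult Details(int id)
    {
        return RedirectToActionPermanent("Details", "Product", new { id });
    }
}
```

Search engines treat the 301 as "update your index to the new URL", so the old entries fade out without ever serving users an error page.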