How and where do I add a robots.txt file to an ASP.NET web application?
I am using ASP.NET with C#.
To improve the visibility of my site in Google, I have searched and found out that I can do this with a robots.txt file, but I really don't have any idea how to create it, or where to place keywords like ASP.NET and C# in the txt file.
Also, please let me know the necessary steps to include it in my application.
robots.txt is a text file in the root folder that sets certain rules for search robots, mainly which folders they may access and which not. You can read more about it here: http://www.robotstxt.org/robotstxt.html
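For illustration, a minimal robots.txt applying such rules might look like the sketch below (the /js/ and /admin/ paths are hypothetical examples, not folders your site necessarily has):

```text
# Rules for all crawlers
User-agent: *
# Keep crawlers out of these example folders
Disallow: /js/
Disallow: /admin/
# Anything not disallowed is crawlable by default
```

Save it as robots.txt (plain text) in the root of the site, so it is reachable at http://yoursite.com/robots.txt.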
The robots.txt file is placed at the root of your website and is used to control where search spiders are allowed to go, e.g., you may not want them in your /js folder. As usual, Wikipedia has a great write-up.
I think you may find Sitemaps more useful, though. This is an XML file you produce that represents the content of your site; you then push it to the main search engines. Although the format was started by Google, all the main search engines have now agreed to follow a standard schema.
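As a rough sketch of that standard schema, a minimal sitemap.xml could look like this (the URL and dates are made-up placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Standard sitemap schema agreed on by the major search engines -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want indexed -->
    <loc>http://www.example.com/</loc>
    <lastmod>2011-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Place it at the root of the site (e.g. http://www.example.com/sitemap.xml) and submit it via each search engine's webmaster tools.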
Increasing your Google ranking, and SEO in general, isn't something I know much about. It sounds like a black art to me :) Check out the IIS SEO Toolkit, though; it may offer some pointers.
Most search engines will index your site unless a robots.txt tells them not to. In other words, robots.txt is generally used to exclude robots from your site.