
Robots.txt is not being served

I am using Python with the Django framework for a web application. I have added a urls.py entry to serve the robots.txt file, but for some reason it does not get served.

url(r'^robots.txt$', 'support.views.robot_file',name="robot_file"),

This approach works for sitemap.xml, which has a very similar entry:

url(r'^sitemap.xml', 'support.views.sitemap_file', name="sitemap_file"),

This leads me to believe that serving the robots.txt file specifically is the problem, because when I change the pattern to serve robot.txt instead, it works.

Could somebody throw some pointers my way as to why this might be happening?


You seem to be using Apache - there's probably something in your Apache config that is breaking robots.txt - perhaps an "Alias /robots.txt /somewhere/that/doesn't/exist.txt".
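
For example, a stanza like this somewhere in httpd.conf or a vhost file (paths purely hypothetical) would intercept the URL before the request ever reaches Django:

# Hypothetical directive that would shadow Django's robots.txt pattern
Alias /robots.txt /var/www/old-site/robots.txt

If the target file doesn't exist, Apache returns its own 404 and Django never sees the request - grep your Apache config for "robots" to rule this out.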


You misspelled robots.txt. If that is a copy-paste, that is your problem.

It could also be that your web-server configuration is trying to serve robots.txt from somewhere else. What specific error do you receive when you request robots.txt?
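
One quick way to see who is actually answering the request (assuming curl is available; substitute your own host):

curl -I http://example.com/robots.txt

A file served straight off disk by the webserver usually carries Last-Modified and ETag headers, while a response generated by a Django view typically does not, which helps tell the two apart.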

Also, format your code using Ctrl-K or the code button in the WYSIWYG editor.


My working solution looks like the following:

(r'^robots.txt$', 'my_project.views.robots'),

my view:

from django.template import RequestContext
from django.shortcuts import render_to_response

def robots(request):
    template = 'robots.txt'
    context = {}
    # mimetype keeps the response text/plain instead of the default text/html
    return render_to_response(template, context,
                              context_instance=RequestContext(request),
                              mimetype='text/plain')
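
If the rules never change per request, a leaner variation (just a sketch, assuming a completely static robots.txt) skips the template layer entirely:

from django.http import HttpResponse

def robots(request):
    # Hard-coded rules; sufficient when robots.txt is completely static
    return HttpResponse("User-agent: *\nDisallow:\n", content_type='text/plain')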


I think it's best for static files like this to be served by the webserver itself. In my opinion, routing them through Django just creates unneeded overhead.
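
For example, with Apache you could map the URL straight to a file on disk (path hypothetical) and leave Django out of it:

Alias /robots.txt /srv/mysite/static/robots.txt

The webserver then answers the request itself and Django is never invoked.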
