Django: serving robots.txt efficiently

Here is my current method of serving robots.txt

url(r'^robots\.txt/$', TemplateView.as_view(template_name='robots.txt',
                                            content_type='text/plain')),

I don't think this is the best way. It would be better if it were just a pure static resource, served statically. But the way my Django app is structured, the static root and all static files live under

http://my.domain.com/static/stuff-here

Any thoughts? I'm an amateur at Django, but

 TemplateView.as_view(template_name='robots.txt',
                      content_type='text/plain')

looks a lot more resource-consuming than a plain static file served directly by nginx.
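Since nginx is already fronting the app, one option is to let nginx serve the file itself so the request never reaches Django at all. A minimal sketch, assuming an nginx server block; the file path is a placeholder:

```nginx
# Hypothetical nginx config: exact-match location serves robots.txt
# straight from disk, bypassing the Django upstream entirely.
location = /robots.txt {
    alias /path/to/static/robots.txt;
}
```

With an exact-match `location =` block, nginx answers `/robots.txt` directly without proxying to the application server.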



Solution 1:[1]

I know this is a late reply; I was looking for a similar solution when I didn't have access to the web server config. For anyone else in the same situation, I found this page: http://www.techstricks.com/adding-robots-txt-to-your-django-project/

which suggests adding this to your project's urls.py:

from django.conf.urls import url
from django.http import HttpResponse

urlpatterns = [
    # .... your project urls
    # Note the escaped dot and end anchor so only /robots.txt matches.
    url(r'^robots\.txt$', lambda request: HttpResponse("User-Agent: *\nDisallow:", content_type="text/plain"), name="robots_file"),
]

which I think should be slightly more efficient than using a template file, although it could make your URL rules untidy if you need multiple 'Disallow:' entries.
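If you do need several Disallow lines, one way to keep urls.py tidy is to build the body in a small helper and wrap it in a view. A minimal sketch; the helper name, rule dict, and example paths are my own illustrations, not from the linked article, and the Django wiring is shown in comments so the helper itself stays framework-independent:

```python
# Hypothetical helper: keeps multiple robots.txt rules out of urls.py.
# Names (robots_txt_content, ROBOTS_RULES) and paths are illustrative.

ROBOTS_RULES = {
    "*": ["/admin/", "/private/"],   # user-agent -> disallowed prefixes
}

def robots_txt_content(rules):
    """Render a dict of {user_agent: [disallowed_prefixes]} as robots.txt text."""
    blocks = []
    for agent, paths in rules.items():
        lines = [f"User-Agent: {agent}"]
        # An empty list means "allow everything", expressed as a bare Disallow:
        lines += [f"Disallow: {p}" for p in paths] or ["Disallow:"]
        blocks.append("\n".join(lines))
    return "\n\n".join(blocks) + "\n"

# In urls.py the view would just wrap this (sketch, assuming Django >= 2.0):
#
#     from django.http import HttpResponse
#     from django.urls import path
#
#     def robots_txt(request):
#         return HttpResponse(robots_txt_content(ROBOTS_RULES),
#                             content_type="text/plain")
#
#     urlpatterns = [path("robots.txt", robots_txt)]
```

This keeps the URL pattern to a single line while the rules live in one readable place.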

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
