68👍
Yes, if the file is static, robots.txt should not be served by Django; let the web server return it directly. Try something like this in your Nginx config file:
location /robots.txt {
    alias /path/to/static/robots.txt;
}
See here for more info: https://nginx.org/en/docs/http/ngx_http_core_module.html#alias
Same thing applies to the favicon.ico file if you have one.
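For example, a matching block for the favicon might look like this (a sketch, assuming the favicon lives in the same /path/to/static/ directory):

location /favicon.ico {
    alias /path/to/static/favicon.ico;
    # Optional: keep missing-favicon requests out of the logs.
    log_not_found off;
    access_log off;
}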
The equivalent directive for an Apache config is:
Alias /robots.txt /path/to/static/robots.txt
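Note that on Apache 2.4+ you may also need to grant filesystem access to the directory holding the file if your config does not already cover it. A sketch, assuming the file sits under /path/to/static/:

<Directory "/path/to/static">
    Require all granted
</Directory>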
13👍
I know this is a late reply; I was looking for a similar solution when I didn't have access to the web server config. For anyone else in the same situation, I found this page: http://www.techstricks.com/adding-robots-txt-to-your-django-project/
which suggests adding this to your project's urls.py:
from django.conf.urls import url  # deprecated; on Django 2.0+ use re_path from django.urls
from django.http import HttpResponse

urlpatterns = [
    # ... your project urls
    url(r'^robots\.txt$', lambda request: HttpResponse("User-Agent: *\nDisallow:", content_type="text/plain"), name="robots_file"),
]
which I think should be slightly more efficient than using a template file, although it could make your URL rules untidy if you need multiple 'Disallow:' rules; see the sketch below for one way to keep things readable.
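If you do need several rules, a small dedicated view keeps the urlconf tidy. A minimal sketch, assuming a modern Django (2.0+) project; the robots_txt view name and the disallowed paths are hypothetical:

from django.http import HttpResponse
from django.urls import path

def robots_txt(request):
    # Hypothetical example rules; adjust the paths for your site.
    lines = [
        "User-Agent: *",
        "Disallow: /admin/",
        "Disallow: /private/",
    ]
    return HttpResponse("\n".join(lines), content_type="text/plain")

urlpatterns = [
    # ... your project urls
    path("robots.txt", robots_txt, name="robots_file"),
]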