[Django]-What is the robots.txt warning in Django and how should it be handled?

5👍

robots.txt is a standard file that tells web crawlers, such as those used by search engines, which pages they may crawl and index.

To resolve the issue, you can either host your own version of robots.txt statically, or use a package like django-robots.
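
If you go the static route, a common pattern is to serve the file through Django's built-in TemplateView. The sketch below assumes you have a plain-text template named robots.txt in one of your template directories; adjust the paths to your project layout.

```python
# urls.py -- a minimal sketch of serving robots.txt from a template.
# Assumes a plain-text template at templates/robots.txt, e.g.:
#   User-agent: *
#   Disallow: /admin/
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    path(
        "robots.txt",
        TemplateView.as_view(template_name="robots.txt", content_type="text/plain"),
    ),
    # ... your other URL patterns ...
]
```

Setting content_type="text/plain" matters here, since crawlers expect robots.txt to be served as plain text rather than HTML.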

It’s odd that you’re seeing the error in development unless you or your browser is trying to explicitly access it.

In production, if you’re concerned with SEO, you’ll likely also want to register your site with each search engine’s webmaster tools, for example Google Webmaster Tools.

https://en.wikipedia.org/wiki/Robots_exclusion_standard

https://support.google.com/webmasters/answer/6062608?hl=en

👤whp

0👍

robots.txt is a file used to manage the behavior of crawling robots (such as search-index bots like Google’s). It determines which paths/files the bots should include in their results. If things like search engine optimization are not relevant to you, don’t worry about it.

If you do care, you might want to use a Django-native implementation of robots.txt management, such as the django-robots package mentioned above.
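
For reference, a rough sketch of wiring up django-robots follows; the exact settings can vary between versions of the package, so treat this as an outline and check the django-robots documentation for the details.

```python
# settings.py -- a rough sketch; consult the django-robots docs for specifics.
INSTALLED_APPS = [
    # ...
    "django.contrib.sites",  # django-robots builds on the sites framework
    "robots",
]
SITE_ID = 1

# urls.py -- route robots.txt to the package's URL configuration.
from django.urls import include, path

urlpatterns = [
    path("robots.txt", include("robots.urls")),
    # ... your other URL patterns ...
]
```

With this in place, the rules served at /robots.txt are managed as database records, typically edited through the Django admin rather than a static file.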

👤MrName

0👍

The robots.txt file follows the Robots Exclusion Standard; see the Wikipedia article linked above for more information.

Here is an example of Google’s robots.txt: https://www.google.com/robots.txt

For a good example of how to set one up, use the question “What are recommended directives for robots.txt in a Django application?” as a reference.
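
As an illustration of the kind of directives that question discusses, here is a sketch of a hand-rolled view that returns a robots.txt for a typical Django project. The disallowed paths (such as /admin/) and the sitemap URL are only examples, not a recommendation for every site.

```python
# views.py -- a sketch of a hand-rolled robots.txt view; the paths listed
# are illustrative examples only.
from django.http import HttpResponse


def robots_txt(request):
    lines = [
        "User-agent: *",
        "Disallow: /admin/",     # keep crawlers out of the admin site
        "Disallow: /accounts/",  # example: private account pages
        "Sitemap: https://example.com/sitemap.xml",
    ]
    return HttpResponse("\n".join(lines), content_type="text/plain")


# urls.py: path("robots.txt", views.robots_txt)
```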
