Django: Prevent search engine indexing of internal pages/links


What you’re looking for is a robots.txt file, which asks crawlers not to index those pages:

The robots exclusion standard, also known as the robots exclusion
protocol or robots.txt protocol, is a standard used by websites to
communicate with web crawlers and other web robots. The standard
specifies the instruction format to be used to inform the robot about
which areas of the website should not be processed or scanned. Robots
are often used by search engines to categorize and archive web sites,
or by webmasters to proofread source code.

https://en.wikipedia.org/wiki/Robots_exclusion_standard
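In Django you can serve robots.txt from a plain view. Here is a minimal sketch; the `/internal/` and `/admin/` paths are placeholders for whatever sections you want to keep out of search engines:

```python
# urls.py -- a minimal sketch; the Disallow paths are hypothetical
# examples, replace them with the internal URLs of your own site.
from django.http import HttpResponse
from django.urls import path
from django.views.decorators.http import require_GET


@require_GET
def robots_txt(request):
    lines = [
        "User-agent: *",
        "Disallow: /internal/",  # hypothetical internal section
        "Disallow: /admin/",
    ]
    return HttpResponse("\n".join(lines), content_type="text/plain")


urlpatterns = [
    path("robots.txt", robots_txt),
]
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it does not actually restrict access.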

To keep other people out entirely, check whether it is your IP address connecting to the view, and otherwise return a 404 or whatever you prefer. You can read the client address from the request’s META dictionary, as in the sketch below.
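A minimal sketch of that check, assuming you know your own address; `MY_IP` and `internal_page` are placeholder names:

```python
# views.py -- a minimal sketch; MY_IP and internal_page are
# hypothetical names, not part of any Django API.
from django.http import Http404, HttpResponse

MY_IP = "203.0.113.7"  # hypothetical: replace with your own address


def internal_page(request):
    # REMOTE_ADDR is the client address as Django sees it; if the app
    # sits behind a proxy or load balancer, you may need to inspect
    # the X-Forwarded-For header instead.
    if request.META.get("REMOTE_ADDR") != MY_IP:
        raise Http404
    return HttpResponse("internal content")
```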

— Fabio
