robots.txt file

How can I have my own robots.txt file?

A robots.txt file tells search engine crawlers which areas of a website they may or may not visit.
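For example, a minimal robots.txt that asks all crawlers to stay out of one directory might look like this (the /private/ path is just an illustration; substitute whatever part of your site you want excluded):

    User-agent: *
    Disallow: /private/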

Crawlers only look for this file at the top level of a web server.

We can't let people change the one for www.cse.unsw.edu.au, but there is a way to get your own "top level" web area.

You should set your homepage up to operate only from the username.web.cse.unsw.edu.au address, as described in Personal Domain.

Then, if you put a robots.txt file in your public_html directory, it will appear at the top level of that domain.
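For example, assuming the setup described above, a file saved as ~/public_html/robots.txt should then be served at https://username.web.cse.unsw.edu.au/robots.txt (with username replaced by your own), which is where crawlers will look for it.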

Last edited by jbc 17/04/2020

Tags for this page:

robots, web