A robots.txt file is a plain text file that can be created with any basic text editor, such as Notepad on Windows. To prevent a robot from accessing your site, you need only two directives in your robots.txt file:
- User-agent: the bot to which the following rules apply.
- Disallow: the URL path you want to block.
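For example, a minimal robots.txt that keeps every bot out of a single directory needs just these two lines (the /private/ path here is only a placeholder):

User-agent: *
Disallow: /private/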
Please note that each subdomain of your site should have its own robots.txt file.
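For instance, assuming your site is example.com and your blog lives on a subdomain (both names are placeholders), crawlers will look for two separate files, and the rules in one have no effect on the other:

http://example.com/robots.txt
http://blog.example.com/robots.txt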
To create your robots.txt follow these steps:
1 – Open up a text editor
You can use Notepad on Windows or TextEdit on Mac OS X.
2 – Specify directives
Write only one directive per line. Within a group, there must be no blank lines between the User-agent line and its last Disallow line; separate one User-agent/Disallow group from the next with a blank line. If you're not sure about the syntax of your directives, read my Robots.txt Syntax guide.
Example of a robots.txt:
User-agent: *
Disallow: /wp-admin/
Disallow: /images/
Disallow: /wp-content/plugins/

User-agent: WebReaper
Disallow: /
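Here all bots are blocked from the /wp-admin/, /images/ and /wp-content/plugins/ directories, while the WebReaper bot is blocked from the whole site, since Disallow: / matches every address.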
3 – Save and upload
Save your file in plain text format and name it robots.txt, making sure the file extension is .txt. You have to place the robots.txt file in the root directory of your website.
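Once uploaded, you can check that everything worked by opening the file in your browser; assuming your domain is example.com (a placeholder), it should load at http://example.com/robots.txt.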