
Creating a robots.txt

By placing a robots.txt file in your site's root directory, you can prevent robots (crawlers) from accessing parts or all of your website.

Robots.txt files are plain text files, so you can create one with a simple text editor such as Notepad on Windows or TextEdit on Mac OS X. To prevent a robot from accessing your site, you need only two rules in your robots.txt file:

  • User-agent: the bot to which the following rules apply.
  • Disallow: the URL path you want to block.
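
For instance, the simplest robots.txt, which asks every compliant bot to stay away from the entire site, consists of just these two lines:

User-agent: *
Disallow: /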

Please note that each subdomain of your site should have its own robots.txt file.
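
For example (using example.com purely as a placeholder domain), a file at www.example.com/robots.txt covers only the www subdomain, so a blog hosted on its own subdomain needs a separate file:

www.example.com/robots.txt    (applies to www.example.com only)
blog.example.com/robots.txt   (applies to blog.example.com only)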

To create your robots.txt, follow these steps:

1 – Open up a text editor

You can use Notepad on Windows or TextEdit on Mac OS X.

2 – Specify directives

Write only one directive per line. Within a group, there must be no blank lines between the User-agent line and its last Disallow line, and separate User-agent/Disallow groups must be divided by a blank line. If you’re not sure about the syntax of your directives, read my Robots.txt Syntax guide.

Example of a robots.txt:

# All bots: stay out of the admin, images, and plugin directories
User-agent: *
Disallow: /wp-admin/
Disallow: /images/
Disallow: /wp-content/plugins/

# WebReaper: blocked from the entire site
User-agent: WebReaper
Disallow: /

3 – Save and upload

Save your file in plain text format and name it robots.txt (make sure the file extension is .txt). Then place the robots.txt file in the root directory of your website.
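
Once uploaded, the file should be reachable directly at the root of your domain; for example (again using example.com as a placeholder), opening this address in a browser should display the rules you wrote:

www.example.com/robots.txt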
