
Create a robots.txt file

A robots.txt file is an ordinary text file, which is why it ends in .txt. Your website should have only one robots.txt file, and creating one is not difficult.

To begin, you should know that a robots.txt file uses the following syntax:

User-agent: [name of the spider]
Disallow: [name of the file or directory]

Create a robots.txt file step by step

Step 1

Create a new .txt file. You can do this with your HTML editor or any plain-text editor.

Step 2

Enter the syntax listed above, starting with User-agent: [the name of the spider]. This line names the spider (crawler) the rules apply to; the Disallow lines that follow let you choose the parts of your website you don't want that spider to index.

Step 3

After Disallow: comes the location and name of the files or directories you don't want indexed. You don't allow these to be crawled; you disallow them.
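For example, the following rules disallow one file and one directory for every spider (the names here are made up for illustration):

User-agent: *
Disallow: /private/secret.html
Disallow: /tmp/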

Several characters have a special meaning. A slash (/) after Disallow: means you block access to all directories, and a * after User-agent: means the rule applies to all robots. So a * after User-agent: combined with a slash after Disallow: means that no robot is allowed to visit any directory. Conversely, leaving the Disallow: line empty allows all robots to visit everything.
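The two extremes look like this. This file blocks every robot from the entire site:

User-agent: *
Disallow: /

And this file, with an empty Disallow line, allows every robot to visit every directory:

User-agent: *
Disallow: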

Here you can see an example.

User-agent: *
Disallow: /directory1/
Disallow: /directory2/
Disallow: /directory3/

This example means that all robots may visit all directories except directory1, directory2 and directory3.

There are many more characters with special meanings, and you need to know them to write more advanced rules. Fortunately, they are easy to look up on the Internet.
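You can also check your rules before publishing them. As a minimal sketch, Python's standard urllib.robotparser module can parse the example file above and report which paths a robot may fetch (the page paths used here are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the example above.
rules = """
User-agent: *
Disallow: /directory1/
Disallow: /directory2/
Disallow: /directory3/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Pages inside a disallowed directory are blocked for every robot...
print(parser.can_fetch("*", "/directory1/page.html"))  # False

# ...while pages anywhere else remain crawlable.
print(parser.can_fetch("*", "/blog/post.html"))  # True
```

This is the same check a well-behaved crawler performs before requesting a page, so it is a quick way to confirm your file does what you intend.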