Creating a robots.txt file

Matrix allows you to create a robots.txt file to restrict access to sections of your site by robot (or spider) programs.

The permissions that you set for your site also restrict access by robot programs. For example, if you have an intranet site or a members-only area for which public read access has not been granted, Matrix will not allow robot programs to access those areas.

The steps to create a robots.txt file are as follows:

  1. Create a text file asset named robots.txt in the root directory of the site to which you wish to restrict access.

  2. You can then either upload a pre-existing robots.txt file or enter the appropriate configuration on the Edit text screen of the text file asset. The syntax of a robots.txt file is documented at http://www.robotstxt.org/. An example entry that restricts access by all robots to all areas of your site is shown below:

User-agent: *
Disallow: /
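
If you only want to keep robots out of particular sections, list each section in its own Disallow line. The paths below (/intranet/ and /members/) are illustrative only; replace them with the URLs of the areas of your own site:

User-agent: *
Disallow: /intranet/
Disallow: /members/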
  3. Clear Allow unrestricted access on the Details screen of the text file asset.

  4. Set the text file asset's status to Live and ensure that you have granted public read access.

Once you have created your robots.txt file, you can check that it is configured correctly using one of the various online robots.txt testing tools.
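
You can also test the file from your own machine. The sketch below uses Python's standard-library urllib.robotparser to check whether particular URLs would be allowed; the www.example.com domain and the paths are placeholders for your own site.

from urllib.robotparser import RobotFileParser

# Placeholder site; substitute your own domain.
SITE = "https://www.example.com"

# Fetch and parse the live robots.txt file.
parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()

# Check whether a generic robot may fetch some representative URLs.
for path in ("/", "/intranet/", "/members/news"):
    allowed = parser.can_fetch("*", SITE + path)
    print(f"{path}: {'allowed' if allowed else 'disallowed'}")

If the output does not match what you expect, review the Disallow rules in the text file asset and confirm that the asset is live and publicly readable.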