Step 3: Create a robots.txt file
Overview
A robots.txt file is an industry-standard file used to control 'spider' or 'robot' crawlers. You create a robots.txt file in your Matrix instance to prevent crawlers from accessing all or some of your websites.
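For example, a minimal robots.txt that blocks every crawler from an entire site consists of two directives (this is a generic sketch of the standard format, not Matrix-specific content):

    User-agent: *
    Disallow: /

To block crawlers from only part of a site, disallow a specific path instead. The /private/ directory here is purely illustrative:

    User-agent: *
    Disallow: /private/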
Create the file
Setting up a robots.txt file is covered in the Matrix documentation.
To find out how and where to create your robots.txt file, read Create a robots.txt file.