crawler.robotAgent

Background

This string is compared against the User-agent lines of any robots.txt files found during a crawl, to determine whether the file contains restrictions specific to the Funnelback crawler.

The comparison is case-insensitive, so it will match both of the following:

User-Agent: funnelback
User-Agent: FunnelBACK

The user-agent check is an exact (equals) match, not a substring match, so it will not match

User-Agent: funnelback crawler

Regardless of the agent name, the web crawler will still obey any rules in the robots.txt file that are meant to apply to all web crawlers.
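For illustration, a hypothetical robots.txt file might look like the following; the paths shown are placeholders, not part of any real site:

User-agent: Funnelback
Disallow: /staging/

User-agent: *
Disallow: /private/

With the default crawler.robotAgent value, the Funnelback section is recognised because the comparison is case-insensitive and exact. A section headed User-Agent: funnelback crawler would not be recognised, while the wildcard (*) rules are obeyed regardless of the agent name.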

Setting the key

Set this configuration key in the search package or data source configuration.

Use the configuration key editor to add or edit the crawler.robotAgent key, and set the value. The key accepts any valid String value.
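For example, to have the crawler check robots.txt files for a differently named agent, the key could be set to a custom value such as the following (the agent name here is purely illustrative):

crawler.robotAgent=ExampleAgent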

Default value

crawler.robotAgent=Funnelback