The crawler will ignore a URL if it matches any of these exclude patterns.
Can be set in: collection.cfg
This option is a comma-separated list of substrings used by the crawler to determine whether it will process a web page. If the page's URL contains one of the substrings, the crawler will not process the page.
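The substring test described above can be sketched as follows (a simplified illustration of the matching rule, not the crawler's actual implementation; the function and variable names are hypothetical):

```python
def is_excluded(url: str, exclude_patterns: list[str]) -> bool:
    # A URL is skipped if it contains any of the configured substrings.
    return any(pattern in url for pattern in exclude_patterns)

patterns = ["/cgi-bin/", "/sales/"]
print(is_excluded("http://example.com/sales/report.html", patterns))  # True
print(is_excluded("http://example.com/about.html", patterns))         # False
```

Note that this is a plain substring match anywhere in the URL, so a pattern like `/sales/` will also exclude a page such as `http://example.com/archive/sales/index.html`.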
This option allows the search administrator to exclude an individual web page or an entire site.
See: include and exclude patterns for a description of how include and exclude patterns work, and for advanced techniques such as using a regular expression exclude pattern.
Ignore standard paths, plus a local sales folder.
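A hypothetical collection.cfg entry for this case might look like the following (the option key shown and the folder names are assumptions for illustration; check your product documentation for the exact key):

```
exclude_patterns=/cgi-bin/,/images/,/css/,/sales/
```

Any URL containing one of these substrings, such as `http://example.com/sales/q3.html`, would be skipped by the crawler.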