Specifies a limit on the number of files that will be crawled from a single directory or from a single generator of dynamic URLs.
Can be set in: collection.cfg
This option sets the limit on the number of files crawled within an area. Here an "area" is either
a static directory or a dynamic generator (e.g. a script that produces many URLs).
Note: This parameter was previously called
crawler.max_dir_size; the name was changed to reflect that
generators are also covered by this definition.
If the crawler encounters a directory on a site and downloads multiple files from within it, it will stop downloading any further content from that directory once the specified limit is reached.
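The per-directory behaviour can be sketched as follows. This is a hypothetical illustration of the counting logic, not Funnelback's actual implementation; the URLs and the limit value are invented:

```python
from collections import Counter
from urllib.parse import urlsplit

MAX_FILES_PER_AREA = 3  # illustrative value only; the real limit is configurable

area_counts = Counter()

def area_key(url: str) -> str:
    """Reduce a URL to its 'area': the static directory containing it."""
    parts = urlsplit(url)
    directory = parts.path.rsplit("/", 1)[0] + "/"
    return parts.netloc + directory

def should_download(url: str) -> bool:
    """Allow a download only while the URL's area is under the limit."""
    key = area_key(url)
    if area_counts[key] >= MAX_FILES_PER_AREA:
        return False  # area is full: skip any further files from it
    area_counts[key] += 1
    return True

urls = [f"http://example.com/docs/page{i}.html" for i in range(5)]
decisions = [should_download(u) for u in urls]
# Only the first MAX_FILES_PER_AREA files in /docs/ are fetched.
```

Files in a different directory are unaffected, since each directory is counted as its own area.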
A similar approach is used for generators, e.g. a dynamic script such as index.asp. Once the crawler has downloaded the limit's worth of URLs generated by that script, it will download no more from that generator.
Lotus Notes generator scripts (.nsf) look like directories in their URLs.
For example, if a "publish.nsf" database generates more URLs than the limit, the crawler will not request any more content from it, even though the URLs make it appear that there are further directories or areas underneath it.
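Generator URLs can be folded into the same area-key idea. The sketch below (again hypothetical, with invented URLs) keys a URL with a query string on its script path, and keys a path containing a .nsf segment on everything up to and including that segment, since the .nsf database generates the apparent sub-directories beneath it:

```python
from urllib.parse import urlsplit

def generator_area_key(url: str) -> str:
    """Map a URL to the 'area' that generated it."""
    parts = urlsplit(url)
    segments = parts.path.split("/")
    # A .nsf segment anywhere in the path marks a Lotus Notes generator.
    for i, seg in enumerate(segments):
        if seg.lower().endswith(".nsf"):
            return parts.netloc + "/".join(segments[: i + 1])
    # A query string marks an ordinary dynamic script.
    if parts.query:
        return parts.netloc + parts.path
    # Otherwise fall back to the static directory.
    return parts.netloc + parts.path.rsplit("/", 1)[0] + "/"

# These collapse to one area each, despite differing paths/queries:
a = generator_area_key("http://example.com/db/publish.nsf/view/doc1")
b = generator_area_key("http://example.com/db/publish.nsf/other/doc2")
c = generator_area_key("http://example.com/search/index.asp?page=1")
d = generator_area_key("http://example.com/search/index.asp?page=2")
```

Under this scheme the two publish.nsf URLs count against a single area, as do the two index.asp queries, so the limit applies to the generator as a whole rather than to each apparent sub-path.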
Note: If you are crawling a dynamically generated site where a lot of content comes from a single generator, and you are getting back less content than you expect, you may need to increase the default value of this parameter.
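In collection.cfg this would look something like the following. The key name crawler.max_files_per_area is an assumption based on the rename described above (from crawler.max_dir_size), and the value is illustrative; check your version's documentation for the exact key and default:

```
# Allow up to 20000 files per directory or generator (illustrative value)
crawler.max_files_per_area=20000
```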