- Name: perl-WWW-RobotRules
- Version: 6.10.0
- Release: 3
- Epoch:
- Group: Development/Perl
- License: GPL+ or Artistic
- Url: http://search.cpan.org/dist/WWW-RobotRules
- Summary: Parse /robots.txt file
- Architecture: noarch
- Size: 27199
- Distribution: Mandriva Linux
- Vendor: Mandriva
- Packager: Oden Eriksson <oeriksson@mandriva.com>
Description:
This module parses _/robots.txt_ files as specified in "A Standard for
Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html> Webmasters
can use the _/robots.txt_ file to forbid conforming robots from accessing
parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed _/robots.txt_
files on any number of hosts.
The following methods are provided: new(), parse(), allowed(), and agent().
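As a brief usage sketch (not part of the original package description), the snippet below exercises the documented parse() and allowed() methods; the robot name, host, and inline robots.txt content are made-up placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::RobotRules;

# Create a rules object for our robot; the name is matched against
# User-agent lines in robots.txt (placeholder name).
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# Normally the content would be fetched over HTTP (e.g. with
# LWP::Simple); an inline sample keeps this sketch self-contained.
my $robots_url = 'http://www.example.com/robots.txt';
my $robots_txt = <<'EOT';
User-agent: *
Disallow: /private/
EOT

# Record the rules for this host. A third argument, an expiry
# time for the cached rules, is optional.
$rules->parse($robots_url, $robots_txt);

# allowed() checks a URL against the rules recorded for its host.
print $rules->allowed('http://www.example.com/index.html')
    ? "index.html: allowed\n" : "index.html: denied\n";
print $rules->allowed('http://www.example.com/private/data')
    ? "private/data: allowed\n" : "private/data: denied\n";
```

Because one object can hold rules for any number of hosts, a crawler typically keeps a single WWW::RobotRules instance and calls parse() once for each newly encountered host's _/robots.txt_.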
- OptFlags: -O2 -Wa,--compress-debug-sections -gdwarf-4 -fvar-tracking-assignments -frecord-gcc-switches -Wstrict-aliasing=2 -pipe -Wformat -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -fstack-protector --param=ssp-buffer-size=4 -fPIC
- Cookie: n9.mandriva.com 1327281331
- Buildhost: n9.mandriva.com