- Name: perl-WWW-RobotRules
- Version: 6.02
- Release: 1
- Epoch:
- Group: Development/Languages/Perl
- License: GPL v1+ or Artistic
- Url: http://search.cpan.org/dist/WWW-RobotRules/
- Summary: WWW::RobotRules - database of robots.txt-derived permissions
- Architecture: noarch
- Size: 21051
- Distribution: PLD
- Vendor: pld
- Packager:
Description:
This module parses /robots.txt files as specified in "A Standard for
Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
Webmasters can use the /robots.txt file to forbid conforming robots
from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check whether access to a given URL is prohibited.
The same WWW::RobotRules object can be used for one or more parsed
/robots.txt files from any number of hosts.
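The check described above (parse a /robots.txt file, then ask whether a URL may be fetched) can be sketched outside Perl as well. As a rough, hedged illustration only, here is the same idea using Python's standard-library urllib.robotparser; the sample robots.txt content and URLs are invented for the example and are not part of this package:

```python
from urllib import robotparser

# A minimal, invented robots.txt that forbids conforming robots
# from accessing the /private/ part of the site.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

# Parse the rules into a reusable object, analogous to a
# WWW::RobotRules object holding parsed /robots.txt data.
rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Query whether a given robot may access a given URL.
print(rp.can_fetch("MyBot", "http://example.com/public/page.html"))    # True
print(rp.can_fetch("MyBot", "http://example.com/private/secret.html")) # False
```

As with WWW::RobotRules, the parser object can be reused to answer any number of per-URL queries after a single parse.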
- OptFlags: -O2 -fwrapv -pipe -Wformat -Werror=format-security -gdwarf-4 -fno-debug-types-section -fvar-tracking-assignments -g2 -Wp,-D_FORTIFY_SOURCE=2 -fstack-protector --param=ssp-buffer-size=4 -fPIC -march=x86-64
- Cookie:
- Buildhost: ymir-builder