- Name: perl-WWW-RobotRules
- Version: 6.10.0
- Release: 1.mga1
- Epoch:
- Group: Development/Perl
- License: GPL+ or Artistic
- Url: http://search.cpan.org/dist/WWW-RobotRules
- Summary: Parse /robots.txt file
- Architecture: noarch
- Size: 10772
- Distribution: Mageia
- Vendor: Mageia.Org
- Packager: Mageia Team <http://www.mageia.org>
Description:
This module parses _/robots.txt_ files as specified in "A Standard for
Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
Webmasters can use the _/robots.txt_ file to forbid conforming robots
from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed _/robots.txt_
files on any number of hosts.
The following methods are provided:
- $rules = WWW::RobotRules->new($robot_name)
- $rules->parse($robots_url, $content, $fresh_until)
- $rules->allowed($uri)
- $rules->agent([$name])
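A minimal usage sketch follows. The robot name, URLs, and the fetch via
LWP::Simple are illustrative assumptions; only the WWW::RobotRules calls
come from the module's documented interface.

    use WWW::RobotRules;
    use LWP::Simple qw(get);   # assumed here only to fetch the file

    # Create a rules object identifying our robot by name
    my $rules = WWW::RobotRules->new('ExampleBot/1.0');

    # Fetch and parse a site's /robots.txt (URL is illustrative)
    my $robots_url = 'http://example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Check whether a given URL may be visited by this robot
    my $target = 'http://example.com/private/page.html';
    if ($rules->allowed($target)) {
        print "Fetching $target is allowed\n";
    } else {
        print "Access to $target is disallowed by robots.txt\n";
    }

The same $rules object can then parse _/robots.txt_ files from other
hosts and answer allowed() queries for any of them.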
- BuildArch:
- ExcludeArch:
- ExclusiveArch:
- Cookie: ecosse 1300106791
- Buildhost: ecosse