hugsbrugs/php-robots-txt

PHP Robots.txt Utilities
  • Saturday, April 8, 2017
  • by hugsbrugs


php-robots-txt

This library provides utility functions to ease robots.txt manipulation. If you want to check whether URLs respect a site's robots.txt policy, with an optional cache, then it's your lucky day ;)


Install

Install the package with Composer:

composer require hugsbrugs/php-robots-txt

In your PHP code, load the library:

require_once __DIR__ . '/../vendor/autoload.php';
use Hug\Robots\Robots;

Usage

Returns whether a page is accessible under the site's robots.txt policy. Optionally pass a user agent to also check against user-agent-specific rules:

Robots::is_allowed($url, $user_agent = null);
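For example, a minimal check might look like this (the URL and user-agent string below are hypothetical, chosen for illustration):

```php
<?php

require_once __DIR__ . '/vendor/autoload.php';

use Hug\Robots\Robots;

// Hypothetical target URL and crawler user agent
$url = 'https://example.com/private/page.html';
$user_agent = 'MyCrawler/1.0';

// Check against the global (wildcard) rules only
if (Robots::is_allowed($url)) {
    echo "Allowed by global rules\n";
}

// Also check rules specific to this user agent
if (Robots::is_allowed($url, $user_agent)) {
    echo "Allowed for MyCrawler/1.0\n";
}
```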

With this simple method, a request to the remote robots.txt is fired on every call. To enable a cache, define the following constants:

define('HUG_ROBOTS_CACHE_PATH', '/path/to/robots-cache/');
define('HUG_ROBOTS_CACHE_DURATION', 7*86400);

The cache duration is in seconds (86400 = 1 day). Don't forget to make the cache path writable by the web server user. Cached robots.txt files are gzcompressed to save disk space.
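Putting it together, a cached setup might look like the sketch below. The constants are defined before the library is used, per the instructions above; the cache path is an example value:

```php
<?php

// Enable a 7-day cache in a writable directory (example path)
define('HUG_ROBOTS_CACHE_PATH', __DIR__ . '/robots-cache/');
define('HUG_ROBOTS_CACHE_DURATION', 7 * 86400);

require_once __DIR__ . '/vendor/autoload.php';

use Hug\Robots\Robots;

// The first call downloads and caches robots.txt; subsequent calls
// read the compressed cache file until it expires.
$allowed = Robots::is_allowed('https://example.com/page.html');
```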

You should not need the following methods unless you want to dig into the code and tweak it:

Robots::download_robots($url, $user_agent);
Robots::get_robots($url, $user_agent);
Robots::is_cache_obsolete($file);
Robots::empty_cache();
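As one example of these lower-level methods, `Robots::empty_cache()` could be called from a maintenance script to purge all cached robots.txt files at once (a sketch, assuming the cache constants are already defined):

```php
<?php

require_once __DIR__ . '/vendor/autoload.php';

use Hug\Robots\Robots;

// Purge every cached robots.txt file, e.g. from a cron job;
// the next is_allowed() call will re-download fresh copies.
Robots::empty_cache();
```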

Unit Tests

composer exec phpunit

Author

Hugo Maugey - visit my website ;)
