Laravel 5 robots.txt Route
A route for serving a basic robots.txt in Laravel 5.1+, based on configuration settings.
This is a fork of ellisthedev/laravel-5-robots,
which was a fork of jayhealey/Robots,
which was based on earlier work.
The purpose of this fork is to create a set-it-and-forget-it package that can be
installed without much effort. It is therefore highly opinionated and not built
for configuration.
When enabled, it allows access to all clients and advertises the sitemap.xml.
Otherwise, it operates almost identically to Laravel's default configuration,
denying access to all clients.
Installation
Step 1: Composer
Via the Composer command line:
$ composer require infusionweb/laravel-robots-route
Or add the package to your composer.json:
{
    "require": {
        "infusionweb/laravel-robots-route": "~0.1.0"
    }
}
Step 2: Remove the existing robots.txt
Laravel ships with a default robots.txt which disallows all clients. It needs to be removed for the configured route to work.
$ rm public/robots.txt
Step 3: Enable the route
Add the service provider to your config/app.php:
'providers' => [
    //
    InfusionWeb\Laravel\Robots\RobotsServiceProvider::class,
];
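For context, the route registered by the provider can be sketched roughly as follows. This is a hypothetical illustration of the behavior described above, not the package's actual implementation; the closure body and response details are assumptions.

```php
// Hypothetical sketch: a robots.txt route driven by the 'robots.allow'
// config value. Illustrative only; not the package's real code.
Route::get('robots.txt', function () {
    if (config('robots.allow')) {
        // Allow all clients and advertise the sitemap.
        $content = "User-agent: *\nAllow: /\nSitemap: " . url('sitemap.xml');
    } else {
        // Deny all clients, mirroring the default behavior.
        $content = "User-agent: *\nDisallow: /";
    }

    return response($content, 200)->header('Content-Type', 'text/plain');
});
```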
Publish the package config file:
$ php artisan vendor:publish --provider="InfusionWeb\Laravel\Robots\RobotsServiceProvider"
You may now allow clients via robots.txt by editing the config/robots.php file, opening the site to search engines:
return [
    'allow' => env('ROBOTS_ALLOW', true),
];
Or simply set the ROBOTS_ALLOW environment variable to true, via the Laravel .env file or your hosting environment.
ROBOTS_ALLOW=true
License
The MIT License (MIT). Please see License File for more information.