Dynamic robots.txt ServiceProvider for Laravel 🤖

Set the robots.txt content dynamically based on the Laravel app environment.

Installation

Composer

composer require verschuur/laravel-robotstxt

Manual

Add the following to your composer.json and then run composer install.

{
    "require": {
        "verschuur/laravel-robotstxt": "^3.0"
    }
}

Service provider registration

This package supports Laravel's package auto-discovery, so no manual registration is required. If you wish to register the package manually, add the ServiceProvider to the providers array in config/app.php.

Verschuur\Laravel\RobotsTxt\Providers\RobotsTxtProvider::class
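
For example, a minimal sketch of the providers array in config/app.php with the provider registered manually (only needed when auto-discovery is disabled):

'providers' => [
    // ... other service providers ...
    Verschuur\Laravel\RobotsTxt\Providers\RobotsTxtProvider::class,
],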

Usage

Basic usage

This package adds a /robots.txt route to your application. Remember to remove the physical robots.txt file from your /public directory, or else it will take precedence over Laravel's route and this package will not work.

By default, the production environment will show

User-agent: *
Disallow:

while every other environment will show

User-agent: *
Disallow: /

Thus, a default install allows all robots in the production environment, while disallowing all robots in every other environment.
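
Which content is served depends entirely on the application environment. As a reminder (this is standard Laravel behaviour, not specific to this package), the environment is typically set through the APP_ENV variable in your .env file:

APP_ENV=production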

Custom settings

If you need custom settings, publish the configuration file:

php artisan vendor:publish --provider="Verschuur\Laravel\RobotsTxt\Providers\RobotsTxtProvider"

This will copy the robots-txt.php config file to your app's config folder. In this file you will find the following array structure:

'environments' => [
    '{environment name}' => [
        'paths' => [
            '{robot name}' => [
                'disallow' => [
                    ''
                ],
                'allow' => []
            ],
        ]
    ]
]

In which:

  • {environment name}: the environment for which to define the paths.
  • {robot name}: the robot for which to define the paths.
  • disallow: all entries which will be used by the disallow directive.
  • allow: all entries which will be used by the allow directive.

By default, the environment name is set to production with a robot name of * and a disallow entry consisting of an empty string. This will allow all bots to access all paths on the production environment.

Note: If you do not define any environments in this configuration file (i.e. an empty configuration), the default will always be to disallow all bots for all paths.
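
Putting these defaults together, the shipped configuration corresponds to something like the following sketch (the sitemaps key is explained further below):

'environments' => [
    'production' => [
        'paths' => [
            '*' => [
                'disallow' => [
                    ''
                ],
                'allow' => []
            ],
        ],
        'sitemaps' => [
            'sitemap.xml'
        ]
    ]
]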

Examples

For brevity, the outer environments array key is omitted in these examples.

Allow all paths for all robots on production, and disallow all paths for every robot in staging:

'production' => [
    'paths' => [
        '*' => [
            'disallow' => [
                ''
            ]
        ]
    ]
],
'staging' => [
    'paths' => [
        '*' => [
            'disallow' => [
                '/'
            ]
        ]
    ]
]
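
With this configuration, the production environment renders as

User-agent: *
Disallow:

while staging renders as

User-agent: *
Disallow: /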

Allow all paths for robot bender on production, but disallow /admin and /images for robot flexo:

'production' => [
    'paths' => [
        'bender' => [
            'disallow' => [
                ''
            ]
        ],
        'flexo' => [
            'disallow' => [
                '/admin',
                '/images'
            ]
        ]
    ]
],
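
With this configuration, the rendered file would contain one block per robot, roughly:

User-agent: bender
Disallow:

User-agent: flexo
Disallow: /admin
Disallow: /images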

Allow directive

In addition to the disallow directive, the allow directive is also supported.

Allow a path, but disallow one of its sub-paths:

'production' => [
    'paths' => [
        '*' => [
            'disallow' => [
                '/foo/bar'
            ],
            'allow' => [
                '/foo'
            ]
        ]
    ]
],

When the file is rendered, the disallow directives will always be placed before the allow directives.
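
For the configuration above, the rendered output would therefore look roughly like:

User-agent: *
Disallow: /foo/bar
Allow: /foo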

If you don't need one of the directives and wish to keep the configuration file clean, you can simply remove its key from the array entirely.

Sitemaps

This package also allows you to add sitemaps to the robots file. By default, the production environment will add a sitemap.xml entry to the file. You can remove this default entry from the sitemaps array if you don't need it.

Because sitemap entries always need to be absolute URLs, they are automatically wrapped in Laravel's url() helper function. The sitemap entries in the config file should therefore be relative to the webroot.

The standard production configuration

'environments' => [
    'production' => [
        'sitemaps' => [
            'sitemap.xml'
        ]
    ]
]
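
Assuming a hypothetical application URL of https://example.com, this entry would render roughly as:

Sitemap: https://example.com/sitemap.xml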

Adding multiple sitemaps

'environments' => [
    'production' => [
        'sitemaps' => [
            'sitemap-articles.xml',
            'sitemap-products.xml',
            'sitemap-etcetera.xml'
        ]
    ]
]

Compatibility

This package is compatible with Laravel 9, 10, 11 and 12. For a complete overview of supported Laravel and PHP versions, please refer to the 'Run tests' workflow.

Testing

PHPUnit test cases are provided in /tests. Run the tests through composer run test or vendor/bin/phpunit --configuration phpunit.xml.

robots.txt reference

The following reference was used while creating this package:

https://developers.google.com/search/reference/robots_txt
