Gatsby plugin that automatically creates robots.txt for your site
gatsby-plugin-robots-txt creates a robots.txt file for your Gatsby site during the build process, giving you an easy way to control crawler behavior on your site.
To install gatsby-plugin-robots-txt, use either yarn or npm:

```shell
yarn add gatsby-plugin-robots-txt
```

or

```shell
npm install --save gatsby-plugin-robots-txt
```
After installation, configure the plugin in your gatsby-config.js file to set options such as the host, sitemap path, and policy rules, as shown below. Options can also be loaded from an external configuration file.
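Here is a minimal sketch of a typical setup; the site URL and sitemap path are placeholder values to replace with your own:

```js
// gatsby-config.js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        // Placeholder URLs; replace with your site's actual values.
        host: 'https://www.example.com',
        sitemap: 'https://www.example.com/sitemap.xml',
        // Allow all crawlers to access the entire site.
        policy: [{ userAgent: '*', allow: '/' }]
      }
    }
  ]
};
```

Alternatively, the plugin's configFile option lets you keep these settings in a separate file, e.g. options: { configFile: 'robots-txt.config.js' }.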
gatsby-plugin-robots-txt is a useful tool for Gatsby site owners who want to create and customize a robots.txt file at build time. With support for custom host and sitemap settings, per-environment policies, and Netlify integration, the plugin provides a comprehensive way to control crawler behavior on Gatsby sites.
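For instance, environment handling lets you serve different policies per environment, such as blocking crawlers on development builds. The sketch below uses the plugin's env option; the specific policies shown are illustrative:

```js
// gatsby-config.js (environment-specific policies)
{
  resolve: 'gatsby-plugin-robots-txt',
  options: {
    host: 'https://www.example.com',
    sitemap: 'https://www.example.com/sitemap.xml',
    env: {
      // Block all crawlers in development builds.
      development: {
        policy: [{ userAgent: '*', disallow: ['/'] }]
      },
      // Allow everything in production.
      production: {
        policy: [{ userAgent: '*', allow: '/' }]
      }
    }
  }
}
```

For Netlify deployments, a resolveEnv function can map Netlify's deploy context to these environment keys, so that deploy previews and branch deploys stay hidden from crawlers while production remains indexable.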