🤖 Robots.txt Generator
Create a robots.txt file to control search engine crawlers.
About robots.txt
The robots.txt file tells search engine crawlers which pages they may or may not request from your site. It must be placed at the root of your website (e.g., https://example.com/robots.txt). Note that it is advisory, not access control: compliant crawlers honor it, but it does not prevent access to a URL.
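To see how a compliant crawler interprets these rules, here is a minimal sketch using Python's standard-library urllib.robotparser (the rules and URLs are illustrative placeholders, not output from this generator):

```python
from urllib.robotparser import RobotFileParser

# Parse an inline robots.txt; a real crawler would instead call
# rp.set_url("https://example.com/robots.txt") followed by rp.read().
rp = RobotFileParser()
rp.parse("""User-agent: *
Disallow: /private/""".splitlines())

# A path under /private/ is blocked for all user agents...
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
# ...while other paths remain crawlable.
print(rp.can_fetch("*", "https://example.com/about.html"))  # True
```

Python's parser applies rules in file order, so results can differ slightly from Google's most-specific-match behavior when Allow and Disallow overlap.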
Common Directives
- User-agent: Specifies which crawler the rules apply to (* means all)
- Disallow: Paths that crawlers should not access
- Allow: Paths that crawlers may access even under a broader Disallow (supported by major crawlers such as Googlebot and Bingbot)
- Sitemap: Location of your XML sitemap
- Crawl-delay: Seconds to wait between requests (not supported by all crawlers)
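A robots.txt file combining these directives might look like the following (the paths and sitemap URL are illustrative placeholders):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/help.html
Crawl-delay: 10

# More specific rules for a single crawler
User-agent: Googlebot
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

Rule groups are separated by blank lines, and a crawler uses the most specific User-agent group that matches it. Googlebot, for example, ignores Crawl-delay entirely, so the delay above only affects crawlers that support that directive.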