Robots.txt Generator
Web & SEO Tools
Generate robots.txt file
```
User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow: /cgi-bin/

User-agent: Googlebot
Disallow: /admin/
Disallow: /private/
Disallow: /cgi-bin/

User-agent: Bingbot
Disallow: /admin/
Disallow: /private/
Disallow: /cgi-bin/

# Allow public access to most content
Sitemap: /sitemap.xml
```
How to Use
1. Enter your website URL
2. Configure which user agents to target (* for all)
3. Add paths you want search engines to avoid
4. Add your sitemap URLs
5. Copy or download the generated robots.txt file
6. Place the file in your website root directory
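The steps above can be sketched in Python. This is a minimal illustration of how such a generator might assemble the file; the function name and parameters are hypothetical, not this tool's actual API:

```python
def build_robots_txt(disallow, allow=(), user_agent="*", sitemap_url=None):
    """Assemble a robots.txt string from crawl rules.

    disallow/allow are iterables of URL paths; sitemap_url, if given,
    should be an absolute URL.
    """
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in disallow]
    lines += [f"Allow: {path}" for path in allow]
    if sitemap_url:
        lines += ["", f"Sitemap: {sitemap_url}"]
    return "\n".join(lines) + "\n"

print(build_robots_txt(
    ["/admin/", "/cart/", "/checkout/"],
    allow=["/products/", "/blog/"],
    sitemap_url="https://mystore.com/sitemap.xml",
))
```

Writing the returned string to a file named `robots.txt` and uploading it to the site root completes steps 5 and 6.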
Example / Use Case
Create Robots.txt for E-commerce Site
An e-commerce website owner needs to prevent search engines from indexing checkout and admin pages while ensuring product pages are crawled.
Input
Domain: mystore.com | Block: /admin, /cart, /checkout | Allow: /products, /blog | Include Sitemap: yes
Output
```
User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /checkout/
Allow: /products/
Allow: /blog/

Sitemap: https://mystore.com/sitemap.xml
```
How It Works
The robots.txt file uses the Robots Exclusion Protocol to communicate with web crawlers. Each ruleset consists of a User-agent line followed by rules. Key directives:

- Disallow: paths crawlers must not access
- Allow: paths crawlers may access (used to re-open subdirectories inside an otherwise disallowed directory)
- Sitemap: the location of your XML sitemap

Common user-agents include Googlebot, Bingbot, or * for all bots. The file must be placed at the root of your domain. Remember: robots.txt controls crawling, not indexing. To keep a page out of search results, use a noindex meta tag instead. For creating XML sitemaps, use our Sitemap XML Generator.
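You can sanity-check these directives with Python's standard-library `urllib.robotparser`, which implements the same exclusion protocol. The domain and paths below are illustrative:

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /checkout/
Allow: /products/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Disallowed path: crawling is blocked for all user agents.
print(parser.can_fetch("*", "https://mystore.com/admin/users"))      # False
# Allowed path: product pages remain crawlable.
print(parser.can_fetch("*", "https://mystore.com/products/widget"))  # True
```

Running a check like this before deploying helps catch rules that accidentally block pages you want crawled.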
Frequently Asked Questions
Do I need a robots.txt file?
It's not required but highly recommended. Without one, search engines will crawl everything, which can waste crawl budget on unimportant pages.