What is robots.txt?
The robots.txt file tells search engine crawlers which pages they can and cannot access on your site. It must live at the root of your domain (e.g. yoursite.com/robots.txt). Use it to block admin pages, duplicate content, or private areas. Important: robots.txt is a suggestion, not a security measure. Well-behaved crawlers honor it, but malicious bots ignore it, so sensitive pages should use password protection instead.
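For example, a minimal robots.txt that allows all crawlers everywhere except an admin area (the `/admin/` path here is illustrative) looks like this:

```
# Apply to all crawlers
User-agent: *
# Block the admin area; everything else remains crawlable
Disallow: /admin/
```

An empty `Disallow:` line (or omitting it entirely) would allow full access.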
Quick Presets
Add Custom Rules
Which search engine crawler this rule applies to
The URL path to allow or block. Use * as a wildcard.
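As a sketch of wildcard usage: the rule below blocks all PDF files. Note that `*` (match any characters) and the `$` end-of-URL anchor are extensions honored by major crawlers such as Googlebot and Bingbot, not part of the original Robots Exclusion Protocol, so smaller crawlers may not support them.

```
User-agent: *
# Block any URL ending in .pdf ("$" anchors the end of the URL)
Disallow: /*.pdf$
```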
Sitemap URL (Optional)
Help search engines find your sitemap
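A Sitemap line can appear anywhere in the file and must use the full absolute URL (yoursite.com is a placeholder):

```
Sitemap: https://yoursite.com/sitemap.xml
```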
Current Rules
No rules added yet. Use presets or add custom rules.
# robots.txt
# Generated by SEO Toolkit

User-agent: *
Allow: /
💡 Common Paths to Block
/admin/ - Admin panels
/wp-admin/ - WordPress admin
/cart/ - Shopping carts
/search/ - Search result pages
/*?* - URLs with parameters
/private/ - Private content
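Putting several of the paths above together, a typical WordPress-style ruleset might look like the sketch below. The `Allow` line for `admin-ajax.php` is a common (but optional) addition so that front-end AJAX features keep working even though /wp-admin/ is blocked.

```
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /search/
# Block URLs with query-string parameters
Disallow: /*?*
# Re-allow the AJAX endpoint inside the blocked directory
Allow: /wp-admin/admin-ajax.php
```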