Robots.txt Generator for Blogger

Important: Blogger automatically generates a robots.txt file for every blog. Use this reference to verify the defaults or to add custom rules through Blogger's official settings.

Recommended Configuration

User-agent: *
Disallow: /search
Disallow: /p/
Allow: /

# The Sitemap directive must be a full URL, e.g.:
Sitemap: https://example.blogspot.com/sitemap.xml
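The recommended rules above can be checked locally before publishing. This is a minimal sketch using Python's standard-library robots.txt parser; the rules string mirrors the configuration above, and the sample paths are placeholders.

```python
# Verify robots.txt rules locally with urllib.robotparser (stdlib).
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
Disallow: /p/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Search/label result pages are blocked by "Disallow: /search"...
print(parser.can_fetch("*", "/search/label/news"))       # False
# ...while ordinary post URLs remain crawlable.
print(parser.can_fetch("*", "/2024/01/my-post.html"))    # True
```

Note that Python's parser applies the first matching rule in a group, whereas Google uses the most specific rule; for a simple rule set like this one, both agree.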

Custom Rules Explanation

# Block specific user-agents
User-agent: BadBot
Disallow: /

# Allow main content
User-agent: *
Allow: /feeds/posts/default
Allow: /feeds/posts/summary

# Block sensitive areas
Disallow: /search
Disallow: /p/  # Static pages (remove this line if you want them indexed)
Disallow: /comments/
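How a crawler picks a group can also be sketched: a named group such as `User-agent: BadBot` takes precedence over the wildcard group, but only for that bot. A minimal demonstration with Python's standard-library parser (the agent names and paths are illustrative):

```python
# Demonstrate per-user-agent group matching in robots.txt.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: BadBot
Disallow: /

User-agent: *
Allow: /feeds/posts/default
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# BadBot matches its named group and is blocked everywhere.
print(parser.can_fetch("BadBot", "/feeds/posts/default"))     # False
# Other crawlers fall through to the wildcard group.
print(parser.can_fetch("Googlebot", "/feeds/posts/default"))  # True
```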

Implementation Steps:

  1. Go to Blogger Dashboard → Settings → Crawlers and indexing
  2. Add custom rules in "Custom robots.txt content" section
  3. Combine with meta tags for better control
  4. Always test using Google Search Console
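For step 3, combining robots.txt with meta tags can look like the following theme-edit sketch. It assumes a v2/v3 Blogger theme where the `data:view.isSearch` attribute is available; verify the attribute against your theme version before relying on it.

<head>
  <b:if cond='data:view.isSearch'>
    <!-- Keep search/label result pages out of the index -->
    <meta content='noindex' name='robots'/>
  </b:if>
</head>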

Best Practices

  • Allow access to CSS/JS files
  • Block duplicate-content pages such as /search (label and archive listings)
  • Use noindex meta tags rather than Disallow for pages you want kept out of search results — a page blocked by robots.txt is never crawled, so its noindex tag is never seen and the URL may still appear in results
  • Submit sitemap through Google Search Console