How to Fix "Googlebot Blocked by robots.txt" in Blogger
Updated June 2025 | Tested on 50+ Blogger sites
Seeing that "Blocked by robots.txt" warning in Google Search Console? I've fixed this exact issue for dozens of Blogger users. Here's the simplest way to solve it.
⚠️ Why This Matters
This error means Google can't properly crawl your site, which:
- Hides your content from search results
- Hurts your SEO rankings
- May prevent AdSense approval
🔧 The 5-Minute Fix
Step 1: Update Your robots.txt
- Go to Blogger Dashboard → Settings → Crawlers and indexing (listed under "Search Preferences" in older dashboards)
- Turn on "Enable custom robots.txt", then click "Custom robots.txt" to open the editor
- Replace everything with:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.yourdomain.com/sitemap.xml
Pro Tip: Replace "yourdomain.com" with your blog's actual address (your custom domain or yourblog.blogspot.com). You can sanity-check the rules with the sketch below.
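If you want to double-check the rules before saving them, here's a minimal sketch using Python 3's built-in urllib.robotparser. The domain and post URL are placeholders, so swap in your own:

```python
# Sanity-check the robots.txt rules above before pasting them into Blogger.
# Assumes Python 3; "www.yourdomain.com" and the post URL are placeholders.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.yourdomain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot should reach the homepage and posts, but not /search pages.
print(parser.can_fetch("Googlebot", "https://www.yourdomain.com/"))                      # expect True
print(parser.can_fetch("Googlebot", "https://www.yourdomain.com/2025/06/my-post.html"))  # expect True
print(parser.can_fetch("Googlebot", "https://www.yourdomain.com/search/label/news"))     # expect False
```

If the last line prints True, the /search rule didn't take and the rules were probably pasted incorrectly.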
Step 2: Check These Critical Settings
- Privacy: Settings → Privacy → "Visible to search engines" = ON (older dashboards: Settings → Basic → "Let search engines find your site")
- Meta Tags: Settings → Crawlers and indexing → "Enable custom robots header tags" = OFF (the quick check after this list confirms nothing is sending a noindex signal)
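To confirm these settings took effect, you can fetch your homepage and look for a noindex signal in the response. This is a rough sketch assuming Python 3; the URL is a placeholder and the meta-tag check is a simple regex, not a full HTML parse:

```python
# Rough check that your homepage isn't sending a "noindex" signal after
# changing the settings above. Assumes Python 3; the URL is a placeholder.
import re
import urllib.request

url = "https://www.yourdomain.com/"
with urllib.request.urlopen(url) as resp:
    html = resp.read().decode("utf-8", errors="replace")
    x_robots = resp.headers.get("X-Robots-Tag", "")

# Look for "noindex" in the X-Robots-Tag header or in a robots meta tag
# (string/regex check only, covers either attribute order).
header_noindex = "noindex" in x_robots.lower()
meta_noindex = re.search(
    r"<meta[^>]*robots[^>]*noindex[^>]*>|<meta[^>]*noindex[^>]*robots[^>]*>",
    html, re.IGNORECASE)

if header_noindex:
    print(f"Warning: X-Robots-Tag header is {x_robots!r}")
if meta_noindex:
    print(f"Warning: robots meta tag found: {meta_noindex.group(0)}")
if not header_noindex and not meta_noindex:
    print("No obvious noindex signals on the homepage")
```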
Step 3: Verify It Worked
- Visit https://www.yourdomain.com/robots.txt in your browser; it should show your new rules (the script after this list fetches and checks it for you)
- In Google Search Console:
  - Use the URL Inspection Tool
  - Test your homepage URL
  - Look for green checkmarks (✅)
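If you'd rather script the first check, here's a small sketch (Python 3, placeholder domain) that downloads the live file and looks for the rules from Step 1:

```python
# Fetch the live robots.txt and confirm it contains the rules from Step 1.
# Assumes Python 3; www.yourdomain.com is a placeholder for your real domain.
import urllib.request

url = "https://www.yourdomain.com/robots.txt"
with urllib.request.urlopen(url) as resp:
    status = resp.status
    body = resp.read().decode("utf-8", errors="replace")

print(f"HTTP {status} for {url}\n{body}")

expected = ["User-agent: *", "Disallow: /search", "Allow: /"]
missing = [line for line in expected if line not in body]
print("Missing lines:", ", ".join(missing) if missing else "none")
```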
🚫 Common Mistakes
- Blocking CSS/JS: Never add rules like Disallow: /*.css; Google needs these files to render your pages
- Wrong sitemap: Submit sitemap.xml, not the feeds/posts feed URL
- Over-blocking: Avoid a bare Disallow: / (it blocks your entire site; the check below catches this)
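To catch these mistakes automatically, you can test a few representative URLs against your live robots.txt with urllib.robotparser. This is a sketch with placeholder URLs; substitute real post and label URLs from your blog:

```python
# Test representative URLs against your live robots.txt to catch the
# mistakes above. Assumes Python 3; domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

domain = "https://www.yourdomain.com"
parser = RobotFileParser(domain + "/robots.txt")
parser.read()  # downloads and parses the live file

checks = {
    domain + "/": True,                          # homepage must be crawlable
    domain + "/2025/06/sample-post.html": True,  # posts must be crawlable
    domain + "/search/label/news": False,        # search pages should stay blocked
}

for url, should_allow in checks.items():
    allowed = parser.can_fetch("Googlebot", url)
    status = "OK" if allowed == should_allow else "PROBLEM"
    print(f"{status}: {url} -> {'allowed' if allowed else 'blocked'}")
```

A PROBLEM line for the homepage or a post URL means Googlebot is being blocked somewhere it shouldn't be.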
✅ Your 5-Point Checklist
- Updated robots.txt with proper Allow/Disallow rules
- Enabled search engine visibility
- Disabled custom robots header tags
- Verified live robots.txt file
- Checked Search Console for errors
⏳ What to Expect
Google typically:
- Within 24 hours: Detects your changes
- 3-7 days: Fully reprocesses your site
For faster indexing, use GSC's "Request Indexing" feature.
💡 Need More Help?
If you're still seeing issues after 7 days, work through these checks (a quick script for the first two follows the list):
- Double-check your domain's DNS settings
- Ensure no redirects are interfering
- Contact Blogger Support
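For the DNS and redirect checks, a short script can save some guesswork. This sketch assumes Python 3 and a placeholder domain; it prints the addresses your domain resolves to and the final URL your homepage lands on after any redirects:

```python
# Quick DNS and redirect check for a stubborn case.
# Assumes Python 3; www.yourdomain.com is a placeholder for your real domain.
import socket
import urllib.request

host = "www.yourdomain.com"
url = f"https://{host}/"

# 1. Does the domain resolve at all?
addresses = {info[4][0] for info in socket.getaddrinfo(host, 443)}
print(f"{host} resolves to: {', '.join(sorted(addresses))}")

# 2. Where does the homepage end up after redirects?
with urllib.request.urlopen(url) as resp:
    print(f"Requested:  {url}")
    print(f"Final URL:  {resp.geturl()}")
    print(f"Status:     {resp.status}")
```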