Blogger site question

What should I do about this message for my site in GSC: “Googlebot blocked by robots.txt”? Any step-by-step guide would help. Thanks!

1st things 1st

Check for and remove a noindex tag
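
For reference, the directive to look for in the page’s HTML head is the snippet below; if it’s present (or switched on through a Blogger setting), Google will keep the page out of the index even when crawling is allowed:

```html
<!-- If this tag is present, remove it so Google can index the page -->
<meta name="robots" content="noindex">
```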

Read these guides to fix your issue:

Here’s a step-by-step guide to fix this:

1. Confirm the Block:

There are two ways to confirm a page is blocked by robots.txt:

  • Using GSC (if your site is verified):
    1. Go to the URL Inspection tool in GSC.
    2. Enter the URL of the page showing the message in GSC.
    3. Check the “Page indexing” section. If it says “Blocked by robots.txt,” then the page is blocked.
  • Using a Robots.txt validator (if your site isn’t verified):
    1. Search for “robots.txt validator” online. There are many free options available.
    2. Enter the URL of the problematic page in the validator.
    3. If the validator reports that the URL is blocked for Googlebot, you need to fix the robots.txt file (a quick local check is also sketched after this list).
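
If you prefer to check locally instead of using a web validator, here is a minimal sketch using Python’s built-in urllib.robotparser. The blogspot URLs are placeholders; substitute your own site and the page GSC is flagging.

```python
from urllib.robotparser import RobotFileParser

# Placeholder URLs: replace with your site's robots.txt and the flagged page.
robots_url = "https://example.blogspot.com/robots.txt"
page_url = "https://example.blogspot.com/2024/05/some-post.html"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetches and parses the live robots.txt

if parser.can_fetch("Googlebot", page_url):
    print("Googlebot is allowed to crawl this URL.")
else:
    print("Googlebot is blocked by robots.txt for this URL.")
```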

2. Fix the robots.txt file:

  • You’ll need to edit the robots.txt file. On a self-hosted site, that means using your host’s file manager or an FTP client; on Blogger, go to Settings → Crawlers and indexing → Custom robots.txt instead.
  • Locate the line(s) that begin with “Disallow:”. These lines tell the crawlers named in that user-agent group which directories or URLs they must not access.
  • If you find a “Disallow:” line that restricts access to the problematic URL, remove it or modify it to allow access (see the example after this list).
  • Be cautious: Incorrect edits to robots.txt can block important parts of your site. If you’re unsure about editing the file yourself, consider seeking help from a developer.
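
For illustration only, the rule below uses placeholder paths (though Disallow: /search happens to be a common Blogger default). The comments show two ways to unblock the affected pages:

```txt
# Example only: a rule like this blocks everything under /search for all crawlers
User-agent: *
Disallow: /search
Allow: /

# To unblock the affected pages, either delete the Disallow line above,
# or add a longer, more specific Allow rule such as:
# Allow: /search/label/
```

Google applies the most specific (longest) matching rule, so a longer Allow path overrides a shorter Disallow for the URLs it covers.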

3. Test and Submit (Optional):

  • Once you’ve edited the robots.txt file, you can use the robots.txt report in GSC (the successor to the old robots.txt Tester) to confirm Google sees the updated rules.
  • After you’re confident the changes are correct, submit the URL for indexing again using the URL Inspection tool in GSC. This will prompt Googlebot to revisit the page and potentially index it.
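
Before resubmitting, it’s worth confirming that the live site actually serves your edited file. A small sketch, assuming Python 3 (the URL is a placeholder):

```python
from urllib.request import urlopen

# Placeholder URL: replace with your own domain.
with urlopen("https://example.blogspot.com/robots.txt") as resp:
    print(resp.read().decode("utf-8"))
```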