How to Test your Robots.txt File with Robots.txt tester in Blogger

What Is the Robots.txt Testing Tool?
The robots.txt tester gives you detailed information about whether your current robots.txt file is blocking Google's search crawlers from accessing specific URLs on your site. Put simply, you can use this tool to check whether Googlebot is allowed to crawl the URL of a page that you wish to block from Google search.
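You can reproduce the same check locally with Python's standard-library robots.txt parser. This is only a sketch of the allow/block logic the tester applies; the rules and the blog URL below are placeholder assumptions, not your actual file.

```python
# Minimal local sketch of a robots.txt allow/block check.
# The rules and URLs here are illustrative assumptions.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard "*" group in these rules.
print(parser.can_fetch("Googlebot", "https://example.blogspot.com/search?q=test"))       # blocked -> False
print(parser.can_fetch("Googlebot", "https://example.blogspot.com/2024/01/post.html"))   # allowed -> True
```

`can_fetch` answers the same question as the tester's Test button: given a user agent and a URL, do the rules permit the crawl?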

How to Test Your Robots.txt File with the Robots.txt Testing Tool:
First, log in to Google Webmaster Tools, open the robots.txt Tester, and select the verified property you would like to test from the list.

You will now see your current robots.txt file. You can test different URLs to see whether Google's crawlers are disallowed from crawling them. Type a URL into the text box at the bottom of the page and press the Test button.




The Test button will then change to either "ALLOWED" or "BLOCKED", depending on whether the URL you entered is blocked to Google's crawlers.



Make changes to the robots.txt file as needed and retest until you are satisfied. Once you have finished customizing and writing your new rules, copy the whole file and paste it into the robots.txt file hosted on your site.
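For reference, a typical Blogger robots.txt looks something like the following; the blog address and sitemap URL are placeholders, so substitute your own:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```

The `Disallow: /search` rule keeps label and search-result pages out of the index, while `Allow: /` leaves posts and pages crawlable.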

Note: This tool does not change your live robots.txt file, so you have to upload the updated file yourself. The tool only tests against the copy hosted inside the tool.
