Robots.txt Checker

Robots.txt files (often erroneously called robot.txt, in the singular) are created by webmasters to mark (disallow) the files and directories of a web site that search engine spiders (and other robots) should not access.
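
For readers new to the format, a minimal robots.txt might look like the following; the domain and paths are purely illustrative:

    # All robots: stay out of these directories
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

    # Additional rule for Google's crawler only
    User-agent: Googlebot
    Disallow: /no-google/

    # Tell crawlers where the sitemap lives
    Sitemap: http://www.example.com/sitemap.xml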

This robots.txt checker is a "validator" that analyzes the syntax of a robots.txt file to verify that its format conforms to the Robots Exclusion Standard (please read the documentation and the tutorial to learn the basics) and to report any errors it contains.
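
To make "contains errors" concrete, here is a hypothetical file with the kinds of mistakes a robots.txt validator typically flags; the annotations are illustrative, and the actual messages produced by this tool may be worded differently:

    Disallow: /tmp/           # error: rule appears before any User-agent line
    User-agent: googlebot
    Dissallow: /private/      # error: "Dissallow" is a mistyped field name
    Allow /images/            # error: the colon after the field name is missing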

- Simple to use: How do you check your robots.txt file format? Just enter the full URL of the robots.txt file you want to analyze (Example: www.yourdomain.com/robots.txt) and hit Enter.
- Powerful: The checker finds syntax errors, "logic" errors, and mistyped words, and it gives you useful optimization tips.
- Accurate: The validation process takes into account both the Robots Exclusion Standard rules and spider-specific (Google, Inktomi, etc.) extensions, including the "Sitemap" directive, as sketched in the example below.
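
As a side note for developers: independently of this web tool, a robots.txt file can also be queried programmatically. The sketch below uses Python's standard-library urllib.robotparser module against a hypothetical example.com domain; note that this parser answers "may this robot fetch this URL?" and lists Sitemap entries, but it does not report syntax errors, so it complements rather than replaces a validator like this one.

    from urllib.robotparser import RobotFileParser

    # Point the parser at the (hypothetical) robots.txt and download it
    rp = RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()

    # Ask whether specific user-agents may fetch specific URLs
    print(rp.can_fetch("Googlebot", "http://www.example.com/no-google/page.html"))
    print(rp.can_fetch("*", "http://www.example.com/index.html"))

    # List the Sitemap URLs declared in the file (Python 3.8+); None if there are none
    print(rp.site_maps())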

This robots.txt analyzer is provided by Motoricerca, a non-profit Italian guide to web site optimization and search engine positioning.

Robots.txt full URL (Example: www.domain.com/robots.txt):


If you find the analysis from this robots.txt validator useful, please consider linking to this page from your web site.
