Robots.txt is a config format created in 1994 by Martijn Koster.
#2320 on PLDB | 30 Years Old
A robots.txt file tells search engine crawlers which URLs they may access on a site. For example:
```
User-agent: googlebot        # all Google services
Disallow: /private/          # disallow this directory

User-agent: googlebot-news   # only the news service
Disallow: /                  # disallow everything

User-agent: *                # any robot
Disallow: /something/        # disallow this directory
```
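As a minimal sketch of how these rules are evaluated, Python's standard-library `urllib.robotparser` can parse the example above and answer per-agent access checks. The URLs below are hypothetical, and note that this parser matches user-agent names by simple substring, a cruder rule than the most-specific-group matching major search engines use:

```python
import urllib.robotparser

# The example rules from above; the parser strips "#" comments itself.
rules = """\
User-agent: googlebot        # all Google services
Disallow: /private/          # disallow this directory

User-agent: googlebot-news   # only the news service
Disallow: /                  # disallow everything

User-agent: *                # any robot
Disallow: /something/        # disallow this directory
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# googlebot is barred only from /private/
print(rp.can_fetch("googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("googlebot", "https://example.com/index.html"))         # True

# any other robot falls through to the "*" group
print(rp.can_fetch("SomeBot", "https://example.com/something/x.html"))     # False
print(rp.can_fetch("SomeBot", "https://example.com/index.html"))           # True
```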
| Feature | Supported | Example | Token |
|---|---|---|---|
| Comments | ✓ | `# A comment` | |
| Line Comments | ✓ | `# A comment` | `#` |
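To illustrate the `#` token in the table: a parser discards everything from the first `#` to the end of the line before interpreting a directive. A hypothetical helper showing that step:

```python
def strip_comment(line: str) -> str:
    """Drop everything from the first '#' onward, then trim whitespace."""
    return line.split("#", 1)[0].strip()

assert strip_comment("Disallow: /private/  # disallow this directory") == "Disallow: /private/"
```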