Why does Google use “Allow” in robots.txt, when the standard seems to be “Disallow?”
4th February 2012
My answer to Why does Google use “Allow” in robots.txt, when the standard seems to be “Disallow?” on Quora
The Disallow directive tells search engines not to crawl the paths you specify. The Allow directive (a Google extension to the original standard) explicitly permits crawling, which is mainly useful for re-opening a path inside an otherwise disallowed directory.
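As an illustration (the paths here are made up), a robots.txt like this blocks a whole directory while still letting crawlers reach one page inside it:

```
User-agent: *
Disallow: /private/
Allow: /private/public-page.html
```

Without the Allow line, everything under /private/ would be off limits.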
If you’re using the Google Webmaster tools, you probably want Google to crawl your site.
Am I misunderstanding your question?