Why does Google use “Allow” in robots.txt, when the standard seems to be “Disallow?”
4th February 2012
My answer to Why does Google use “Allow” in robots.txt, when the standard seems to be “Disallow?” on Quora
The Disallow directive tells search engine crawlers not to fetch the paths you specify.
The Allow directive explicitly permits crawling, which is mainly useful for carving out exceptions within an otherwise Disallowed section.
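As a minimal sketch (the paths here are made up for illustration), an Allow rule can punch a hole in a broader Disallow rule:

```
User-agent: *
Disallow: /private/
Allow: /private/public-report.html
```

With this file, crawlers that honour Allow (Google's does) will skip everything under /private/ except that one page.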
If you’re using Google Webmaster Tools, you presumably want Google to crawl your site.
Am I misunderstanding your question?