Why does Google use “Allow” in robots.txt, when the standard seems to be “Disallow?”
4th February 2012
My answer to Why does Google use “Allow” in robots.txt, when the standard seems to be “Disallow?” on Quora
The Disallow directive tells search engines not to crawl your site (or the specific paths you list).
The Allow directive explicitly permits them to crawl.
If you’re using the Google Webmaster tools, you probably want Google to crawl your site.
Am I misunderstanding your question?
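For illustration, here's a minimal robots.txt sketch (paths are hypothetical) showing how Allow is typically used to carve out an exception within a Disallowed directory:

```
User-agent: *
Disallow: /private/
Allow: /private/public-report.html
```

Google's crawler treats the more specific Allow rule as overriding the broader Disallow, which is why the directive is useful even though the original standard only defined Disallow.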