Except that it is written:
Sometimes this happens when the host notices overactive crawling from particular bots and blocks them. This is always something that a site owner who uses shared hosting should watch out for....
How is it the site owner's problem when a search engine's bot is "overactive"? If the problem is not censorship, it is a poor crawling algorithm that keeps hitting the same root page while not being "overactive" on the non-root pages. That is exactly what happens if the bot visits the root, follows the first link, goes back to the root, follows the second link, goes back to the root, and so on. Every non-root page gets visited once, but the root page gets as many visits as there are links from it, and eventually the bot gets blocked from the root page for "overactivity". Exactly whose fault is that? A smart bot (sketched below) would visit the home page once, collect all the links, follow each of them once, remember where it has already been, and never follow a link back to the home page. I blame the search bot! It happens with Microsoft too. No root page. :-D
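To make the distinction concrete, here is a minimal sketch in Python of the "smart bot" described above. It is hypothetical, not any real search engine's crawler: it keeps a visited set so every page, including the root, is fetched at most once, and any link pointing back to an already-seen page is skipped. The start URL and the page limit are placeholder assumptions.

```python
# Minimal sketch of a crawler that never re-fetches a page it has seen.
# Standard library only; the start URL below is a placeholder, not a real site.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    visited = set()             # URLs already fetched -- never fetched twice
    queue = deque([start_url])  # breadth-first frontier of URLs to try
    while queue and len(visited) < max_pages:
        url, _ = urldefrag(queue.popleft())  # drop #fragments so they don't look new
        if url in visited:
            continue            # a link back to the root (or any seen page) is skipped
        visited.add(url)
        try:
            with urlopen(url) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue            # unreachable page: move on
        parser = LinkCollector()
        parser.feed(html)
        for href in parser.links:
            queue.append(urljoin(url, href))  # resolve relative links against this page
    return visited

if __name__ == "__main__":
    pages = crawl("http://example.com/")  # placeholder start URL
    print(f"Fetched {len(pages)} page(s), each exactly once")
```

With the naive back-to-the-root strategy, the root would be fetched once per outgoing link; with the visited set, it is fetched exactly once no matter how many pages link to it, so it never trips an "overactivity" block.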
~~~~ Yep! This is my message signature. Not feeling very creative today.