Today, while I was writing my Google Search Guide, I was playing with the “..” Google operator, and after a few search requests I bumped into this 403 page:

google hates doughnuts.

We’re sorry…

… but your query looks similar to automated requests from a computer virus or spyware application. To protect our users, we can’t process your request right now.

We’ll restore your access as quickly as possible, so try again soon. In the meantime, if you suspect that your computer or network has been infected, you might want to run a virus checker or spyware remover to make sure that your systems are free of viruses and other spurious software.

We apologize for the inconvenience, and hope we’ll see you again on Google.

To continue searching, please type the characters you see below:

Whoa, maybe I used a really strange search string…

Nope: my search string was “1..10 doughnuts”.

Ok, others have reported similar behavior, but they were doing somewhat more advanced (and suspicious) searches… what the hell is suspicious about a doughnut search? Anyway, after some digging I found something more…

It seems there are two levels of filtering: one with a captcha (as in the first picture) and one without, like the following (the search was inurl:.edu “1..10 doughnut OR doughnuts”):
google hates doughnuts

This one refuses to serve your request at all.

I’m ok with the captcha: it’s a bit annoying, but if you’re human you can work it out. But what about the non-captcha page? A search engine that prevents you from using its own search features? That’s so lame!

It also seems that some search strings are flagged as spammy outright (like inurl:.edu “1..10 doughnut OR doughnuts”), while others are flagged only when lots of requests come from the same IP. (Try searching “1..10 doughnuts” a dozen times in a row.)
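For the curious, here is a minimal Python sketch of how a query using the numeric-range operator gets encoded into a search URL. The exact detection thresholds are Google’s secret, so the note about pacing requests is just an assumption based on the behavior above:

```python
from urllib.parse import urlencode

def google_search_url(query: str) -> str:
    """Build a Google web-search URL for the given query string."""
    return "https://www.google.com/search?" + urlencode({"q": query})

# The “..” numeric-range operator matches any number between 1 and 10;
# it travels in the URL as-is, since dots need no percent-encoding.
url = google_search_url("1..10 doughnuts")
print(url)  # https://www.google.com/search?q=1..10+doughnuts

# Firing this URL a dozen times in a tight loop from one IP is
# (apparently) what trips the filter; a polite client would sleep
# a few seconds between requests.
```

Nothing in the query itself looks automated, which makes the “computer virus or spyware” accusation all the funnier.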

I understand Google’s need to protect itself from spammers, but what about our need for doughnuts?

francesco mapelli