On 2010-06-28 20:01, Hans-Christoph Steiner wrote:
> I'm sure the robots are also interested in not constantly pulling 4 GB files down; I imagine they try to avoid that.
but usually they don't really check.
> I'm fine with a text-only robots.txt, but it seems not worth the effort, and I wouldn't know how to do it. I do know that I have many gigs of files on my own website, and robots are constantly hitting it. My site is hosted on my home internet connection, and I have never noticed a problem with robots.
that might be the clue: on your home internet connection you might never see enough crawler traffic to amount to a (D)DoS.
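fwiw, keeping compliant crawlers away from the big downloads is only a couple of lines in robots.txt; a minimal sketch, assuming the large files sit under a /files/ directory (the path is just an example):

  # served at the site root as /robots.txt
  # keep crawlers out of the directory with the large binaries
  User-agent: *
  Disallow: /files/

well-behaved robots fetch /robots.txt first and skip anything under a Disallow'd prefix; a misbehaving one will simply ignore it, which is why this doesn't help against an actual (D)DoS.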
mfgasdf
IOhannes