I'd like to change the robots.txt for puredata.info so that it allows everything, and search engines can index it. Are we going by lazy consensus on this?
.hc
On Jun 11, 2010, at 4:32 PM, Hans-Christoph Steiner wrote:
I just found one reason why we don't get a lot of traffic to puredata.info: it seems that the robots.txt is set to disallow all robots, meaning no search engines are indexing the site. I think it would be very valuable if the content on puredata.info were in the search engines.
http://puredata.info/robots.txt
http://validator.w3.org/checklink?uri=http%3A%2F%2Fat.or.at%2Fhans%2Fpd%2Fin...
.hc
Looking at things from a more basic level, you can come up with a more direct solution... It may sound small in theory, but in practice, it can change entire economies. - Amy Smith
----------------------------------------------------------------------------
Using ReBirth is like trying to play an 808 with a long stick. - David Zicarelli
On 2010-06-15 20:33, Hans-Christoph Steiner wrote:
I'd like to change the robots.txt for puredata.info so that it allows everything, and search engines can index it. Are we going by lazy consensus on this?
i would prefer it if robots did not constantly pull 4GB of data. this is mainly relevant for media data (patches, movies, presentations, ...), which is probably not so relevant for search bots either.
so i would consent to a robots.txt that does not allow everything for everyone, but rather restricts bots to text pages.
fmasdr IOhannes
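For illustration: one way to restrict crawlers to the text pages is to block everything by default and then open up the HTML-only sections. The directory names below are just placeholders for wherever the text/HTML content actually lives, and "Allow" is an extension honored by the major crawlers such as Google and Bing rather than part of the original 1994 robots.txt convention:

    # keep crawlers out of everything by default...
    User-agent: *
    Disallow: /

    # ...but open up the text/HTML sections; for Google-style parsers
    # the longest matching rule wins, so these Allow lines override
    # the blanket Disallow above
    Allow: /docs/
    Allow: /dev/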
On Jun 28, 2010, at 4:38 AM, IOhannes m zmoelnig wrote:
On 2010-06-15 20:33, Hans-Christoph Steiner wrote:
I'd like to change the robots.txt for puredata.info so that it allows everything, and search engines can index it. Are we going by lazy consensus on this?
i would prefer it if robots did not constantly pull 4GB of data. this is mainly relevant for media data (patches, movies, presentations, ...), which is probably not so relevant for search bots either.
so i would consent to a robots.txt that does not allow everything for everyone, but rather restricts bots to text pages.
fmasdr IOhannes
I'm sure the robots are also interested in not constantly pulling 4GB files down; I imagine they try to avoid that.
I'm fine with a text-only robots.txt, but it seems not worth the effort, and I wouldn't know how to do it. I do know that I have many gigs of files on my own website, and robots are constantly hitting it. My site is hosted on my home internet connection, and I have never noticed a problem with robots.
.hc
----------------------------------------------------------------------------
I spent 33 years and four months in active military service and during that period I spent most of my time as a high class muscle man for Big Business, for Wall Street and the bankers. - General Smedley Butler
On 2010-06-28 20:01, Hans-Christoph Steiner wrote:
I'm sure the robots are also interested in not constantly pulling 4GB files down; I imagine they try to avoid that.
but usually they don't really check.
I'm fine with a text-only robots.txt, but it seems not worth the effort, and I wouldn't know how to do it. I do know that I have many gigs of files on my own website, and robots are constantly hitting it. My site is hosted on my home internet connection, and I have never noticed a problem with robots.
that might be the clue: on your home internet connection you might never get into the range of a (D)DoS.
mfgasdf IOhannes
On Jun 29, 2010, at 11:26 AM, IOhannes m zmoelnig wrote:
On 2010-06-28 20:01, Hans-Christoph Steiner wrote:
I'm sure the robots are also interested in not constantly pulling 4GB files down; I imagine they try to avoid that.
but usually they don't really check.
A robot can easily drop the download once it's gotten a few megs. Are you sure that the bots are actually downloading the whole file? I doubt they do.
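For what it's worth, a crawler can also learn how big a file is before downloading anything, by sending an HTTP HEAD request and reading the Content-Length header. A minimal sketch in Python, with a placeholder URL:

    import urllib.request

    def remote_size(url):
        # HEAD asks the server for headers only; no file body is transferred
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            length = resp.headers.get("Content-Length")
            return int(length) if length is not None else None

    # e.g. skip anything bigger than ~10 MB instead of fetching it
    # if (remote_size("http://example.org/big/file.mov") or 0) > 10 * 1024 * 1024:
    #     pass  # don't download

Whether a given search bot actually does this is another question, but nothing forces it to pull the whole 4GB.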
I'm fine with a text-only robots.txt, but it seems not worth the effort, and I wouldn't know how to do it. I do know that I have many gigs of files on my own website, and robots are constantly hitting it. My site is hosted on my home internet connection, and I have never noticed a problem with robots.
that might be the clue: on your home internet connection you might never get into the range of a (D)DoS.
Um, why not? There is no magic to home internet connections that protects them from DDoS.
.hc
----------------------------------------------------------------------------
'You people have such restrictive dress for women,' she said, hobbling away in three-inch heels and panty hose to finish out another pink-collar temp pool day. - "Hijab Scene #2", by Mohja Kahf
On 2010-07-01 03:42, Hans-Christoph Steiner wrote:
Um, why not? There is no magic to home internet connections that protects them from DDoS.
there is, and it's called "speed".
the term "DoS" might be a bit harsh here (as crawlers don't really intend to attack your host), but the effect is the same.
fgamsd IOhannes
On Jul 1, 2010, at 4:21 AM, IOhannes m zmoelnig wrote:
On 2010-07-01 03:42, Hans-Christoph Steiner wrote:
Um, why not? There is no magic to home internet connections that protects them from DDoS.
there is, and it's called "speed".
the term "DoS" might be a bit harsh here (as crawlers don't really intend to attack your host), but the effect is the same.
fgamsd IOhannes
Oh, you're saying my home cable modem is faster than IEM's internet connection?
.hc
----------------------------------------------------------------------------
¡El pueblo unido jamás será vencido!
On Jul 1, 2010, at 3:23 PM, Hans-Christoph Steiner wrote:
On Jul 1, 2010, at 4:21 AM, IOhannes m zmoelnig wrote:
On 2010-07-01 03:42, Hans-Christoph Steiner wrote:
Um, why not? There is no magic to home internet connections that protects them from DDoS.
there is, and it's called "speed".
the term "DoS" might be a bit harsh here (as crawlers don't really intend to attack your host), but the effect is the same.
fgamsd IOhannes
Oh, you're saying my home cable modem is faster than IEM's internet connection?
What about specifying the large files in the robots.txt file and letting the rest be indexed? It seems to me that /docs, /dev, and /exhibition are all just text/HTML and maybe some images, so they could be allowed. Or maybe just turn off indexing on /Members and allow it everywhere else.
.hc
----------------------------------------------------------------------------
Terrorism is not an enemy. It cannot be defeated. It's a tactic. It's about as sensible to say we declare war on night attacks and expect we're going to win that war. We're not going to win the war on terrorism. - retired U.S. Army general, William Odom
On 2010-07-01 21:32, Hans-Christoph Steiner wrote:
Oh, you're saying my home cable modem is faster than IEM's internet connection?
am i?
how about: the speed of your home cable modem prevents the server from being queried (and from serving requests) at a rate that overloads it?
What about specifying the large files in the robots.txt file and letting the rest be indexed? It seems to me that /docs, /dev, and /exhibition are all just text/HTML and maybe some images, so they could be allowed. Or maybe just turn off indexing on /Members and allow it everywhere else.
might work
fgasdr IOhannes
On Jul 1, 2010, at 4:34 PM, IOhannes m zmoelnig wrote:
On 2010-07-01 21:32, Hans-Christoph Steiner wrote:
Oh, you're saying my home cable modem is faster than IEM's internet connection?
am i?
how about: the speed of your home cable modem prevents the server from being queried (and from serving requests) at a rate that overloads it?
What about specifying the large files in the robots.txt file and letting the rest be indexed? It seems to me that /docs, /dev, and /exhibition are all just text/HTML and maybe some images, so they could be allowed. Or maybe just turn off indexing on /Members and allow it everywhere else.
might work
fgasdr IOhannes
How about this:
User-agent: *
Disallow: /Members/
.hc
----------------------------------------------------------------------------
"A cellphone to me is just an opportunity to be irritated wherever you are." - Linus Torvalds
On Jul 1, 2010, at 4:34 PM, IOhannes m zmoelnig wrote:
On 2010-07-01 21:32, Hans-Christoph Steiner wrote:
Oh, you're saying my home cable modem is faster than IEM's internet connection?
am i?
how about: the speed of your home cable modem prevents the server from being queried (and from serving requests) at a rate that overloads it?
What about specifying the large files in the robots.txt file and letting the rest be indexed? It seems to me that /docs, /dev, and /exhibition are all just text/HTML and maybe some images, so they could be allowed. Or maybe just turn off indexing on /Members and allow it everywhere else.
might work
fgasdr IOhannes
http://puredata.info/robots.txt
.hc
----------------------------------------------------------------------------
I have the audacity to believe that peoples everywhere can have three meals a day for their bodies, education and culture for their minds, and dignity, equality and freedom for their spirits. - Martin Luther King, Jr.