I am also new to this list, so I won't make many judgements about what is "off" here. My sense is that asking how to download tutorial files, i.e. learning materials for the main topic of the list, is not "off topic". But again, I am neither a list veteran nor a list admin. Just one more "Joe" around here.
Anyways, for the benefit of Jared and of others interested in downloading the examples/tutorial/workshop from CVS, I am attaching a script, which can surely be adjusted/improved in several ways. The script is just what worked for me. It tries to save some download time by not using a large -l parameter (directory depth), which has a multiplicative impact on time and space. It is a very rough first cut.
Only a couple of comments about the script. I ran this in FC5.
While researching the usage of wget (practically a PhD topic in itself), I found a couple of annoying bugs in the Fedora Core 5 version of wget. The semantics of the -nc parameter were buggily changed by RedHat: the program stops/"Abort"/bails out when given -nc and it finds an already downloaded file, instead of wget's intended semantics of skipping it and moving on. The -X (exclude) parameter also behaves strangely. So basically, in Fedora Core 5 the download cannot be resumed; it has to be restarted from scratch. Not much to complain about here, as Fedora is a beta-quality testbed, not a final product.
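To make the -nc point concrete, this is the behaviour I mean (the URLs here are just placeholders):

    # Documented -nc ("no-clobber") behaviour: print
    #   File 'a.txt' already there; not retrieving.
    # for anything already on disk, then carry on with the rest.
    wget -nc http://example.org/a.txt http://example.org/b.txt
    # The FC5 build instead aborted at the first file it found already
    # downloaded, so re-running the command could not resume a
    # partial mirror.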
The above is bad news: this download takes 5+ hours on a 500 kbps DSL line, because the total data transmitted is about 500MB (that's right: 1/2 GB). Almost all of that is the auto-generated HTML of the CVS web interface, which has to be downloaded just to follow the links. The script keeps only the desired .pd and some other (.png, .txt, etc.) files; the .html overhead files are not saved.
By any measure, this is a rather absurd way to download the examples, but given that IOhannes suggests it, then who am I to object ... It would be beneficial to have a simple tgz package that can be downloaded in about 40 seconds: the true size of the working files is about 5MB, roughly 1% of the traffic. Such a package could be updated once a month, say, and posted for easy download. I can prepare one if the bosses deem it appropriate.
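For the record, building such a snapshot from a local checkout would be a one-liner; a sketch, with the directory name made up for illustration:

    # Pack the ~5MB of working files into a dated tarball; at DSL
    # speeds that downloads in well under a minute, not 5+ hours.
    tar czf pd-workshop-$(date +%Y-%m).tgz pd-workshop-files/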
The short answer to your problem, Jared, is that wget honors the "robots.txt" file already mentioned in this thread. To get this download to work, you need to pass the -e robots=off parameter. The rest of the parameters are fine-tuning; look at the man page for details.
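For the curious, the heart of the attached script is essentially a single wget invocation along these lines (the URL and the accept list are placeholders from memory, not the exact ones, and, as the replies below make clear, you really should not run it against the project's server):

    # Recursively mirror the CVS web interface, ignoring robots.txt
    # (the rude part), with a capped depth and an accept list that
    # keeps only the working files.
    wget -r -l 8 -np -e robots=off \
         -A 'pd,png,txt' \
         http://pure-data.cvs.sourceforge.net/pure-data/
    # wget still has to fetch every auto-generated .html page to find
    # the links (that is where the ~500MB goes); those pages are
    # deleted after parsing because they do not match the -A list.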
All that said, hope that the script is of help to Jared and others.
Peter
From: "jared" microcosm11@msn.com To: "'IOhannes m zmoelnig'" zmoelnig@iem.at CC: pd-list@iem.at Subject: Re: [PD] PD Workshop files Date: Tue, 6 Mar 2007 09:17:02 -0000
Okay. You've successfully scared me away. I don't feel the need to justify myself or my confusion.
Someone obviously realizes that they are being rude if they feel the need to say 'I hope I do not sound too rude'.
Leaving the computer screen every once in a while is a good thing. It helps you communicate better with humans.
My apologies to those on the list for any inconvenience I may have caused by continuing this thread on the main list instead of the off-topic list.
Good luck.
-----Original Message----- From: IOhannes m zmoelnig [mailto:zmoelnig@iem.at] Sent: Tuesday, March 06, 2007 8:19 AM To: jared Cc: pd-list@iem.at Subject: Re: [PD] PD Workshop files
jared wrote:
Seems like it worked and you already have the files.
Have a look in your current working directory.
Okay, I checked my working directory. The only thing it seems to be downloading is the index page... not the files within. In my working directory, I only have an explorer link to the index page.
whoa!
your problem is _exactly_ described in the wget faq, of which i posted a link several days ago.
please see http://www.gnu.org/software/wget/faq.html#3.0
(to be honest, i anticipated this problem; that is why i directed you to the faq)
furthermore, if you have problems with software that is unrelated to pd (e.g. proper inline quotes in emails) and you really want to ask people here, then i suggest using pd-ot, a mailing list dedicated to off-topic stuff. (and yes, you should have been informed that such a list exists, either via the mail response when subscribing to pd-list or via the webinterface where you subscribed)
that said, i hope i do not sound too rude and wish you good luck (sic!).
mfg.ar IOhannes
Please could you post the exact command line given, and we can see any possible errors.
See attached tif file. Thanks guys! jared
Hallo, Pete Redest hat gesagt: // Pete Redest wrote:
Anyways, for the benefit of Jared and of others interested in downloading the examples/tutorial/workshop from CVS, I am attaching a script, which surely can be adjusted/improved in several ways. The script is just what worked for me.
Please never use this script!!!
It's bad practice, not polite at all and in some circles even considered extremely rude to ignore the instructions in "robots.txt"!
And it's not necessary at all: Instead of abusing wget as a CVS-tool, one should just get comfortable with a real CVS-utility.
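For anyone who has never touched CVS, an anonymous checkout is only two commands; a minimal sketch, where the server path and the module name are my assumptions based on the project's SourceForge hosting (check the project page for the real ones):

    # One-time anonymous login (just press Enter at the password prompt):
    cvs -d:pserver:anonymous@pure-data.cvs.sourceforge.net:/cvsroot/pure-data login
    # Check out a single module, e.g. the documentation tree:
    cvs -z3 -d:pserver:anonymous@pure-data.cvs.sourceforge.net:/cvsroot/pure-data checkout doc
    # From then on, an update transfers only the changes, no 500MB re-download:
    cd doc && cvs update -dP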
Frank Barknecht _ ______footils.org_ __goto10.org__
Totally agree with you. This is NOT the way to get stuff from a CVS repo. I only posted it because of the insistence I perceived on using wget for this. I mistakenly took some observations by IOhannes as suggesting this approach, which is why I created the script... IOhannes has posted a clarification in the meantime. I stand corrected.
Indeed, DON'T USE this script. It is a very awkward way to get the files, very inefficient, defeats the workings of CVS and is against good cyber-manners.
Best, -P
If someone took the trouble to write a robots.txt, do please observe it. A website with limited bandwidth can take a big hit from too many full spiders. In resonance with another active thread, I still take it as an old-fashioned duty of "hacker ethic" that we have to lead by example in an age where the law and business are no longer a reliable guide.
I'm happy to spend time with any beginner off-list to demonstrate CVS or wget; seriously, just drop me a line and we'll go through it. But it's a little too off-topic to be filling the list traffic with, and IOhannes and Frank have already given the essentials. Please read those FAQs again, guys.
hi
Pete Redest wrote:
but given that IOhannes suggests it, then who am I to object ...
sorry if this is the impression i give.
apart from that, i never suggested using wget, instead i suggested using CVS.
however, somebody had previously suggested to jared that he use wget, and he continued asking about _that_ way to get the files. so i answered the requests, in whatever manner.
(note that the original question was about downloading a directory structure NOT in CVS; it was only later that the same approach was used to get the CVS files exposed at the webinterface via wget)
All that said, hope that the script is of help to Jared and others.
if jared was on linux, that might have helped. to my knowledge, these have been questions from an xp user.
mfgasd.r IOhannes