As there aren't that many objects being released every day, it wouldn't be that much work.
There's probably a script that can be written to count the number of help patches that still need META subpatches. I have the feeling it's
something like 1500 or so.
I would guess something like that, yes. Hopefully more get done every day.
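FWIW, such a counting script isn't hard to sketch. A rough attempt in Python follows; the EXTERNALS_DIR path is hypothetical, and it assumes a saved META subpatch shows up in the .pd file as a line containing "pd META", which is how Pd normally serializes a subpatch named META:

#!/usr/bin/env python3
"""Count help patches that still lack a META subpatch.

A rough sketch, not a finished tool: it assumes a local svn checkout of
the externals tree (EXTERNALS_DIR is a hypothetical path) and that a
saved META subpatch appears in the .pd file as a line containing
"pd META".
"""
import os

EXTERNALS_DIR = "externals"  # hypothetical path to the svn checkout

missing = []
for root, dirs, files in os.walk(EXTERNALS_DIR):
    for name in files:
        if not name.endswith("-help.pd"):
            continue
        path = os.path.join(root, name)
        # read as latin-1 so stray non-ASCII bytes don't abort the scan
        with open(path, encoding="latin-1") as f:
            if "pd META" not in f.read():
                missing.append(path)

print(f"{len(missing)} help patches without a META subpatch")
for path in missing:
    print(path)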
I would have no problem doing this, as long as there is some guarantee that my time won't be wasted by the suggestions being left in a mailbox somewhere and the patches never committed.

That's ultimately decided by the author of the library. You have to contact each one and ask if you can add those changes. For some authors it's trivial. For others, like Miller, a separate location for the updated help had to be created, because otherwise the suggestions indeed just sit in a mailbox somewhere.
If a line saying something like "if you don't create documentation, one will be created for you" is added to the commit guidelines, everyone will agree to it automatically.
I guess someone who commits code wants it to be used. This will help him/her make sure that the code is seen by other users.
That would a) relieve the developers of the chore of writing good documentation (many don't really do it),

I think it'd be a better idea to assume that undocumented objects are either under current development or are crap; either way, they shouldn't be included in a release of Pd. In most cases a minimal example patch and a few sentences (or even single words) to describe the xlet/object function is all that is needed. If the developer couldn't be bothered to do that, what else couldn't he/she be bothered to do in the code that runs on your machine and can potentially crash your patch?
I also agree with that. That's why it's important to have a way of verifying that complete documentation exists for each library. I would even say that all externals which aren't completely documented (like mine; I haven't filled in the jmmmp-meta.pd patch yet) could be stored in an /undocumented folder that stays put in svn and doesn't build with the package.
The thing is, is there a way to do that check/purge automatically? The xxx-help patches can exist as files, but without opening them it's not possible to tell whether they're complete (or whether the externals work). If not, some checking method with enough intelligence would have to go through the libraries; I can only think of a user opening the patches and trying them out.
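At least the existence half of that check can be scripted. Here's a minimal sketch, assuming the usual layout where an external "foo" is built from foo.c (or is a foo.pd abstraction) and should have foo-help.pd next to it; judging completeness, or whether the external works, still takes a human:

#!/usr/bin/env python3
"""Flag externals that have no matching help patch at all.

A minimal sketch of the automatable half of the check. It assumes an
external "foo" in a library directory is built from foo.c (or shipped
as a foo.pd abstraction) and should have a foo-help.pd next to it.
"""
import os
import sys

libdir = sys.argv[1]  # e.g. externals/jmmmp

files = set(os.listdir(libdir))
for name in sorted(files):
    base, ext = os.path.splitext(name)
    if ext not in (".c", ".pd"):
        continue
    if base.endswith("-help") or base.endswith("-meta"):
        continue  # help and meta patches are not objects themselves
    if base + "-help.pd" not in files:
        print(f"{base}: no {base}-help.pd found")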
b) in the long run, maybe have unified documentation for the whole of pd-ext,

Without automated templates like matju's Gridflow documentation system, this is just too much work.
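A very modest version of the template idea doesn't need a whole documentation system, though. The sketch below stamps out a fill-in-the-blanks help patch; the canvas geometry and the META key names are assumptions about the pd-extended conventions, not a spec:

#!/usr/bin/env python3
"""Stamp out a fill-in-the-blanks help patch for one object.

A very modest take on the automated-template idea. The canvas geometry
and the META keys (NAME, DESCRIPTION, AUTHOR, LICENSE) are assumptions;
the point is only that a developer can start from a skeleton instead of
an empty patch.
"""
import sys

TEMPLATE = """#N canvas 0 50 450 300 10;
#X text 20 20 {name} - TODO: one-line description;
#X obj 20 60 {name};
#N canvas 10 10 400 200 META 0;
#X text 10 10 NAME {name};
#X text 10 30 DESCRIPTION TODO;
#X text 10 50 AUTHOR TODO;
#X text 10 70 LICENSE TODO;
#X restore 20 120 pd META;
"""

name = sys.argv[1]
# mode "x" refuses to clobber a help patch that already exists
with open(f"{name}-help.pd", "x") as f:
    f.write(TEMPLATE.format(name=name))
print(f"wrote skeleton {name}-help.pd")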
It is for the x000 objects that are already there. Once we go through those (with no rush), fewer will be missing. New objects don't come in that often.
By testing an external I meant opening the documentation, checking for content, feeding some input to the external, and seeing whether the output makes sense. Slowly, one can go through it library by library.
c) give a model for new (and old?) developers to prepare their material, if they want to,
See above.
d) solve one big problem with new users, the unstructured/unfriendly documentation, and e) world peace.

Unstructured docs are fine if they a) exist and b) are discoverable.

Then pd-ext is perfect the way it is now: everything exists, and is discoverable, if you look for it.
One tiny detail is that no one really knows which objects exist, because there is no central list (the most complete one is my own hand-made, copy-pasted xls file).
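That central list is the kind of thing a script could maintain instead of a hand-made xls file. A rough sketch, again assuming a local svn checkout under a hypothetical EXTERNALS_DIR and guessing object names from file names (so a single .c file that defines several objects shows up as one entry):

#!/usr/bin/env python3
"""Build a central list of objects from the externals tree.

A rough sketch to replace the hand-made xls file. Object names are
guessed from the file names of .c sources and .pd abstractions, which
is only a heuristic.
"""
import csv
import os

EXTERNALS_DIR = "externals"  # hypothetical path to the svn checkout

entries = set()
for root, dirs, files in os.walk(EXTERNALS_DIR):
    rel = os.path.relpath(root, EXTERNALS_DIR)
    if rel == ".":
        continue  # objects live in library subdirectories
    library = rel.split(os.sep)[0]
    for name in files:
        base, ext = os.path.splitext(name)
        if ext not in (".c", ".pd"):
            continue
        if base.endswith("-help") or base.endswith("-meta"):
            continue
        entries.add((library, base))

with open("object-list.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["library", "object"])
    writer.writerows(sorted(entries))

print(f"wrote {len(entries)} entries to object-list.csv")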
But to sum up, I would support a method where a library doesn't get committed unless it is documented (including pd-meta patches, a text file with a list of objects and their functions[?], and whatnot). That will take lots of externals out of the next build, but hopefully they'll come back soon enough.
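For what the "doesn't get committed unless documented" part could look like in practice, here is a hedged sketch of an svn pre-commit hook, not a drop-in implementation; it uses svnlook to inspect the pending transaction, and the details of the policy would obviously need to be agreed on first:

#!/usr/bin/env python3
"""svn pre-commit hook sketch: reject new externals without help patches.

A hedged sketch of how such a policy could be enforced. It assumes the
standard hook interface (argv is the repository path and a transaction
id) and the layout heuristic that a newly added foo.c needs a
foo-help.pd next to it in the resulting tree.
"""
import subprocess
import sys

repos, txn = sys.argv[1], sys.argv[2]

def svnlook(*args):
    return subprocess.run(
        ["svnlook", *args, "-t", txn, repos],
        capture_output=True, text=True, check=True,
    ).stdout

errors = []
for line in svnlook("changed").splitlines():
    parts = line.split(None, 1)
    if len(parts) != 2:
        continue
    action, path = parts
    if not action.startswith("A") or not path.endswith(".c"):
        continue
    helppath = path[:-2] + "-help.pd"
    # svnlook filesize exits nonzero if the file is absent from the
    # transaction tree, i.e. neither in this commit nor already there
    present = subprocess.run(
        ["svnlook", "filesize", "-t", txn, repos, helppath],
        capture_output=True,
    ).returncode == 0
    if not present:
        errors.append(f"{path}: no matching {helppath}")

if errors:
    sys.stderr.write("\n".join(errors) + "\n")
    sys.exit(1)  # nonzero exit rejects the commit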