Hey all,
memento has the ability to store sets of states, and if you just interpolate between these states you get a pretty cute sequencer. This is in pixelTANGO currently.
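Just to illustrate the idea (a toy sketch, not memento's actual API - [s param1] is a made-up receiver and the two states are hard-coded): a metro alternately ramps a single parameter between two stored values with [line], which is the core of the interpolating sequencer. Save this as a .pd file and open it:

#N canvas 0 0 450 350 10;
#X obj 30 30 loadbang;
#X obj 30 60 metro 1000;
#X obj 30 100 f;
#X obj 100 100 + 1;
#X obj 30 140 mod 2;
#X obj 30 180 select 0 1;
#X msg 30 220 0.2 900;
#X msg 120 220 0.8 900;
#X obj 30 260 line;
#X obj 30 300 s param1;
#X text 170 260 ramp between the two states over 900 ms;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 2 0 4 0;
#X connect 3 0 2 1;
#X connect 4 0 5 0;
#X connect 5 0 6 0;
#X connect 5 1 7 0;
#X connect 6 0 8 0;
#X connect 7 0 8 0;
#X connect 8 0 9 0;

Scale the same idea up to whole state sets and you get the sequencer.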
As for the API, this is a big question, and I suppose Frank (Mr Memento) probably has some ideas about how RRADICAL patches get contributed to, style consistency etc.
In pixelTANGO a "layer" is just a gem-chain, so the whole thing only needs to be connected via [separator], and you can add whatever objects you want. I spent a lot of time working on the plugin architecture for pixelTANGO, which could be very handy (for PiDiP/PDP abstractions, I imagine). It works as follows:
There is a special directory in the search path (fx/ in the case of pix effects). This directory gets listed and parsed, and an instance of each abstraction found there is created in Pd. Each abstraction contains special helper abstractions that let it communicate with the outside patch (two inlets and two outlets). On the GUI side a popup box shows the names of all the abstractions in the array, and by choosing one the input/output gets switched to that abstraction. Right now it is used for pix_ effects, but technically any object that operates on a gem-chain will work in there. I was going to add other modules for different classes of "plugins", from geos to ___?
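To give the flavour, here is a bare-bones sketch of such a plugin abstraction - not the actual pixelTANGO template, just the two-inlet/two-outlet convention it describes, with [pix_invert] standing in for the negative effect:

#N canvas 0 0 450 250 10;
#X obj 30 30 inlet;
#X obj 200 30 inlet;
#X obj 30 100 pix_invert;
#X obj 30 170 outlet;
#X obj 200 170 outlet;
#X text 120 130 left pair: gemlist in/out \; right pair: control messages;
#X connect 0 0 2 0;
#X connect 2 0 3 0;
#X connect 1 0 4 0;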
The bonus of this system is that you can make your own abstraction (based on the template of fx/negative.pd, for example), drop it into the fx/ folder, and the next time you open the patch the popup automatically contains a reference to your new abstraction. In theory it should be no problem to update the popup and abstractions at the touch of a special message, but the timing can be tricky.
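For the curious, the "special message" route can be done with Pd's unsupported-but-common dynamic patching messages. A sketch, assuming a host subpatch named [pd fxhost] and fx/ in the search path:

#N canvas 0 0 400 250 10;
#X msg 30 30 \; pd-fxhost obj 100 100 negative;
#X msg 30 70 \; pd-fxhost clear;
#N canvas 0 0 300 200 fxhost 0;
#X restore 30 110 pd fxhost;
#X text 30 150 click the top message to instantiate fx/negative.pd inside [pd fxhost] \, the second to empty it again;

The timing trickiness is real: an object created this way only exists once the message has been processed, so any connect messages have to be sent afterwards.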
Any other ideas about consistency/API/collaboration methods for this explosion of "vj" patches?
b>
Thoralf Schulze wrote:
Hi - erm, hard off :-)
> some kind of 'sequencer' would be nice for the video.
yeah, definitely. I think this can be done by changing the visibility of the individual render chains, with those changes triggered by metros (periodic bangs) - so this should be possible within the current framework. There will also be a cheap sampler - i.e. you can internally capture your output and feed it back to tvjt.
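roughly like this (an untested sketch, not tvjt's actual internals - it assumes a [gemwin] is already created and rendering, and chainA/chainB are made-up send names): the metro flips 1/0 to the left inlets of two [gemhead]s, which turns the chains on and off:

#N canvas 0 0 550 320 10;
#X obj 30 30 loadbang;
#X obj 30 60 metro 2000;
#X obj 30 100 f;
#X obj 100 100 + 1;
#X obj 30 140 mod 2;
#X obj 30 180 == 0;
#X obj 30 220 s chainA;
#X obj 120 180 s chainB;
#X obj 250 60 r chainA;
#X obj 250 100 gemhead;
#X obj 250 140 cube;
#X obj 380 60 r chainB;
#X obj 380 100 gemhead;
#X obj 380 140 square;
#X text 30 260 NOTE: assumes a gemwin is already created and rendering;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 2 0 4 0;
#X connect 3 0 2 1;
#X connect 4 0 5 0;
#X connect 4 0 7 0;
#X connect 5 0 6 0;
#X connect 8 0 9 0;
#X connect 9 0 10 0;
#X connect 11 0 12 0;
#X connect 12 0 13 0;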
> definitely include as much audio-interactivity as you can. ideally you want to be able to trigger video events by audio events such as drum beats... also by MIDI input... and then the best would be the option of using audio signals to generate video.
beat detection is already there, and so is processing of external MIDI signals. these events trigger bangs that in turn trigger envelope generators, and with the resulting values it is possible to adjust each and every aspect of a render chain. I'd love to add more audio-related stuff, but I'm not exactly an audio guy and don't have much of a clue in this respect ... I'm using bonk~ for the beat detection, which works great. There is some sort of treble detection as well, which is more or less reliable (basically it's a hip~ that triggers a bang if a certain amplitude threshold is exceeded). If you have any suggestions for more audio feature extraction, I'd be glad to implement them :-)
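in patch form, the treble trigger looks about like this (a sketch - the 5000 Hz cutoff and the 70 dB threshold are arbitrary numbers I picked, and bonk~ ships in Pd's extras): [env~] gives the amplitude of the high-passed signal in dB, and [change]/[select] make sure there's only one bang per threshold crossing:

#N canvas 0 0 450 350 10;
#X obj 30 30 adc~;
#X obj 30 70 hip~ 5000;
#X obj 30 110 env~;
#X obj 30 150 > 70;
#X obj 30 190 change;
#X obj 30 230 select 1;
#X obj 30 270 print treble;
#X obj 200 70 bonk~;
#X obj 200 110 t b;
#X obj 200 150 print beat;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 4 0 5 0;
#X connect 5 0 6 0;
#X connect 0 0 7 0;
#X connect 7 0 8 0;
#X connect 8 0 9 0;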
> also, it would be important for me to be able to feed the output of my own patches into the video stream.
there will be some sort of API, and I'll also document the internal message structure, so it should be relatively easy to add more filters etc. it would definitely be a good idea to add an input for external sources. that should be easy enough - it's merely a matter of importing a gem render chain. I actually talked to Ben Bogart about this; maybe we can settle on some sort of quasi-standard that would make it possible to share stuff across different gem applications.
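such an input module could be as simple as this (nothing is settled, so this is purely hypothetical - [pix_video] is just one possible source): it starts its own chain and hands the gemlist to the host patch, with a control inlet on the right, following the same two-in/two-out idea:

#N canvas 0 0 400 250 10;
#X obj 30 30 gemhead;
#X obj 200 30 inlet;
#X obj 30 100 pix_video;
#X obj 30 170 outlet;
#X text 150 130 right inlet: control messages (device \, dimen \, ...) for pix_video;
#X connect 0 0 2 0;
#X connect 1 0 2 0;
#X connect 2 0 3 0;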
thanks & with kind regards, thoralf.