On Tue, Feb 15, 2011 at 05:56:36PM -0500, Peter Kirn wrote:
> The notion is that the language side of things -- Java, C++, Objective-C, Python, whatever -- will have the logic that determines how events are scheduled and would handle user input that might alter the sequence of those events. The question is how best to have the *language* communicate with Pd.
> So, the structure would be: external logic > message to Pd > qlist/textfile scheduling inside Pd
> sound source in Pd > audio callback in the embedded instance
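[For concreteness, a rough sketch of the "embedded instance" end of that chain using libpd's plain C API; the patch name, channel counts, and the loop standing in for a real audio callback are all made up for illustration:]

    #include "z_libpd.h"

    /* Sketch only: embed Pd via libpd, open a patch, drive its clock
       from the host's audio callback. Error checking omitted. */
    int main(void) {
        libpd_init();
        libpd_init_audio(0, 2, 44100);          /* 0 in, 2 out, 44.1 kHz */

        libpd_start_message(1);                 /* equivalent to [; pd dsp 1( */
        libpd_add_float(1.0f);
        libpd_finish_message("pd", "dsp");

        void *patch = libpd_openfile("sequencer.pd", ".");  /* made-up patch */

        /* The host audio callback drives Pd's clock: 1 tick = 64 frames.
           Here a plain loop stands in for that callback. */
        float inbuf[64], outbuf[2 * 64];
        for (int i = 0; i < 1000; i++)
            libpd_process_float(1, inbuf, outbuf);

        libpd_closefile(patch);
        return 0;
    }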
As you probably know, in RjDj/libpd the usual way to communicate between the App and the Pd instance is to send messages between the two via some kind of socket (network or RPC or so). The main issue is one of synchronizing different "clocks": Pd has a clock inside, usually synchronized with the soundcard (which may be a virtual "dummy" card in libpd), but events from the App may not be synchronized to that clock; they follow some GUI loop, network polling mechanism etc. instead.
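[As an illustration of that message path: with libpd the host code can also hand messages to the Pd instance directly through the C API rather than over a socket. A minimal sketch, where the receiver names "tempo" and "from_app" are invented and would need matching [receive] objects in the patch:]

    /* Push events from the host into the running patch. */
    libpd_float("tempo", 120.0f);       /* single float to [r tempo] */

    libpd_start_message(3);             /* a 3-element list to [r from_app] */
    libpd_add_float(60.0f);             /* pitch    */
    libpd_add_float(100.0f);            /* velocity */
    libpd_add_float(500.0f);            /* duration in ms */
    libpd_finish_list("from_app");

[Note that plain libpd is not thread-safe, so such calls usually happen on the same thread that calls libpd_process_float() or are guarded by the host -- which ties directly into the clock question below.]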
A GUI event happens at a certain time on the GUI loop's clock, but at what time should it happen on Pd's "soundcard clock"? You have to find some mechanism to reliably compare the two clocks; once you have that, it's easy to translate times between both time scales.
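[A minimal sketch of such a translation, assuming you can record both clocks' readings at one common sync point. All names here are hypothetical, and a real implementation would refresh the sync point now and then to absorb drift:]

    /* Hypothetical mapping between the GUI clock and Pd's sample clock. */
    typedef struct {
        double gui_ms_at_sync;      /* GUI clock reading at the sync point         */
        long   pd_samples_at_sync;  /* samples Pd had processed at the same moment */
        double sample_rate;         /* e.g. 44100.0                                */
    } clock_map_t;

    /* Translate a GUI timestamp (ms) into Pd time (ms since DSP start). */
    double gui_to_pd_ms(const clock_map_t *m, double gui_ms) {
        double pd_ms_at_sync = 1000.0 * m->pd_samples_at_sync / m->sample_rate;
        return pd_ms_at_sync + (gui_ms - m->gui_ms_at_sync);
    }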
OSC with timetags would provide a way out; even if the OSC overhead seems to be too much (which I doubt), you still need some kind of timetag on your messages.
> We'd continue to use libpd, but rather than assuming the Pd patch contains the logic for how events are scheduled, that scheduling would be integrated with the logic and interface contained in the app code. (So, on iOS, for instance, in Objective-C or something.)
Actually I think Pd is great for scheduling (musical) events, so I'd not put that into the interface code. The Pd side is probably where the musicians and composers work, and they are the ones who need to deal with sequences, timing, rhythms etc.
I'd figure that most graphical interfaces don't need the tight sub-sample timing that Pd can provide; the eye is much less timing-sensitive than the ear.
It becomes slightly different when you deal with inputs, for example tapping in Tap-Tap-style games, where timing and latency become more important. Any MIDI-capable realtime audio software has to synchronize external MIDI events to its own, much faster clock, so these apps may be worth a look (or Pd's MIDI objects).
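[Continuing the hypothetical sketch from above, an input event could be handed to the patch as a delay relative to Pd's current time, leaving the actual scheduling to a [delay] (or [pipe]) inside the patch. The receiver name "tap-delay" is again invented:]

    /* Schedule a tap so it fires at the right point on Pd's clock.
       pd_now_ms is the current Pd time in ms; the patch routes
       "tap-delay" into [delay] in front of the voice that should sound. */
    void schedule_tap(const clock_map_t *m, double gui_event_ms, double pd_now_ms) {
        double target_ms = gui_to_pd_ms(m, gui_event_ms);
        double delay_ms  = target_ms - pd_now_ms;
        if (delay_ms < 0) delay_ms = 0;     /* already late: fire immediately */
        libpd_float("tap-delay", (float)delay_ms);
    }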
Frank Barknecht | Do You RjDj.me? | footils.org