If you use some kind of time-tagging, which some implementations of
OSC support, then you add a bit of latency to the whole system and
use that to mask the inter-machine latency. The basic idea is that
each message carries a time tag that marks when that message should take
effect. If you set that time tag 20ms in the future when you send
it, every machine should have received it within 20ms, and then they'll all
execute the message at the same time.
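As a rough illustration, here is a minimal sketch in Python using the
python-osc library (not something from the original thread) of sending an
OSC bundle whose timetag is 20ms in the future; the address /play, the IP,
and the port are placeholders, and it only works if the receivers honor
bundle timetags and the machines' clocks are synchronized:

    import time
    from pythonosc import osc_bundle_builder, osc_message_builder, udp_client

    # Placeholder address/port for one of the playback machines.
    client = udp_client.SimpleUDPClient("192.168.1.10", 9000)

    # Build a bundle whose timetag is 20ms in the future.
    bundle = osc_bundle_builder.OscBundleBuilder(time.time() + 0.020)

    msg = osc_message_builder.OscMessageBuilder(address="/play")
    msg.add_arg(1)
    bundle.add_content(msg.build())

    # Every machine that receives this bundle schedules the message for the
    # same wall-clock instant, assuming their system clocks agree (e.g. NTP).
    client.send(bundle.build())

In Pd you'd do the equivalent with whatever OSC objects you're using, as
long as the receiving side actually schedules on the timetag rather than
executing immediately.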
.hc
On Oct 8, 2009, at 6:27 AM, Brian FG Katz wrote:
Dear PD-ers,
We are working on an installation with 4 machines running PD to feed
157 loudspeakers. We are interested in reducing latency to a minimum
between channels, and especially between machines. The inter-channel latency
is less than a sample, so all is fine there. For inter-machine latency, we
arrive at differences on the order of 10 msec, close to our minimum audio-buffer length of 11 msec. Any smaller audio buffer and we get audio artifacts. We are using a word clock synchronizer (Nanosyncs HD; Rosendahl),
but I don't think that does latency synchronization. My question: is there another means to improve inter-machine latency other than reducing the audio buffer?
-Brian
Brian FG Katz, Ph.D
Audio & Acoustique
LIMSI-CNRS
BP 133
F91403 Orsay France
tel. (+33) 01 69 85 81 55
fax. (+33) 01 69 85 80 88
e-mail Brian.Katz@limsi.fr
web_theme: http://www.limsi.fr/Scientifique/aa/thmsonesp/
web_group: http://www.limsi.fr/Scientifique/aa/
Information wants to be free. -Stewart Brand