hi,
Does using [auto $1< to play a video file with pix_film take less CPU than a counter that increments with every render?
On which platform? On OSX the answer is often 'yes'; on linux and w32 it is 'no' (there [auto( is merely a shortcut for building your own counter).
However, I found that pix_film seems to decode a frame every time a render command arrives at its inlet, even if it is the same frame as at the previous render (bah, twisted explanation ...). I therefore made myself a little subpatch that decodes a frame only if it differs from the one decoded at the last render (using pix_buffer), which cut CPU usage roughly in half, depending on the frame rates.

Thinking about this again: could it be that uploading a texture to the GPU takes up some CPU cycles as well? pix_buffer prevents those transfers too, iirc.
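Just to illustrate the idea behind that subpatch (this is not GEM source and not pix_film's real API -- decode_frame(), upload_texture() and the frame rates below are made-up stand-ins): decode and upload only when the requested frame index actually changes between renders. With a 60 fps render chain and a 25 fps film, most renders hit the same frame and skip the expensive work.

#include <stdio.h>

/* hypothetical stand-ins for the expensive per-frame work */
static void decode_frame(int frame)   { printf("decode frame %d\n", frame); }
static void upload_texture(int frame) { printf("upload frame %d\n", frame); }

int main(void)
{
    const double render_fps = 60.0;  /* rate at which render commands arrive */
    const double film_fps   = 25.0;  /* frame rate of the video file */
    int last_frame = -1;             /* frame decoded at the previous render */

    /* one loop iteration per render command */
    for (int render = 0; render < 12; render++) {
        int frame = (int)(render * film_fps / render_fps);

        /* the whole point: skip decode + texture upload if the frame is unchanged */
        if (frame != last_frame) {
            decode_frame(frame);
            upload_texture(frame);
            last_frame = frame;
        }
    }
    return 0;
}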
Hope I'm making sense here,
thoralf.