On 08/16/2010 05:37 PM, Mathieu Bouchard wrote:
When decoding a movie involves frame skipping, the skipping consumes a lot more CPU than normal decoding does. I've seen this happen in MAX as well.
The slowness and high CPU consumption happen even with "auto 1": shouldn't frame skipping happen only for speeds greater than the original framerate (that is, only when you actually have to skip frames)?
Or, maybe, it would also be expected, if the implementation is not "overly smart", that it does frame skipping (even "skipping" from frame n to frame n itself) whenever "auto" mode is turned off.
Can you put it in slow motion (play at a lower fps than what's indicated by the file) and confirm that below a certain wanted fps, it starts to take radically less real time per logical time? That should be when it stops skipping frames.
I don't understand that very well: it should stop "skipping" frames as soon as the wanted fps is _equal_ to the file's fps; if you go much lower, I would guess it would actually not only stop "skipping" frames, but stop _decoding_ frames whenever it just repeats the same frame.
Indeed, if it started taking less real time per frame only when asked for a _very_ low fps, I would rather interpret that as meaning it is actually doing "frame skipping" every time it is asked for a new frame, whichever frame that is (even the very next one), except when it is the same frame it is already at.
...what am I missing?
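
For what it's worth, here is a minimal sketch of the case analysis I have in mind; the names and structure are hypothetical, not taken from the actual decoder code:

#include <cstdio>

// "wanted" is the frame index implied by the wanted fps and the elapsed
// logical time; "current" is the frame that was last decoded/shown.
enum Action { REPEAT_FRAME, DECODE_NEXT, SEEK_AND_DECODE };

Action advance(long current, long wanted)
{
    if (wanted == current)     return REPEAT_FRAME;    // cheap: no decoding at all
    if (wanted == current + 1) return DECODE_NEXT;     // normal sequential decoding
    return SEEK_AND_DECODE;                            // "frame skipping": expensive
}

int main()
{
    // With wanted fps <= file fps, "wanted" never jumps ahead of current+1,
    // so SEEK_AND_DECODE should never be chosen; if the decoder seeks anyway
    // (even from frame n to frame n), that would match the behaviour above.
    printf("%d %d %d\n", advance(10, 10), advance(10, 11), advance(10, 14));
    return 0;
}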