Hi all!
I'm developing an installation which currently uses six webcams (Logitech Vision Pro, but I've also tried lots of other cameras). I'm using Gem patches on Linux and OS X and noticed that CPU consumption on OS X is much lower than on Linux. I made tests with a single webcam on Linux and with single and multiple webcams on OS X. The two computers are not directly comparable because of different hardware, but maybe this is interesting anyway.
The Linux computer: AMD Athlon64, 2.2 GHz, GeForce 7600 GS with the proprietary nvidia driver, kernel 2.6.30-ARCH (also tried other kernels from Debian and Ubuntu), webcam at 640x480, 30 fps. CPU consumption was measured with htop on this machine.
The OS X 10.5 computer: MacBook Pro, 2x2.2 GHz, GeForce 8xxx M (I don't know exactly; it's not my computer, but I can check more precisely if someone wants to know). Measurements were made with top (and the OS X Activity Monitor, which seems to report the same numbers).
Linux: one webcam in cheese or luvcview uses approximately 40% CPU; the same webcam in Gem's pix_video help patch uses approximately 80% CPU.
OS X: I didn't test a simple application like cheese, but one webcam in the pix_video help patch uses about 15% CPU. Gem, single webcam: approx. 15%; Gem, three webcams: approx. 20%.
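Side note: my assumption is that most of the extra CPU on Linux goes into converting the camera's raw frames (usually YUYV from UVC cameras) to RGBA in software before Gem can upload them as textures, but I haven't verified which pixel format the driver actually negotiates. Here is a minimal V4L2 sketch to check (the device path /dev/video0 and the file name query_fmt.c are just my assumptions, adapt as needed); if it prints a fourcc like YUYV, a software colour conversion is presumably happening somewhere on the way to Gem:

/* query_fmt.c - print the pixel format and frame rate that the
 * V4L2 driver negotiated for a capture device.
 * build: gcc -o query_fmt query_fmt.c
 * usage: ./query_fmt /dev/video0
 */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(int argc, char **argv)
{
    const char *dev = (argc > 1) ? argv[1] : "/dev/video0";
    int fd = open(dev, O_RDWR);
    if (fd < 0) { perror(dev); return 1; }

    /* current capture format: resolution and fourcc */
    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof fmt);
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_G_FMT, &fmt) == 0) {
        unsigned int p = fmt.fmt.pix.pixelformat;
        printf("%ux%u, fourcc %c%c%c%c\n",
               fmt.fmt.pix.width, fmt.fmt.pix.height,
               p & 0xff, (p >> 8) & 0xff,
               (p >> 16) & 0xff, (p >> 24) & 0xff);
    }

    /* current frame interval, if the driver reports one */
    struct v4l2_streamparm parm;
    memset(&parm, 0, sizeof parm);
    parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_G_PARM, &parm) == 0 &&
        (parm.parm.capture.capability & V4L2_CAP_TIMEPERFRAME))
        printf("%u/%u s per frame\n",
               parm.parm.capture.timeperframe.numerator,
               parm.parm.capture.timeperframe.denominator);

    close(fd);
    return 0;
}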
For comparison, other programs on Linux and other video content in Gem:
Gem, pix_movie (352x288, 25 fps, MPEG): 10% CPU
Gem, pix_movie (720x480, 24 fps, QuickTime): 59% CPU
mplayer -vo gl (720x480, 24 fps, QuickTime): 62% CPU
mplayer -vo null (720x480, 24 fps, QuickTime): 40% CPU (40% just for decoding? or is there already some system overhead?)
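To make that parenthetical question answerable: the kernel accounts user time (decoding, colour conversion) and system time (driver, USB transfers, copies) separately per process, as fields 14 and 15 of /proc/<pid>/stat. Just a sketch that prints the split for a running process (the name cpusplit.c is my own; you'd pass mplayer's or pd's PID):

/* cpusplit.c - print cumulative user vs. system CPU time of a process,
 * read from /proc/<pid>/stat (fields 14 and 15).
 * build: gcc -o cpusplit cpusplit.c
 * usage: ./cpusplit <pid>
 */
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s <pid>\n", argv[0]); return 1; }

    char path[64], buf[1024];
    snprintf(path, sizeof path, "/proc/%s/stat", argv[1]);
    FILE *f = fopen(path, "r");
    if (!f || !fgets(buf, sizeof buf, f)) { perror(path); return 1; }
    fclose(f);

    /* the process name (field 2) may contain spaces, so skip past
     * its closing parenthesis before counting fields */
    char *p = strrchr(buf, ')');
    long utime, stime;
    if (!p || sscanf(p + 2,
            "%*c %*d %*d %*d %*d %*d %*u %*u %*u %*u %*u %ld %ld",
            &utime, &stime) != 2) {
        fprintf(stderr, "could not parse %s\n", path);
        return 1;
    }

    long hz = sysconf(_SC_CLK_TCK);
    printf("user:   %.2f s (decoding, conversion)\n", (double)utime / hz);
    printf("system: %.2f s (kernel: driver, USB, copies)\n", (double)stime / hz);
    return 0;
}

Sampling it twice a few seconds apart gives the rate; running the player under the shell's time builtin ("time mplayer -vo null ...") gives the same user/sys split at exit.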
So is it "normal" to have such high CPU consumption for video playback and cameras on Linux? I didn't measure it, but I also had extremely high CPU usage when using FireWire DV cameras some time ago.
Where does this big difference come from? Does OS X do video decoding on the graphics card, or something similar?
Any ideas for making video faster on Linux?
cheers,
Martin