On 12.03.2004 at 22:03, chris clepper wrote at cgc@humboldtblvd.com:
On Mar 12, 2004, at 7:29 PM, Max Neupert wrote:
I am working on a project where I want to confront the observer with a projection that changes according to his/her position, thus eliminating the effects of perspective.
So I tried that. The problem is that the object only understands the RGB colorspace, and converting the YUV camera stream first seems like quite a processing task for the computer.
Actually, pix_movement does work with YUV - get the CVS version of GEM. I wrote an Altivec version of it for PPC. The object is quite fast and uses well under 10% CPU on a 1GHz G4 running 720x480 video at 30fps.
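For anyone curious what pix_movement is doing conceptually, it amounts to thresholded frame differencing on the luma channel. Here is a minimal C++ sketch of that idea on UYVY (YUV422) data; the function name and buffer handling are my own illustration, not GEM's actual source.

#include <cstdlib>
#include <cstring>

// Thresholded frame differencing on the luma channel of UYVY (YUV422) data.
// In UYVY the bytes alternate U Y V Y, so luma sits at odd byte offsets.
void movement_uyvy(const unsigned char *cur, unsigned char *prev,
                   unsigned char *out, int width, int height,
                   unsigned char threshold)
{
    const int bytes = width * height * 2;        // 2 bytes per pixel in YUV422
    for (int i = 1; i < bytes; i += 2) {         // visit the luma bytes only
        int diff = std::abs((int)cur[i] - (int)prev[i]);
        out[i]     = diff > threshold ? 255 : 0; // binary motion mask in luma
        out[i - 1] = 128;                        // neutral chroma
    }
    std::memcpy(prev, cur, bytes);               // current frame becomes the reference
}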
Oh, wow. Franz Hidgen helped me install the CVS version of GEM and it was definitely worth it; it does what you said. Thank you.
I succeeded in creating a GEM patch based on the tutorial patch, and it does something, but there is no working tracking (see attached patch).
The only problem I see is that you haven't given pix_movement a threshold argument. Send something like 0.1 to the right inlet to make it do its thing. The only way I've been able to get decent tracking out of movement + blob is to use some sort of data-smoothing object like hyperspasm's smooth object, or even a plain old GEM average object. Without this the output is too erratic. Also, slow movement works a whole lot better than fast movement for these objects.
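The smoothing itself is also trivial to roll by hand if the existing objects don't suit you; here is a minimal exponential-moving-average sketch in C++ (the struct and parameter names are mine, chosen for illustration):

// Exponential moving average for jittery tracking coordinates.
// alpha near 0 = heavy smoothing (more lag), alpha near 1 = barely smoothed.
struct Smoother {
    float value = 0.f;
    bool  primed = false;
    float alpha;
    explicit Smoother(float a) : alpha(a) {}
    float update(float sample) {
        if (!primed) { value = sample; primed = true; }
        else         { value += alpha * (sample - value); }
        return value;
    }
};

// Usage: one smoother per axis of the blob centroid, e.g.
// Smoother sx(0.2f), sy(0.2f);
// float x = sx.update(blob_x), y = sy.update(blob_y);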
I'm in the process of writing a new luma-based tracking object that might finish beta testing at some point in the near future. It will spit out a grid of 1s and 0s based on comparing the luma in each grid coordinate to the luma value you are looking for. The output is a generic pd list for you to use in whatever way you see fit.
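Going by that description, the per-cell test could look something like the C++ sketch below; the grid size, tolerance handling, and the assumption of a plain grayscale buffer are all mine, not the actual object's code.

#include <cstdlib>
#include <vector>

// For each grid cell, average the luma and emit 1 if it is within
// `tolerance` of the luma value being tracked, else 0.
std::vector<int> luma_grid(const unsigned char *luma, int width, int height,
                           int cols, int rows, int target, int tolerance)
{
    std::vector<int> grid(cols * rows, 0);
    const int cw = width / cols, ch = height / rows;
    for (int r = 0; r < rows; ++r) {
        for (int c = 0; c < cols; ++c) {
            long sum = 0;
            for (int y = r * ch; y < (r + 1) * ch; ++y)
                for (int x = c * cw; x < (c + 1) * cw; ++x)
                    sum += luma[y * width + x];
            int avg = (int)(sum / (cw * ch));
            grid[r * cols + c] = std::abs(avg - target) <= tolerance;
        }
    }
    return grid;   // flattened row-major, ready to send as a pd list
}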
I've attached a simplified version of your patch.
Thanks a lot.
Even better for my purpose than comparing successive frames to each other would be to grab a frame of the empty stage with the camera and later compare it to the frames with spectators. Has somebody out there done that?
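To make the idea concrete: instead of differencing against the previous frame, the comparison would run against a stored reference frame of the empty stage. A rough C++ sketch of that, on a grayscale/luma buffer (all names are mine, for illustration only):

#include <cstdlib>
#include <cstring>

// Capture the empty stage once, then flag pixels that differ from it.
struct BackgroundSubtractor {
    unsigned char *reference = nullptr;
    int size = 0;

    void capture(const unsigned char *frame, int bytes) {
        delete[] reference;
        reference = new unsigned char[bytes];
        std::memcpy(reference, frame, bytes);
        size = bytes;
    }
    // Writes a binary mask: 255 where the scene differs from the empty stage.
    void apply(const unsigned char *frame, unsigned char *mask,
               unsigned char threshold) const {
        for (int i = 0; i < size; ++i)
            mask[i] = std::abs((int)frame[i] - (int)reference[i]) > threshold
                          ? 255 : 0;
    }
    ~BackgroundSubtractor() { delete[] reference; }
};

(If GEM's CVS already has an object along these lines, that would of course be simpler still, but I don't know of one.)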
max