Hi List,
What's a good setup for decent-framerate (50/60 fps, or even higher?) HD live capture with Gem? Is it even possible?
A related problem: ideally I would like Gem's framerate to be driven by the capture, so that I can be sure I haven't missed a frame or captured the same frame twice. Is there a solution for that?
m.
I would love to learn about this as well! Have you tried searching for SDI in the mailing list archive?
On 09.03.21 07:05, Peter P. wrote:
I would love to learn about this as well! Have you tried searching for SDI in the mailing list archive?
I had a glance. I was hoping for less general advice, preferably concrete, proven setups running on Linux. I see there are DeckLink drivers; are there users out there who can report the framerates and latencies they achieve in Gem? Which sensor module are they using?
Does anyone have a DeckLink Duo 2 Mini, for example?
m.
Hi All,
I have made an interactive squash system with Pd based on Basler's high-speed machine-vision cameras. I mostly run above 200 fps with 5 USB3 cameras, each of them on a separate USB bus. The balls can fly at around 200 km/h and I hardly ever have a problem analyzing their position with pix_movement. The biggest problem is having a display with a similarly high refresh rate, but if you do not need to render the image it is fine. The computer has a 3 GHz processor and 32 GB of RAM with Ubuntu 19.04, if I remember correctly.
Best, Popesz
If you want to see some example videos, download the Rayaction squash app, register (preferably with email, as some functions are still buggy) and check out the video under each exercise.
On 09.03.21 11:39, Csaba Láng wrote:
Hi All,
I have made an interactive squash system with Pd based on Basler's high-speed machine-vision cameras. I mostly run above 200 fps with 5 USB3 cameras, each of them on a separate USB bus.
Very cool.
Do you know which model? https://www.baslerweb.com/en/products/cameras/area-scan-cameras/
I see USB3 Vision is the standard. Does that work out of the box with Gem? What I read here sounds promising: https://en.wikipedia.org/wiki/USB3_Vision
I remember GigE being terribly hampered; one could download an SDK after signing an NDA, but even with that SDK I wasn't able to get it into Gem.
Would something like this work equally well: https://www.alliedvision.com/en/products/embedded-vision-cameras/detail/Alvi...
Also, I think I have this frame grabber here in a different machine: https://www.baslerweb.com/de/produkte/framegrabber-portfolio/framegrabber/mi...
How big are my chances of getting images into Gem with that card on Linux?
Max
Hi,
Usually, USB3 industrial cameras use the IIDC protocol, which comes from legacy FireWire cameras. IIDC is also implemented on some USB2 cameras (like the first version of the Point Grey Chameleon), and it is supported in Gem through the libdc1394 plugin. Some vendors implement special features that are available only through their own SDK, but I have never needed to go that route.
Concerning the GigE protocol, there is an open-source implementation, Aravis, which is not directly integrated into Gem afaik. But there is a Pylon plugin, which is vendor-specific, and you can also access GigE cameras with GStreamer. So at least you should be able to write a pipeline that forwards GigE frames to Gem via v4l2loopback on Linux. I never did that, though, and to be honest, if I had to work with a GigE camera in Gem I would probably write a new plugin instead of trying to set up GStreamer. The main advantage of GigE over IIDC is cable length: with USB and FireWire it is hard to work with cables more than 10 m long, especially if you target high framerates/resolutions (which means high bandwidth), while GigE pushes this limit to hundreds of meters (depending on cable quality and the RF environment).
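For what it's worth, here is a minimal, untested sketch of that GStreamer-to-v4l2loopback forwarding idea in C++. The specifics are assumptions on my side: "aravissrc" comes from the Aravis GStreamer plugin, and /dev/video10 is whatever device node your v4l2loopback module created; Gem's v4l2 backend should then be able to open that device like any other camera.

#include <gst/gst.h>
// build with something like: g++ gige2loopback.cpp $(pkg-config --cflags --libs gstreamer-1.0)

int main(int argc, char *argv[]) {
  gst_init(&argc, &argv);

  GError *err = NULL;
  // grab frames from the first GigE camera Aravis finds and push them into
  // the v4l2loopback device so any v4l2 client (e.g. [pix_video]) can read them
  GstElement *pipeline = gst_parse_launch(
      "aravissrc ! videoconvert ! v4l2sink device=/dev/video10", &err);
  if (!pipeline) {
    g_printerr("pipeline error: %s\n", err->message);
    g_clear_error(&err);
    return 1;
  }

  gst_element_set_state(pipeline, GST_STATE_PLAYING);

  // block until an error or end-of-stream
  GstBus *bus = gst_element_get_bus(pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered(
      bus, GST_CLOCK_TIME_NONE,
      (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
  if (msg) gst_message_unref(msg);
  gst_object_unref(bus);

  gst_element_set_state(pipeline, GST_STATE_NULL);
  gst_object_unref(pipeline);
  return 0;
}

The same pipeline string should also work directly with gst-launch-1.0, so you can test the forwarding without writing any code at all.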
Concerning triggering the Gem render pipeline from the camera itself, I'm not sure that is possible out of the box. Most of the video plugins use polling to get frames: a thread runs in the background to retrieve frames from the camera, and when [pix_video] receives a gemstate from a [gemhead] it checks whether there is a new frame and copies it over for the Gem thread to use. So if you want to trigger the Gem render pipeline with a camera frame, you would probably need to write a custom external that waits for a new frame (and thus blocks the whole Pd instance). Another solution is to do the reverse: trigger the camera on Gem's refresh. Most industrial cameras have a trigger input to release the (electronic) shutter, so with some hardware hacking you could take an image exactly when you need it. But then you may have to wait for the shutter to close and for the image to be processed and sent by the camera, so that is not trivial either.
Syncing the image-processing pipeline to a camera frame is not that easy in any case, even if you write all the code from scratch in C++, for example (which I did).
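Just to make the "custom external that blocks on a new frame" idea a bit more concrete, here is a rough, untested sketch using the libdc1394 API directly (the same library Gem's dc1394 plugin is built on). Video mode/framerate setup and error checking are omitted, and a real external would hand the frame over to the render pipeline instead of the placeholder comment:

#include <dc1394/dc1394.h>
#include <cstdio>

int main() {
  dc1394_t *d = dc1394_new();
  dc1394camera_list_t *list = NULL;
  dc1394_camera_enumerate(d, &list);
  if (!list || list->num == 0) {
    fprintf(stderr, "no IIDC camera found\n");
    return 1;
  }
  dc1394camera_t *cam = dc1394_camera_new(d, list->ids[0].guid);
  dc1394_camera_free_list(list);

  // 4 DMA buffers; dequeuing with POLICY_WAIT blocks until the next frame arrives
  dc1394_capture_setup(cam, 4, DC1394_CAPTURE_FLAGS_DEFAULT);
  dc1394_video_set_transmission(cam, DC1394_ON);

  for (int i = 0; i < 100; ++i) {
    dc1394video_frame_t *frame = NULL;
    // this call is the "camera-driven" part: it blocks until a frame is ready
    dc1394_capture_dequeue(cam, DC1394_CAPTURE_POLICY_WAIT, &frame);
    // ... copy frame->image out / kick off the render pipeline here ...
    dc1394_capture_enqueue(cam, frame);  // give the DMA buffer back
  }

  dc1394_video_set_transmission(cam, DC1394_OFF);
  dc1394_capture_stop(cam);
  dc1394_camera_free(cam);
  dc1394_free(d);
  return 0;
}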
Hope that helps
Cheers
Antoine
When I started this project, Gem used the Pylon 2 backend. Basler's new cameras are based on Pylon 5, and IOhannes updated Gem's backend to Pylon 5 at that time. You have to install Pylon 5 before compiling Gem and point the build to the Pylon installation path, and that's it. GigE is not a good solution, as the data rate is more than GigE can handle. I use USB3 hybrid cables up to 50 m. By the way, I use this camera:
https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca1440-...
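Before building Gem against Pylon it can save some head-scratching to compile a tiny sanity check against the SDK first, just to confirm the headers, libraries and camera are all visible. This is only a sketch based on Basler's C++ samples (compiler/linker flags depend on your installation; Basler's sample Makefiles use PYLON_ROOT to locate the SDK):

#include <pylon/PylonIncludes.h>
#include <iostream>

int main() {
  Pylon::PylonInitialize();
  try {
    // open the first camera the Pylon transport layers can find
    Pylon::CInstantCamera camera(
        Pylon::CTlFactory::GetInstance().CreateFirstDevice());
    std::cout << "Using " << camera.GetDeviceInfo().GetModelName() << std::endl;

    // grab a single frame with a 1 s timeout
    Pylon::CGrabResultPtr result;
    if (camera.GrabOne(1000, result) && result->GrabSucceeded()) {
      std::cout << result->GetWidth() << "x" << result->GetHeight()
                << " frame grabbed" << std::endl;
    }
  } catch (const GenICam::GenericException &e) {
    std::cerr << e.GetDescription() << std::endl;
  }
  Pylon::PylonTerminate();
  return 0;
}

If that runs and reports your camera model, the Pylon side of the setup is fine and any remaining trouble is on the Gem build side.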