I want to use a 60 fps Ximea camera, do some computer vision processing, and project the result on screen with minimal latency; the goal is to see the camera images almost in real time.
Concerning camera frame capture, I wonder if I should:
- use an ofThreadedChannel to capture frames inside my app, or
- make another app and send the frames to my app through Syphon
Using Syphon would make my setup more flexible,
but I’m not sure what the benefits and drawbacks of each method are.
Any help please?