I’ve got a project coming up where we have to capture PAL video, process it, and output it again as PAL video. We should get as close to realtime as possible.
A proof of concept works with a webcam as input and VGA as output, but I noticed a lot of delay, which made me search the forums here.
There are a lot of threads on the forum dealing with this, e.g. http://forum.openframeworks.cc/t/reducing-video-latency/5931/0
So, here's what I did to reduce the delay:
- make a threaded video grabber
- make a threaded video processor (± 30ms/frame)
- main thread draws the result on the screen
Communication between threads is done using triple-buffered FBOs (see here: http://forum.openframeworks.cc/t/simple-particles-system-and-thread/7248/0)
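For reference, the triple-buffer handoff between the grabber and the drawing thread can be sketched without any OF types; plain `std::vector` frames stand in for the FBOs here, and the class name and buffer size are my own, not taken from the linked thread:

```cpp
#include <array>
#include <cstddef>
#include <mutex>
#include <utility>
#include <vector>

// Triple buffer: the grabber thread writes into the "back" buffer, the
// drawing thread reads the "front" buffer, and a "middle" buffer is
// swapped between them, so neither side ever waits for the other to
// finish a whole frame.
class TripleBuffer {
public:
    explicit TripleBuffer(std::size_t size)
        : buffers{std::vector<int>(size),
                  std::vector<int>(size),
                  std::vector<int>(size)} {}

    // Producer side: write pixels into this buffer, then call publish().
    std::vector<int>& backBuffer() { return buffers[back]; }

    // Producer side: make the freshly written frame visible by swapping
    // the back and middle indices. Only index swaps happen under the
    // lock, never pixel copies.
    void publish() {
        std::lock_guard<std::mutex> lock(swapMutex);
        std::swap(back, middle);
        fresh = true;
    }

    // Consumer side: if a new frame was published, rotate it to the
    // front; otherwise keep showing the previous frame.
    std::vector<int>& frontBuffer() {
        std::lock_guard<std::mutex> lock(swapMutex);
        if (fresh) {
            std::swap(front, middle);
            fresh = false;
        }
        return buffers[front];
    }

private:
    std::array<std::vector<int>, 3> buffers;
    int back = 0, middle = 1, front = 2;
    bool fresh = false;
    std::mutex swapMutex;
};
```

The point of the third buffer is that the grabber can always start writing the next frame immediately, at the cost of potentially dropping a frame the consumer never picked up; that trade-off is exactly what keeps the threads from stalling each other.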
This results in a latency of ± 120 ms, which is unacceptable. It seems that OF's double buffering contributes to it, as do the USB webcam's internal buffering and the buffering in the video card/LCD screen.
The main question: can this be resolved for PAL input/output with external hardware? That would bypass the latency of the USB cam, OF's buffering, and the video card/LCD buffering.
I thought of the Blackmagic Intensity Shuttle (Thunderbolt): http://www.blackmagic-design.com/products/intensity/models/
Can this device (or another one) be used for input and output simultaneously? If so, I would proceed as follows:
- make a threaded grabber/output that reads the pixels as soon as they are available
- store them in a buffer
- let the processing thread get to work on them
- read the previously processed frame from another buffer (not the one just stored)
- write that frame to the PAL output
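The steps above can be modelled frame slot by frame slot. This is only a sequential sketch of the timing, not Blackmagic SDK code; `process()` is a placeholder for the real ~30 ms filter, and integers stand in for whole frames:

```cpp
#include <vector>

// Placeholder for the actual per-frame processing (assumed < 40 ms).
int process(int pixel) { return pixel + 1; }

// One 40 ms slot per grabbed frame: while the current frame is stored,
// the previously grabbed frame is processed and written to the output.
// Steady-state latency is therefore exactly one frame.
std::vector<int> runPipeline(const std::vector<int>& grabbed) {
    const int BLANK = 0;           // nothing to show in the first slot
    std::vector<int> output;
    int previous = BLANK;
    bool havePrevious = false;
    for (int frame : grabbed) {
        output.push_back(havePrevious ? process(previous) : BLANK);
        previous = frame;          // store the frame just grabbed
        havePrevious = true;
    }
    return output;
}
```

Running this on three frames {10, 20, 30} yields {0, 11, 21}: every output slot carries the frame grabbed one slot earlier, which is the one-frame latency described below.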
Assuming the processing thread always takes less than 40 ms/frame (PAL = 25 fps), this would result in a stable latency of one frame, which I believe is the theoretical lower limit.
Any help is very much appreciated!