reducing video latency

Hello all,

This is my first post to the forum; I'm new to openFrameworks.

I'm working on a realtime painting program where we track infrared light sources with a PS3 Eye and OpenCV's findContours.

I’m having problems getting a latency lower than 120ms. I’ve documented my measurement method and results here:
https://github.com/mattvenn/measure-video-latency/wiki

The thing that is perhaps most interesting is that even with a very straightforward video pipe (no OpenCV):

ps3eye -> guvcview -> lcd monitor = 50ms
ps3eye -> openframeworks -> lcd monitor = 120ms (code below).

One thing this could be related to: guvcview can run at 125 fps, but I can't get openFrameworks to run at more than 60. It seems to be hard-limited, even if I set the frame rate explicitly with ofSetFrameRate(160);

Any ideas on how to get openframeworks video capture to work with lower latency?

Thanks!
Matt

  
void testApp::setup() {  
	width = ofGetWidth();  
	height = ofGetHeight();  
	vidGrabber.setVerbose(true);  
	vidGrabber.initGrabber(320, 240);  
	colorImg.allocate(320, 240);  
	grayImage.allocate(320, 240);  
}  
  
//--------------------------------------------------------------  
  
void testApp::update() {
	vidGrabber.grabFrame();
	bool bNewFrame = vidGrabber.isFrameNew();

	if (bNewFrame)
	{
		colorImg.setFromPixels(vidGrabber.getPixels(), 320, 240);
		grayImage = colorImg;
	}
}
  
//--------------------------------------------------------------  
void testApp::draw()  
{  
	grayImage.draw(0,0);  
}  

I found that I got lower latency with FireWire cameras, but I haven't got nearly as accurate information as you :)

Seb

Which driver/OS are you using?

oops, sorry to omit such important info!

I'm using Linux 2.6.38 and
openFrameworks 0061 (I think; I'm not sure how to tell. I downloaded this: openframeworks-openFrameworks-6651a00.tar.gz)

I’m not sure about driver. When I start the video capture I get this on the output:

  
  
OF_NOTICE: ofGstUtils: selected format: 320x240 video/x-raw-rgb framerate: 100/1  
OF_NOTICE: gstreamer pipeline: v4l2src name=video_source device=/dev/video0 ! video/x-raw-rgb,width=320,height=240,framerate=100/1 !  ffmpegcolorspace ! appsink name=sink  caps="video/x-raw-rgb, width=320, height=240, bpp=24"  
  

I think frame rate may have something to do with it, but I just can't work out how to get it above 60 fps. ofSetFrameRate seems to ignore everything above 60.

in the linux driver you need to specify the fps when loading the kernel module:

from http://createdigitalmotion.com/2009/08/trick-out-your-ps3-eye-webcam-best-cam-for-vision-augmented-reality/

"sudo modprobe gspca_ov534 videomode=04"

Here are the available modes:

00: 640×480@15
01: 640×480@30
02: 640×480@40
03: 640×480@50
04: 640×480@60
10: 320×240@30
11: 320×240@40
12: 320×240@50
13: 320×240@60
14: 320×240@75
15: 320×240@100
16: 320×240@125

also, perhaps the gstreamer pipeline is introducing some lag. the gstutils pipeline for the videograbber is set up to be as generic as possible so everything just works, but it's perhaps not the fastest. you could try to set up a custom pipeline using ofGstUtils directly and setting the pipeline with

  
gst.setPipeline("v4l2src name=video_source device=/dev/video0 ! video/x-raw-rgb,width=320,height=240,framerate=100/1 ! ffmpegcolorspace ! appsink name=sink caps=\"video/x-raw-rgb, width=320, height=240, bpp=24\"");  

and then try modifications over that. i would try to get rid of the ffmpegcolorspace element: the camera will then probably return the pixels as yuv, so try to use opencv to convert yuv directly to grayscale instead of doing yuv -> rgb -> grayscale.

look for cv::cvtColor:

http://opencv.willowgarage.com/documentation/cpp/miscellaneous-image-transformations.html

i suspect that the two biggest issues are framerate and frame buffers.

1. framerate of the camera: arturo has great info on this
2. framerate of the display: if you can bump it from 60 Hz to 85 Hz you're chopping off 5 ms
3. frame buffer on the display: digital video has more latency than analog due to the way it's transmitted and displayed. the lowest possible latency display would be a CRT driven by VGA. right now you might be losing as much as a whole frame (16 ms) due to frame buffering.
4. frame buffer on the graphics card: i think OF is always double buffered, and doing setBackgroundAuto(false) ought to change it to single buffer, but this hasn't been done yet. i tried hacking OF to be single buffered a few months ago, but couldn't get it to work. i know damian stewart has been thinking about/looking into this… this would give you another 16 ms.

also (with massive respect for arturo :) ) i feel like the alternative color conversion will only give you a few ms at most, and there are probably much bigger issues to deal with first.

thanks for the tips! I’ll work through these and post back.

In the meantime, does anyone have any clue why I can't run higher than 60 fps under Linux? ofSetFrameRate doesn't do anything past 60.

I just checked the module stuff; I think it's out of date now. When I change settings in guvcview I can see the frame rate being changed in kern.log.

  
  
Apr 12 15:39:53 matthewlaptop kernel: [23518.590697] ov534: frame_rate: 125  
  

When I run my program I see this:

  
  
Apr 12 15:40:18 matthewlaptop kernel: [23543.204080] ov534: frame_rate: 60  
Apr 12 15:40:18 matthewlaptop kernel: [23543.285336] ov534: frame_rate: 100  
  

Even though I set the frame rate to 125 in vidGrabber.initGrabber(320, 240, 125);
Anyway, this won't fix the output frame rate, because I can't get that above 60.

Matt

do you mean fps of the camera or the screen? ofSetFrameRate sets the framerate for the screen. if you cannot go higher than that, take a look at the settings of the video card; it's probably fixed to the v-sync rate of your screen. for the camera, take a look at my previous post

I mean the framerate of the video output; I can't get it above 60. Even if my video card can't display any faster, surely update() can be called more than 60 times a second? This code gets me 60 fps.

  
  
void testApp::setup() {  
    ofSetLogLevel(OF_LOG_VERBOSE);  
    int freq = 250;  
    ofSetVerticalSync(false);  
    ofSetFrameRate(freq);  
}  
  
//--------------------------------------------------------------  
  
void testApp::update() {
	if (timed++ > 100)  // timed: int counter declared in testApp.h
	{
		timed = 0;
		printf("fps %f\n", ofGetFrameRate());
	}
}
  
//--------------------------------------------------------------  
void testApp::draw()  
{  
	  
}  
  

no, but you can use a thread. also, if you are not doing any analysis where you need to go faster than the drawing rate, it's usually not very useful to do it. the videograbber is already running in its own thread, so if you cannot catch up with it in update() it will just drop frames and you'll get the latest.

if you want to try threads anyway, take a look at the ofxThread example

will take a look at the fps thing with the ps3eye

that would be great! Though from my code above it's not to do with the ps3eye; it's just my install of OF on Linux. I can't get anything to run faster than 60, video capture or otherwise.

I had a look at a similar thread by Memo in 2009
http://forum.openframeworks.cc/t/can’t-go-faster-than-60fps-on-windows-with-vsync-disabled/2410/0

And I also tried commenting out the delay code in libs/openFrameworks/app/ofAppGlutWindow.cpp
but no dice.

Maybe the max framerate is hardcoded somewhere? Grepping the code right now!

Just noticed that my monitor can't do above 60 Hz. So maybe ofSetVerticalSync(false);
isn't working, and the code is hardwired to the monitor's refresh frequency?

Another clue! If I minimise the window, the framerate goes up to what it’s meant to be…

Another update: I installed driconf to try and force vertical sync off for all applications. Not sure how to tell if it worked, but it made no difference to my app :(

Matt

Hey, everything everyone has said is pretty detailed and spot on. I'm a bit surprised you're getting the same results from LCD and projector. I'd written a little app a while ago which measures the difference between LCD and projector. Are you sure you're getting almost the same results even when you measure with this?

I’ll check again with the projector. I haven’t been recording all the numbers and averaging them, so they’ll be off, but when I last tested they were very similar, so maybe within 5 to 10ms of each other.

My main aim now is to get an openFrameworks app running at > 60 fps. I think that's key to lower latency. Did you get that 60 fps limit sorted on Windows? What was it?

Cheers,

Matt

the 60fps thing is related to your graphics driver forcing vertical sync. but i don't think you're going to get less latency by increasing the fps if your monitor can only draw at that speed. the only thing you'll do is more updates per real draw to the screen, but those frames are not going to be shown, since your screen can only do 60fps

OK I think I understand that.

I'm stumped as to why guvcview is so much faster than the simplest video in/out I can achieve in openFrameworks. I'll keep going with the pipeline stuff you suggested.

Thanks everyone for all the input!

Matt

also, as Kyle said, i would try the color conversion stuff as a last resort. i remember ffmpegcolorspace being really slow, but even with that, what's going to make everything much faster is increasing the camera fps. i guess there's something wrong in the gstUtils initialization if the kernel module thing is not needed anymore. will take a look at it as soon as i have a moment

great, thanks Arturo. Give me a shout on twitter or email if I can give you any more info.

OK, I tried the gstreamer pipeline stuff. First on the command line I got this working

  
  
gst-launch-0.10 v4l2src queue-size=0 ! 'video/x-raw-yuv,width=320,height=240,framerate=100/1' !  xvimagesink  
  

That got me 80ms latency, so about 10ms or so slower than guvcview.

Then I put the pipeline in openFrameworks (though I couldn't get x-raw-yuv to work, so I used x-raw-rgb).

This got me the usual 120ms latency.

mmh, can you set log level to verbose

  
ofSetLogLevel(OF_LOG_VERBOSE);  

before initGrabber, and see what modes the camera has? i suspect that it's selecting an rgb mode to avoid the conversion, but the framerate is slower than in the yuv modes

  
  
OF_NOTICE: gstreamer pipeline: v4l2src queue-size=0 ! video/x-raw-rgb,width=320,height=240,framerate=100/1 ! appsink name=sink caps="video/x-raw-rgb, depth=24, bpp=24"  
  

could it be to do with the output side of things? I've had a good play with these options (and tried queue-size=0), but they don't seem to make much difference. Maybe that extra 50 ms is coming from some kind of output buffering?

Matt