One thing this could be related to is that guvcview can run at 125fps, but I can’t get openFrameworks to run at more than 60. It seems to be hard-limited, even if I set the framerate explicitly with ofSetFrameRate(160);
Any ideas on how to get openframeworks video capture to work with lower latency?
I think framerate may have something to do with it, but I just can’t work out how to get framerate above 60fps. ofSetFrameRate seems to ignore everything above 60.
also perhaps the gstreamer pipeline is introducing some lag. the gstutils pipeline for the videograbber is set up to be as generic as possible so everything just works, but it’s perhaps not the fastest. you could try to set up a custom pipeline using ofGstUtils directly and setting the pipeline with
and then try to do modifications over that. i would try to get rid of the ffmpegcolorspace element: that will probably return the pixels as yuv, and then you can use opencv to convert yuv directly to grayscale instead of doing yuv -> rgb -> grayscale.
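A quick illustration of why skipping the rgb step is cheap (this is a plain C++ sketch, not the ofGstUtils or opencv API): for planar yuv formats like I420, the buffer starts with a full-resolution Y (luma) plane, which already *is* the grayscale image, so "conversion" is just a copy of the first width×height bytes.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// For planar YUV (I420/YV12), the buffer layout is: a full-resolution
// Y (luma) plane, followed by the subsampled U and V chroma planes.
// A grayscale image is exactly the Y plane, so no yuv -> rgb -> gray
// round-trip is needed -- just copy the first width*height bytes.
std::vector<uint8_t> yuvToGray(const uint8_t* yuv, int width, int height) {
    std::vector<uint8_t> gray(width * height);
    std::memcpy(gray.data(), yuv, gray.size()); // Y plane comes first
    return gray;
}
```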
i suspect that the two biggest issues are framerate and frame buffers.
1 framerate of the camera, arturo has great info on this
2 framerate of the display. if you can bump it from 60 Hz to 85 Hz you’re chopping off 5 ms
3 frame buffer on the display: digital video has more latency than analog due to the way it’s transmitted and displayed. the lowest possible latency display would be a crt driven by vga. right now you might be losing as much as a whole frame (16 ms) due to frame buffering.
4 frame buffer on the graphics card: i think OF is always double buffered, and doing setBackgroundAuto(false) ought to change it to single buffer, but this hasn’t been done yet. i tried hacking OF to be single buffered a few months ago, but couldn’t get it to work. i know damian stewart has been thinking about/looking into this… this would give you another 16 ms.
also (with massive respect for arturo ) i feel like the alternative color conversion will only give you a few ms max, and there are probably much bigger issues to deal with first.
thanks for the tips! I’ll work through these and post back.
In the mean time, does anyone have any clue why I can’t run higher than 60fps under linux? ofSetFrameRate doesn’t do anything past 60.
I just checked the module stuff, I think it’s out of date now. When I change settings in guvcview I can see the frame rate being changed in kern.log.
Even though I have set framerate to 125 in vidGrabber.initGrabber(320, 240, 125);
Anyway this won’t get me the output framerate, because I can’t get this above 60.
do you mean fps in the camera or the screen? ofSetFrameRate sets the framerate for the screen; if you cannot go higher than that, take a look at the settings of the video card, it’s probably fixed to the v-sync rate of your screen. For the camera take a look at my previous post
I mean the framerate of the video output, I can’t get it above 60. Even if my video card can’t take it, surely I can be allowed to have update() called faster than 60fps? This gets me 60fps.
no, but you can use a thread. also, if you are not doing any analysis where you need to go faster than the drawing rate, it’s usually not very useful to do it. the videograbber is already running in its own thread, so if you cannot catch up with it in update it will just drop frames and you’ll get the latest.
if you want to try threads anyway, take a look at the ofxThread example
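The keep-only-the-latest-frame behaviour described above can be sketched in plain std::thread terms (the names here are made up for illustration; this is not the ofxThread API):

```cpp
#include <mutex>
#include <thread>
#include <vector>

// A capture thread pushes frames as fast as the camera delivers them;
// the drawing side only ever takes the most recent one, so slow draws
// drop intermediate frames instead of queueing up latency.
class LatestFrame {
public:
    void push(std::vector<unsigned char> frame) {
        std::lock_guard<std::mutex> lock(mutex_);
        frame_ = std::move(frame); // overwrite: older frames are dropped
        fresh_ = true;
    }
    // Returns true only if a frame newer than the last take() was available.
    bool take(std::vector<unsigned char>& out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (!fresh_) return false;
        out = frame_;
        fresh_ = false;
        return true;
    }
private:
    std::mutex mutex_;
    std::vector<unsigned char> frame_;
    bool fresh_ = false;
};
```

In use, a capture thread would call push() at camera rate (say 125 fps) while update()/draw() calls take() at 60 fps; frames that arrive between draws are simply overwritten, which is exactly the "you’ll get the latest" behaviour described above.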
that would be great! Tho from my code above it’s not to do with the ps3eye, it’s just my install of OF on linux. I can’t get anything to run faster than 60, video capture or otherwise.
And I also tried commenting out the delay code in libs/openFrameworks/app/ofAppGlutWindow.cpp
but no dice.
Maybe the max framerate is hardcoded somewhere? Grepping the code right now!
Just noticed that my monitor can’t do above 60hz. So maybe ofSetVerticalSync(false);
isn’t working and the code is hardwired to the monitor’s refresh frequency?
Another clue! If I minimise the window, the framerate goes up to what it’s meant to be…
Another update: I installed driconf to try and force ofSetVerticalSync false for all applications. Not sure how to tell if it works, but it made no difference to my app ;(
Hey, everything everyone has said is pretty detailed and spot on. I’m a bit surprised you’re getting the same results from LCD and projector. I’d written this little app a while ago which measures the difference between LCD and projector. Are you sure you’re getting almost the same results even when you measure it with this?
I’ll check again with the projector. I haven’t been recording all the numbers and averaging them, so they’ll be off, but when I last tested they were very similar, so maybe within 5 to 10ms of each other.
My main aim now is to get an openframeworks app running at > 60fps. I think that’s key for lower latency. Did you get that 60fps limit sorted with windows? What was it?
the 60fps thing is related to your graphics driver forcing vertical sync. but i don’t think you’re going to get less latency by increasing the fps if your monitor can only draw at that speed. the only thing you’re going to do is more updates per real draw to the screen, but those frames are not going to be shown since your screen can only do 60fps
I’m stumped as to why guvcview is so much faster than the simplest video in/out I can achieve in openframeworks. I’ll keep going on that pipeline stuff you suggested.
also, as Kyle said, i would leave the color conversion stuff as a last resort. i remember ffmpegcolorspace being really slow, but even with that, what’s going to make everything much faster is increasing the camera fps. i guess there’s something wrong in the gstUtils initialization if the kernel module thing is not needed anymore. will take a look at it as soon as i have a moment
before initGrabber and see what modes the camera has? i suspect that it’s selecting an rgb mode to avoid the conversion but the framerate is slower than those in yuv
could it be to do with the output side of things? I’ve had a good play with these options, (and tried queue-size=0), but they don’t seem to make much difference. Maybe that extra 50ms is coming from some kind of output buffering?