ofSetFrameRate() & ofGetFrameRate()

hi all 8)

i’m running an application in OF 0.04. if i run it without calling ofSetFrameRate and with vertical sync disabled, it runs at approximately 55 fps on my computer.

why, if i set ofSetFrameRate to 25, does ofGetFrameRate report a value of 19 ? is that how it should be ?

my understanding is that if my application can run at 55 fps, i should be able to fix it to run at 25. where is my reasoning wrong ??

thankx !


hmm - I’m not entirely sure, but are you positive you are not running with vertical sync enabled in your graphics card (even if it’s disabled in code) ? graphics card settings override application settings, so you could be in vsync, in which case we have less control over frame rate (since it’s based on a multiple of the vsync interval).

otherwise, I’d experiment. if you up the ofSetFrameRate, does the result from ofGetFrameRate change?

also, the frame rate is smoothed out - if you let it run for a while, does it get up to 25?

on my computer, ofSetFrameRate and ofGetFrameRate are tight - I say 25, and I get 25, I say 44 and I get 44. I’m not exactly sure why yours is off, or how to diagnose it.

again, I’d check your vsync on the *card* , since this overrides the app settings.

take care,

ps : moved to bugs, because I don’t think this is a usage question…

hi zach & all :wink:
and yes … if i up the ofSetFrameRate value, then ofGetFrameRate goes up as well …
right now i’ve set it to 44 and it plays at 25 .

* i took the smoothing ramp into account :wink:

which computer are you running this on, and which app ??

lots of thanks !


ok -

is it consistently off ? ie, if you say 25 and get 19, then if you say 45, do you get 39, or is it proportionally off?

for me, I always get the right framerate, in every app, on xp / nvidia.

can you try looking for a #define in ofConstants called “experimental timing” ?
you can try turning that on (it will mean you need to clean and recompile OF in your app). also, you should be careful to quit only using “ESC”, not ALT-F4 or closing the console. this experimental code sets the precision of the timer in windows to be much more precise, and maybe that could help in making the sleep command more precise. hitting esc will set the timer back to normal, since there could be side effects (it’s a system-level setting that affects other apps, etc.)

hope that helps

hi zach

* yes it’s consistently off …

i tried the experimental precision timer and it seems to work the same way …

again with no ofSetFrameRate … i get 52 fps .
with ofSetFrameRate at 45 i get 25 fps … ?¿ mmm
i’ll try this on another machine next week to see how it behaves, but it’s strange, isn’t it ? why isn’t it able to climb up in fps if it definitely can …

this raises a collateral question for me … what’s the behaviour when you select, via ofSetFrameRate, a frame rate that can’t be reached due to performance limitations ? does it just go as fast as it can ?


yes it’s consistently off …

no, it doesn’t sound consistently off (ie, off by a constant amount). it sounds like it’s off some other way. strange, because we haven’t seen this before.

I’ll take a look at the code –

what’s the behaviour when u select via ofSetFrameRate a frameRate that could not be reached by performance limitation ? it just goes as fast as it can ?

yep -

if (diffMillis > millisForFrame){  
	; // we do nothing, we are already slower than target frame  
}  

then we don’t sleep at all

hope that helps –

can you try swapping this code into ofAppGlutGlue.h :

void idle_cb(void) {  
	printf("-------------------------- a: %i \n", ofGetElapsedTimeMillis());  
	if (nFrameCount != 0 && bFrameRateSet == true){  
		diffMillis = ofGetElapsedTimeMillis() - prevMillis;  
		if (diffMillis > millisForFrame){  
			; // we do nothing, we are already slower than target frame  
		} else {  
			int waitMillis = millisForFrame - diffMillis;  
			#ifdef TARGET_WIN32  
				Sleep(waitMillis);			// windows sleep in milliseconds  
			#else  
				usleep(waitMillis * 1000);	// mac sleep in microseconds - cooler :)  
			#endif  
		}  
	}  
	prevMillis = ofGetElapsedTimeMillis();  
	printf("-------------------------- b: %i \n", ofGetElapsedTimeMillis());  

and let me know what it says ? just a snippet from the console - ie,

-------------------------- a: 615  
-------------------------- b: 651  
-------------------------- a: 665  
-------------------------- b: 701  
-------------------------- a: 716  
-------------------------- b: 751  
-------------------------- a: 768  
-------------------------- b: 800  
-------------------------- a: 817  
-------------------------- b: 850  
-------------------------- a: 865  
-------------------------- b: 900  
-------------------------- a: 915  
-------------------------- b: 950  

this gives me some sense about the amount of time between idle calls, and if we need to factor that into the amount of time to sleep.

alternatively, we could also alter this sleep amount dynamically, based on the calculated FPS… I wonder if this is a problem for other folks…


hi again !

with the #define WIN32_HIGH_RES_TIMING turned on, i swapped your code into ofAppGlutGlue.h …

and the series starts as follows :

a : 1361
b : 1362

a : 1383
b : 1384

a : 1386
b : 1427

a : 1434
b : 1468

a : 1474
b : 1512

a : 1516
b : 1555


a : 19845
b : 19883

a : 19889
b : 19927

:wink: … so how do my numbers look ??

merci !


hmm -
nothing revealing…

is this an empty app, or one that is doing something? can you try an empty (non-drawing) app and see if you get similar results? or a simple example, like the graphics example?

at this point I’m not sure what to tell you, need to try some other machines out. also can you try another machine and let me know if you learn something?


hi :wink:


can you tell me what your app is using :

video, sound, etc ?

anything threaded (video grabber, for example) might be throwing us off. I hadn’t considered that.

based on what you are saying, I believe there might be an error in calculating how long to sleep – it must be getting thrown off by a separate thread, which is also taking time (thus slowing you down). that’s my guess at the moment.

if you comment out the threaded things (video grabbing, fmod, etc) does your framerate match ofSetFrameRate() ? if so, then we can work out a fix for that -


hola z !

so yes … my app basically uses video grabbing + shaders, and it could use some rtAudio buffering as well …
if i take video grabbing out of the app … there’s no app :wink:

is there another way to achieve that ?

the only options i see now are :

1/ display the framerate and let the user correct the value passed to ofSetFrameRate via the keyboard.

2/ make an auto-correction function which sets a framerate, then reads it back, and then corrects the value passed to ofSetFrameRate until it gets 25 … any chance that this will work ?

any other options ? i don’t quite understand why threaded functions make ofGetFrameRate inaccurate, but is there an elegant way of correcting it besides my two options ?

merci bKu


hi eloi

looking over the code, it appears that the bug is in OF – I don’t have time to code a solution at the moment, but it appears that we only time how long it takes to update and draw in order to figure out how much to slow down. I tried hacking it to time the whole frame, but this only led to stranger problems, and it got worse rather than better.

what I’d recommend you do is autocorrect for now:

float targetFrameRate;  // in your header (.h)  
int framecounter;  
  
targetFrameRate = 25;   // in setup()   (or 50, whatever you want)  
framecounter = 0;  
  
// in update() -- you probably don't want to do this every frame, so I've added a counter.  
// you'll likely want to play with the % 20 and the 0.5f to see if they help you get  
// better results. I wouldn't call ofSetFrameRate() very often, just often enough.  
framecounter++;  
if (framecounter % 20 == 0){  
	if (ofGetFrameRate() < 25){  
		// too slow, so let's speed up  
		targetFrameRate += 0.5f;  
		ofSetFrameRate(targetFrameRate);  
	} else {  
		// too fast, so let's slow down  
		targetFrameRate -= 0.5f;  
		ofSetFrameRate(targetFrameRate);  
	}  
}  
hope that helps - we’ll get a fix in for that as soon as we can –

take care,

hi zach :wink:

i’m happy to know it wasn’t some kind of error in my programming that was causing the differences …

if i can give a hand trying something or looking for a solution let me know, i’m happy also to collaborate on it :wink:


cool – the first thing would be to try the code I sent and see if it’s helpful. it’s a workaround, but I think that it could be helpful and I’m curious to know if it is.

I’ll keep trying to fix the main OF code for that – I think it’s not that hard, but I just couldn’t get it right away.

take care

hi zach

i tested your code .

it mainly does the job, but it’s always jumping between 24 and 25 …

alternating, maybe spending more time at 24 fps than at 25 .
it’s useful since it works, but it’s a bit unstable, as workarounds tend to be.

i’ll investigate whether i can find some other workaround that makes it more stable somehow …


it mainly does the job but it’s always jumping between 24 and 25 …

as you alter the values (the % 20 and the 0.5f), you can get it to jump less, be more responsive, etc. it’s a pretty decent workaround for the time being – you can reduce its jumpiness by moving in smaller steps or adjusting more often, etc. anyway, frame rate is fluid – even if it’s running at 30fps, it varies, even without this hack.

I am trying to alter the timing code, but still no luck at the moment… I see the problem but no solutions working yet…


hi eloi

I just went back to try to fix this, but I can’t seem to recreate the problem :frowning:

can you please post a simple example (src+data) that shows this problem ? I’ve tried adding all kinds of threaded things (video playing, grabbing, sounds, etc) and I can’t get my timing off. I’d like to see if we can fix it.


I hate to bring up buried posts, but has there been any development on this before I try the workaround posted?

I get even more drastic results. If I set the framerate to 0 (or don’t set it at all), my application runs at 800fps. If I set it to 60fps, ofGetFrameRate shows 45-50fps. If I up the framerate to 200, knowing that I can get 800fps, I get an output around 120. So I’m getting widely varying results, and it seems the higher I raise the fps, the more off it gets. I’m doing pretty much all image processing and tracking.

Zach, should I try your workaround or have you worked on this since your last post?

hi –

I am fairly sure that this is fixed in 0.05 via the patch on this forum.


can you please make a demo app that requires nothing complicated to run (ie, no image processing or camera) and that illustrates the problem? I am happy to take a look. I am unable to recreate this.

more questions, what platform? what compiler? does your graphics card have vertical sync enabled or disabled?