Why is ofSetFrameRate not accurate?

Why is it that when I use ofSetFrameRate(), it actually sets the framerate to ~2x the integer I give it?

For example: I call ofSetFrameRate(60) and the frame rate ends up around 120–125 fps. Is this how it’s supposed to work?
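Roughly what I’m testing with, in case it matters – just the project generator template plus these calls (the on-screen readout is only there so I can watch the number):

```cpp
// ofApp.cpp (project generator template, trimmed)
#include "ofApp.h"

void ofApp::setup(){
    ofSetFrameRate(60);   // ask for a 60 fps cap
}

void ofApp::update(){
}

void ofApp::draw(){
    // show the measured frame rate so I can see it while it runs
    ofDrawBitmapString(ofToString(ofGetFrameRate(), 1) + " fps", 20, 20);
}
```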

Thanks.

What version of openFrameworks are you using? What operating system are you using? What hardware are you using? Are you manually calling update() anywhere?

I’m using the latest oF, Windows 7 x64. Hardware? What kind of hardware?

Also, update() is only called in the pre-built files (the ones made by the project generator). I didn’t call it anywhere else.

Super – just to be clear, you are using the latest release, 0.8.4, not the master branch? And are you compiling with Code::Blocks / MinGW32 or Visual Studio 2012? Hardware – what kind of graphics card: integrated or dedicated, and which model?

Sorry, there are lots of variables :smile:

Yeah, 0.8.4. I am using Code::Blocks with MinGW 4.8.1 (I think? Whatever the latest MinGW is on Windows). My graphics card is dedicated – an AMD Radeon HD 6950.

I have this issue with almost every OF/Windows 7 project – it’s a system timer issue. Here’s the fix: emptyExample running at 50fps in release mode

Hello,

I tried the fix (the timeBeginPeriod and timeEndPeriod thing, right?) but it didn’t actually change anything. The FPS is exactly the same (double the number I pass in, unless it’s over 90).
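Roughly what I added, in case I got the fix wrong (Windows-only; winmm needs to be in the linker settings):

```cpp
// ofApp.cpp – the timer-resolution fix as I applied it
#include "ofApp.h"

#ifdef TARGET_WIN32
    #include <windows.h>
    #include <mmsystem.h>   // timeBeginPeriod / timeEndPeriod (link against winmm)
#endif

void ofApp::setup(){
    #ifdef TARGET_WIN32
        timeBeginPeriod(1);   // raise the system timer resolution to 1 ms
    #endif
    ofSetFrameRate(60);
}

void ofApp::exit(){
    #ifdef TARGET_WIN32
        timeEndPeriod(1);     // restore the default timer resolution
    #endif
}
```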

Hmmm, maybe it is a different issue – have you tried with VS2012? Out of interest, what happens when you don’t set the frame rate but have ofSetVerticalSync(true) on?

Setting v-sync seems to do literally nothing (I get about 4,000 fps with just v-sync on and the frame rate not capped).
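For reference, this is more or less the whole test:

```cpp
void ofApp::setup(){
    ofSetVerticalSync(true);   // no ofSetFrameRate() call at all
}

void ofApp::draw(){
    // with working v-sync this should sit near the monitor refresh (~60 fps);
    // here it reports around 4000 fps
    ofSetWindowTitle(ofToString(ofGetFrameRate(), 1) + " fps");
}
```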

Side note: the cap works fine on my laptop. The difference between my desktop and laptop is basically everything (laptop: Intel CPU, NVIDIA graphics; desktop: AMD CPU, AMD graphics). It’s the same code, but the frame rate works fine there.

EDIT: another note – it does the same thing in both debug and release mode.