FPS in Win7 vs Ubuntu 9.10

I just started a small project to test the baseline frame rate in Ubuntu vs Windows, and I must say I am a little surprised at the gap. The only code I have added to the project just displays the FPS in the testApp::draw() method:

float value = ofGetFrameRate();
char sValue[50];
sprintf(sValue, "%f", value);
ofDrawBitmapString(sValue, 50, 50);

In Windows 7 I get an average framerate of 75
In Ubuntu 9.10 I get an average framerate of 650

That is a huge difference. Is that normal?

sounds like your Windows 7 graphics drivers are limiting the framerate to 75. This is commonly called v-sync: the graphics card waits for the display to refresh before redrawing anything.

yes, as tim s says, the default behavior for vertical sync varies from platform to platform, so you can get very different frame rates by default, since openFrameworks tries to run as fast as possible. you can try calling:

ofSetVerticalSync(false);

in testApp's setup() on your pc, and see if that helps. some graphics cards might override the application; check your graphics card's preferences / control panel and look for a vertical-sync feature (usually called sync, vertical blanking, or vsync), which might say "on / off / application". set it to application (which means the application can decide), or to on or off as you like.

the other option, if you’d like to make the applications more “consistent” is to say

ofSetFrameRate(60);

which will cap your application at 60 fps, for example.

hope that helps!

take care,
zach

That was indeed it. Now it says the frame rate is around 2300

I'm still getting this problem on my machine. I did the same thing: added the FPS display to an empty example.

I tried adding ofSetVerticalSync(false); and also manually setting it to “force off” in the NVIDIA control panel. It reports about 64 fps.

I’m running Windows 7 on a 64-bit Intel Core i7 processor with 2x SLI’d NVIDIA 460s.

Any ideas?

It looks like my problem is actually ofGetFrameRate(). If I write my own FPS timer using ofGetElapsedTimef() I get the proper framerate, while ofGetFrameRate() is kind of "locked" to vsync values. I'm going to look into this more…


Good thing I found this one; I was cracking my head over it, thinking my drivers weren't working like they should. Any idea if this is fixable? It's not a huge deal, but if you need more than 100 fps (high-speed video tracking) and you see the fps counter stuck at 60-64 in an empty example, it's a pita :)