ofGetElapsedTimeMillis() and speed test

I am doing some measurements to see whether certain things I do make performance better or worse. I am using ofGetElapsedTimeMillis() to achieve this:

unsigned long long begin = ofGetElapsedTimeMillis();  
//do stuff here  
unsigned long long total = ofGetElapsedTimeMillis() - begin;  
printf("it took %llu ms.\n", total);  

How accurate is this? Is this the best way to go?
Also how do I get frames per second?



you can get the framerate with ofGetFrameRate();

you might get more accurate results if you check ofGetElapsedTimeMillis() not within one frame, but over, say, 100 frames. of course the time will then include other operations as well (drawing etc.), but if you just want to compare different ways of doing things it should work…

void update() {  
   // declare this as static (or a class variable) so the value persists across frames  
   static unsigned long long begin = 0;  
   if(ofGetFrameNum() % 100 == 0) {  
      unsigned long long total = ofGetElapsedTimeMillis() - begin;  
      printf("it took %llu ms. to do 100 frames\n", total);  
      begin = ofGetElapsedTimeMillis();  
   }  
   //do stuff here  
}  

ofGetElapsedTimeMillis isn’t accurate enough to do useful profiling.

are you on Windows, or OSX? if OSX I can send you my FTime and FProfiler classes, which let you do things like put PROFILE_THIS_FUNCTION() at the top of the function, or
… code …

and have the results spat out to the console as a nice nested table…

if you’re on windows, try QueryPerformanceCounter: http://www.decompile.com/cpp/faq/windows-timer-api.htm

I’m on OSX. I would be interested.