Hi, what is the correct way to measure how long each frame takes to update and draw? I tried comparing ofGetElapsedTimef() at the start of update() with its value at the end of draw(), but the results didn’t seem to match reality.
I’m hoping to just get a simple ms readout to help test optimizations.
There’s a function that will return that: ofGetLastFrameTime(). If you have vsync enabled or a fixed framerate, the time OF spends waiting is also taken into account in that function.
If the app is hitting the target framerate, then the result of ofGetLastFrameTime() seems to just be the amount of time between frames rather than the time it’s actually busy doing stuff, right? E.g. at 60fps it always returns around 16ms regardless of how much of that time it actually needs to render.
Yeah, that’s what I meant. You can’t measure the time the GPU takes to draw using the CPU clock, since the GL calls happen asynchronously. The easiest way is to disable vsync so the application runs as fast as possible and measure the framerate; the other option is to use OpenGL timer queries.
Just for the record… there is a beautiful addon for time measurements:
Thanks for the tips! Following your last comment @arturo I found this article which seems to be what I’m after – will keep investigating…