how to set the right video frame rate

Hi guys,

this is my first post on the openFrameworks forum. I think openFrameworks is a great project: it lets you build advanced applications with a few (relatively) simple lines of code! Well, now on to my problem.

I am building a simple application that has to record data from several kinds of source devices and synchronize the information captured from each of them; one of the devices is a webcam, the others write to serial ports. I’m using the videoGrabber to grab video frames and display them in my application window, and memoryMovieMaker to save the frames to a file. Since I need to synchronize with the other devices, I also write a video log file containing the grabbed-frame counter value and the corresponding timestamp (in milliseconds).
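Roughly, the grab-and-log loop looks like this (a simplified sketch: the ofVideoGrabber and ofGetElapsedTimeMillis() calls are standard openFrameworks, while recorder.addFrame() just stands in for the memoryMovieMaker call described above, and videoLog/frameCounter are an std::ofstream and an int member of my app):

  // sketch: grab a frame, save it to the .mov, and log counter + timestamp
  void testApp::update(){
      grabber.update();                                     // ofVideoGrabber
      if(grabber.isFrameNew()){
          unsigned long long now = ofGetElapsedTimeMillis();
          recorder.addFrame(grabber.getPixels());           // frame goes into the movie
          videoLog << frameCounter << " " << now << "\n";   // counter + timestamp (ms)
          frameCounter++;
      }
  }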

But when I look at the result, I get, for instance, a 36-second video file playing at 30 fps, while my video log file says the recording lasted much longer (more than 50 seconds), so the real frame rate must have been much lower. Can I set the correct frame rate on the video file? Is there a better way to do this?

Please help! :expressionless: I hope I’ve been clear; thanks in advance.

Hi.

I discovered that memoryMovieMaker (and qtVideoSaver, too) uses the AddMediaSample method of the QuickTime libraries. This method lets me set the duration of each frame, expressed in the default time scale of 600 units per second. So, whenever my app gets a new frame from the webcam (videoGrabber.isFrameNew()), it calculates the duration of the last frame and sets it with the addFrame(frame, frameDuration) method of the memoryMovieMaker class. Still, when I record a long .mov file and then play it in QuickTime, my video log file says the recording lasted for some minutes, but the video file lasts several seconds longer than that.
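In code the idea is something like this (just a sketch; addFrame(pixels, duration) follows the description above and may not match the addon’s exact signature):

  // measure the wall-clock gap between grabbed frames and convert it
  // to QuickTime's default time scale of 600 units per second
  if(grabber.isFrameNew()){
      unsigned long long now = ofGetElapsedTimeMillis();
      if(previousFrameTime > 0){
          float durationMs    = now - previousFrameTime;        // milliseconds
          float durationUnits = durationMs * 600.0f / 1000.0f;  // qt units
          recorder.addFrame(grabber.getPixels(), durationUnits);
      }
      previousFrameTime = now;
  }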

Shouldn’t this be the right solution? Any ideas? Thanks,

This could be a 29.97 vs 30 fps issue. Most webcams run at exactly 30 fps, but encoders like qt generally default to 29.97. Try dividing the longer time by 1.001, and if you get the shorter time, then that’s the issue.
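(To make that check concrete, with made-up numbers: if the log said the recording lasted 30.00 seconds but the file plays back as 30.03 seconds, then

  30.03 s / 1.001 = 30.00 s

and the 29.97 vs 30 mismatch would be the culprit.)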

Hi Kyle,

thanks for your reply. I set the frameDuration parameter as

  
(thisFrameTime - previousFrameTime) * 0.6f  

where the duration of the (previous) frame is expressed in milliseconds, and 0.6 is the conversion factor from milliseconds (1000 units per second) to the default Qt time scale (600 units per second). What am I missing? :roll: Thank you in advance.

Hmm, I’m not sure that’s right.

The first thing I would do is check the file that is output, and see how many frames it has (check in VLC or another player that allows you to see this kind of information).

Once you know how many frames it has, compare that to the length of time you expect it to be… and that should give you your framerate. If the framerate doesn’t quite make sense, or using that framerate doesn’t give you a file of the right length in seconds… post back here and it will be easier to figure out what went wrong.

Hi again,

thanks for your support. I recorded a short sample; both my ASCII video log and QuickTime Player say that 712 frames were grabbed, and my ASCII video log says the sample lasted 25.73 seconds, so a single frame lasts on average about 0.036 seconds (roughly 27.7 fps). Now, in order to set the correct frame duration, I have to convert that value from milliseconds to QuickTime time-scale units (600 per second), so my previous method could be wrong. I think that 36 milliseconds is equal to 36/1.6667 time-scale units, where 1.6667 is the duration of a single unit (1000/600). Do you think that 36/1.6667 is the correct frame duration parameter?
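Written out with those numbers (the average frame duration rounded to 36 ms), that would be:

  36 ms / 1.6667 ms-per-unit = 36 * 0.6 = 21.6 time-scale units per frame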

Yes, I think that is correct. I would have done it differently: (600 qt units / second) * (25.73 seconds / 712 frames) = 21.68 qt units / frame

Unfortunately it looks like your camera isn’t running at a multiple of qt’s 600 units/second, so you’re going to have to make most frames 22 units long and some frames 21 units long.

Yep, it could be due to some approximation error. So do you think I should change the time scale somehow (for instance to 1000 time units) or buy a camera with a constant recording rate?
:slight_smile:
I know that in the QuickTime libraries there’s a function that can set the video time scale, called ‘SetMediaTimeScale’, but I really don’t know where to put it in my code. And finally,

(600 qt units / second) * (25.73 seconds / 712 frames) = 21.68 qt units / frame

…can you explain it, please? Thank you, Kyle.

Alex
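For reference, the raw QuickTime call itself is a one-liner. The snag is that the Media handle it needs is created inside memoryMovieMaker, so you would have to expose it somehow; theMedia below is that hypothetical handle, and the call has to happen after the media is created but before any samples are added.

  // hypothetical: theMedia is the Media handle memoryMovieMaker creates internally
  SetMediaTimeScale(theMedia, 1000);   // 1000 units per second, so durations in ms map 1:1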

If you have spare change to get a nicer camera, that’s probably going to help. Unless, of course, it’s your computer dropping frames rather than the camera running at a non-standard fps.

If you need an approximation for short bursts of video, alternating frames between 22 and 21 units will give you something that is approximately the right length. Otherwise you can just keep a running counter of how many frames you have recorded, how much time has passed, and what the total length of the recorded video is… and use that to determine the duration of the current frame, as in the sketch below.
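Something like this, as a rough sketch (the variable names are just illustrative, recordStartMs is the elapsed time at which recording started, and addFrame() follows what you described for memoryMovieMaker):

  // give each new frame whatever duration brings the movie's total length
  // back in line with the wall clock, so rounding error never accumulates
  const int TIME_SCALE   = 600;   // qt units per second
  long      unitsWritten = 0;     // total duration already added to the movie

  void addGrabbedFrame(){
      long elapsedMs   = ofGetElapsedTimeMillis() - recordStartMs;
      long targetUnits = elapsedMs * TIME_SCALE / 1000;   // where the movie should be by now
      long duration    = targetUnits - unitsWritten;      // comes out as 22, 21, 22, 22, ...
      if(duration > 0){
          recorder.addFrame(grabber.getPixels(), duration);
          unitsWritten += duration;
      }
  }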

The only reason I wrote the math out for calculating the fps is that I thought it was clearer than saying “36/1.6667”… but if it’s not clearer to you, don’t worry about it :slight_smile:

Fine. I will take your suggestions into account. Thanks again for your help.

Alex