Recording video with OMX

Does anybody know of a way to record an ofPixels/ofTexture using OMX encoding? I've been looking at the ofxRPiCameraVideoGrabber addon for clues but am struggling with OMX.


@jvcleave did you have an example working that does this?

yeah - it’s off by default but should work

Thanks both for the quick response.

Yeah, I've got those examples running great. However, is it possible to record something from the screen, like an ofTexture, rather than the camera?
I've used ofxVideoRecorder on the Mac, which works really well, but ffmpeg is being a bit tricky on the Pi. I also wanted to utilise the hardware acceleration on the Raspberry Pi.

@david_haylock did you find any solution to this? I’d also like to find an ofxVideoRecorder equivalent for rPi. I’m looking at ofxMovieEncoder, but I’d rather not have to compile static libs for libav for raspberry pi.

@jmarsico I believe @jvcleave had OMX based video recording working …

Hi @jmarsico, I did, but the project moved away from the Raspberry Pi and OMX; it used an Intel NUC and captured low-res GIFs as well as temporary video buffers. Playing the images back was far more important than having a physical file. I need to look at OMX again though.


static libav libs are here as well

@jvcleave, thanks for these. Was the OMX video recording straight from the rPiCamera, or was it possible to use OMX to record an ofPixels?

You could also use gstreamer. A pipeline using appsrc allows you to push pixels and save them using, for example, omxh264enc, which will use OMX to encode to h264. I haven't tried this, so the syntax might be slightly different or have errors; I'm especially not sure about the name of the omx encoding element. You can check it in a terminal with gst-inspect-1.0 | grep omx, which will list all the available omx elements:

//.h
ofGstUtils gst;
GstElement* pixelssrc;

//.cpp setup
gst.setPipelineWithSink("appsrc name=pixelssrc ! video/x-raw,format=RGB,width=640,height=480,framerate=30/1 ! videoconvert ! omxh264enc ! h264parse ! qtmux ! filesink location=data/video.mp4 name=sink");
pixelssrc = gst.getElementByName("pixelssrc");

//each frame: copy the pixels and push them into the pipeline
//use g_malloc here since gst_buffer_new_wrapped frees the memory with g_free
unsigned char * rawpixels = (unsigned char*)g_malloc(pixels.size());
memcpy(rawpixels, pixels.getPixels(), pixels.size()); //or .getData() if you are using master
GstBuffer * buffer = gst_buffer_new_wrapped(rawpixels, pixels.size()); //the buffer takes ownership of the memory
GstFlowReturn flow_return = gst_app_src_push_buffer((GstAppSrc*)pixelssrc, buffer);
if (flow_return != GST_FLOW_OK) {
    ofLogError() << "error pushing video buffer: flow_return was " << flow_return;
}

then whenever you want to close the file:

gst_app_src_end_of_stream((GstAppSrc*)pixelssrc);
gst.close(); //just to be sure, but shouldn't be necessary

This is allocating memory and making a copy each frame, which might slow things down a bit, or a lot if you are using big image sizes. You can avoid the copy by using a pool of ofPixels and gst_buffer_new_wrapped_full, but it's a little bit more complex.

@jmarsico Mine is just recording from the camera.

I think if I were going to write a pixels-to-recording app today I would look at MMAL as opposed to OpenMax. I'm still trying to sort it out, but you can see some of the reasoning starting here.

Finally got around to trying this. I ended up using OpenMax as I like it better.

Works pretty well on an RPi2 - was recording 720p @ 30fps, and the app stayed around 20-40fps while recording. Could probably optimize a bit more with threading (at least for the file writing). glReadPixels is still the likely bottleneck.
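For anyone wanting to try the threading idea: moving the file writing off the render thread is basically a producer/consumer queue - the draw loop pushes encoded frames and a worker thread drains them to disk. A generic C++ sketch of that (not taken from the recorder code; the FrameWriter class and its names are made up for illustration):

```cpp
#include <condition_variable>
#include <cstddef>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Sketch: the render thread calls push(), a worker thread writes the
// frames, so disk I/O never blocks the draw loop.
class FrameWriter {
public:
    FrameWriter() : running_(true), written_(0) {
        worker_ = std::thread([this] { run(); });
    }
    ~FrameWriter() { // drains remaining frames, then stops the worker
        { std::lock_guard<std::mutex> l(m_); running_ = false; }
        cv_.notify_one();
        worker_.join();
    }
    void push(std::vector<unsigned char> frame) {
        { std::lock_guard<std::mutex> l(m_); q_.push(std::move(frame)); }
        cv_.notify_one();
    }
    size_t written() {
        std::lock_guard<std::mutex> l(m_);
        return written_;
    }
private:
    void run() {
        std::unique_lock<std::mutex> l(m_);
        while (running_ || !q_.empty()) {
            cv_.wait(l, [this] { return !q_.empty() || !running_; });
            while (!q_.empty()) {
                std::vector<unsigned char> f = std::move(q_.front());
                q_.pop();
                l.unlock();
                // write f to disk here (fwrite / ofstream), lock released
                l.lock();
                ++written_;
            }
        }
    }
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<std::vector<unsigned char>> q_;
    std::thread worker_;
    bool running_;
    size_t written_;
};
```

The glReadPixels stall is a separate problem though - on the Pi's GLES2 you can't hide it with pixel buffer objects the way you could on desktop GL, so the queue only helps with the disk side.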