GStreamer -- RTSP -- Axis camera


Since yesterday morning I've been trying to set up a GStreamer pipeline with OF in order to stream 1920x1080 H.264 video over RTSP from an AXIS P1346 camera.

I quickly found a pipeline that worked with gst-launch-0.10, but I couldn't transfer it to its OF equivalent…

gst-launch-0.10 -v rtspsrc location=rtsp://x.x.x.x:554/axis-media/media.amp latency=1000 debug=1 ! rtph264depay ! ffdec_h264 ! capsfilter caps="video/x-raw-yuv,width=1920,height=1080,depth=12,bpp=12" ! autovideosink  

Finally, after a lot of experimentation, I managed to create a working pipeline today, and I wanted to share it in case anyone else finds themselves in the same situation.

Here is what I have:

gst.setPipelineWithSink("rtspsrc location=rtsp://x.x.x.x:554/axis-media/media.amp debug=1 latency=0  ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! video/x-raw-rgb, width=(int)1920, height=(int)1080, bpp=24, depth=24 ! appsink name=sink", "sink", true);  

The most important issue I had is that the returned preroll buffer was in YUV format, so it filled only half of the buffer allocated with gst.allocate; I needed to convert it to video/x-raw-rgb, which is done by linking in ffmpegcolorspace.

Anyhow, I hope this saves someone else some time at some point…




Two questions:
Did you succeed in streaming both video and audio?
Does it work over the internet?

I plan to do an installation with an Axis camera connected to the internet and an openFrameworks application at a remote site. I want to be able to display live video and play live audio from the camera at the remote location.

Do you think it could work at a high resolution if I use a good internet connection (around 2 Mbps)?

Thank you for your help,