Network camera latency using GStreamer on Windows

I’m trying to display an RTSP stream from a network camera (Full HD, H.264 encoded).
I’m using ofxGStreamer on a Windows 7 machine with Code::Blocks.
The camera works fine, with minimal latency, using this pipeline:

gst-launch-1.0 -v rtspsrc location=rtsp://192.168.1.4:5504/channel=0;stream=0;user=system;pass=system; latency=0 ! decodebin ! videoconvert ! autovideosink

When I move the pipeline into OF, though, I get huge latency (> 5 s), which seems to depend on the PC running the code and is worse on a slow PC.
It’s as if the video is not dropping late frames but keeps them in a buffer. I’ve played with many settings in the pipeline (UDP vs. TCP, use-buffering on decodebin, etc.) with no result. Using setFrameByFrame(true) just makes things worse.
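
For example, the forced-TCP attempt was just the command-line pipeline above with rtspsrc’s protocols property set (shown here only to illustrate what I tried; it made no difference to the latency):

gst-launch-1.0 -v rtspsrc location=rtsp://192.168.1.4:5504/channel=0;stream=0;user=system;pass=system; latency=0 protocols=tcp ! decodebin ! videoconvert ! autovideosink
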
I’ve tried both this code:

ofGstVideoPlayer* gstPlayer = new ofGstVideoPlayer();
vidPlayer.setPlayer(ofPtr<ofGstVideoPlayer>(gstPlayer));
vidPlayer.loadMovie("rtsp://192.168.1.4:5504/channel=0;stream=0;user=system;pass=system;");
vidPlayer.play();

and this one:

gst.allocate(1920, 1080, 24);
gst.setPipeline("rtspsrc location=rtsp://192.168.1.4:5504/channel=0;stream=0;user=system;pass=system; latency=0 ! decodebin ! videoconvert " , 24, true, 1920, 1080);
gst.startPipeline();
gst.play();
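
For the second version I then pull the frames into a texture roughly like this (tex is just my own ofTexture, allocated elsewhere; getPixels() is assumed to return an ofPixels here, older OF versions may return a raw pointer instead, so treat this as a sketch):

void ofApp::update(){
    gst.update();                       // poll the pipeline for a new frame
    if(gst.isFrameNew()){
        tex.loadData(gst.getPixels());  // upload the latest decoded frame
    }
}

void ofApp::draw(){
    tex.draw(0, 0, 1920, 1080);
}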

Running the video acquisition in a separate thread improves performance, but not the latency.
Is there a way to force late frames to be dropped and reduce the latency?

not sure why the frames might be accumulating, unless the application can’t keep up. anyway you can always add a queue in the pipeline, which will create a different thread and drop late frames: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-queue2.html

the pipeline would be something like:

gst.setPipeline("rtspsrc location=rtsp://192.168.1.4:5504/channel=0;stream=0;user=system;pass=system; latency=0 ! decodebin ! videoconvert ! queue2 max-size-buffers=2" , 24, true, 1920, 1080);

where max-size-buffers is the max number of buffers in the queue before it starts to drop. also, depending on where you put the queue in the pipeline, performance might be better: for example, if you move it before videoconvert or even before decodebin, it’ll drop late frames before they are decoded, using less cpu. you can even use more than one queue in the pipeline, creating more threads, so the decoding and the colorspace conversion can happen in parallel, for example
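
for example, with two queues so the decoding and the colorspace conversion each get their own thread (untested, just to illustrate the idea):

gst.setPipeline("rtspsrc location=rtsp://192.168.1.4:5504/channel=0;stream=0;user=system;pass=system; latency=0 ! decodebin ! queue2 max-size-buffers=2 ! videoconvert ! queue2 max-size-buffers=2", 24, true, 1920, 1080);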

This works like a charm!

Thanks @arturo! :smile:

I’m having the exact same issue and the queue2 element does not work for me.

I’m using ofGstVideoUtils; when I pause the stream using ofGstVideoUtils::setPaused(true), CPU usage goes down but memory starts increasing indefinitely. I’ve tried using:

gst_app_sink_set_drop(GST_APP_SINK(gstSink), TRUE);
gst_app_sink_set_max_buffers(GST_APP_SINK(gstSink), 2);

Those calls go inside the ofGstUtils::startPipeline() function. I’ve also tried using the queue2 element in my pipeline, but no luck.
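
For context, outside of OF the equivalent setup would look roughly like this (standalone sketch, not the actual ofGstUtils code: "sink" is just the name I give the appsink here, while ofGstUtils creates and names its appsink internally):

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

int main(int argc, char* argv[]){
    gst_init(&argc, &argv);

    // same pipeline as before, with an explicitly named appsink so it can be looked up
    GstElement* pipeline = gst_parse_launch(
        "rtspsrc location=rtsp://192.168.1.4:5504/channel=0;stream=0;user=system;pass=system; latency=0 "
        "! decodebin ! videoconvert ! appsink name=sink", NULL);
    GstElement* gstSink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");

    // keep at most 2 buffers in the appsink and drop the oldest instead of blocking upstream
    gst_app_sink_set_drop(GST_APP_SINK(gstSink), TRUE);
    gst_app_sink_set_max_buffers(GST_APP_SINK(gstSink), 2);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    // ... run a main loop here, then shut down
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(gstSink);
    gst_object_unref(pipeline);
    return 0;
}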

Question: Can I PAUSE the GStreamer pipeline so that it stops handling input data, i.e. it drops all data until I resume?


PS: I’ve set the rtspsrc drop-on-latency property to TRUE and it seems to do the trick… As we can’t use queue elements at the beginning of the pipeline, this might be the proper way to solve it (as well as adding a queue after the rtspsrc). Still, I’d like to know whether the question above is feasible.
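
For reference, this is roughly how it looks when set from the pipeline string (it could equally be set on the rtspsrc with g_object_set; I’ve kept the queue2 from arturo’s suggestion at the end):

gst.setPipeline("rtspsrc location=rtsp://192.168.1.4:5504/channel=0;stream=0;user=system;pass=system; latency=0 drop-on-latency=true ! decodebin ! videoconvert ! queue2 max-size-buffers=2", 24, true, 1920, 1080);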

calling pause on ofGstUtils or ofGstVideoUtils will completely pause the whole pipeline

Hi, I believe it should, but it isn’t.

Somehow my rtspsrc keeps buffering: when I resume (ofGstVideoUtils::setPaused(false)) I can actually render all the buffered images, and after some time it catches up with the live stream again.