Hello! I’m using GStreamer on Windows to try to play back an RTSP stream (WITH AUDIO). So far everything works, but the latency is around 2–3 seconds, which isn’t acceptable for the application I’m trying to build. I need to get it under 1 second if possible.
I noticed this thread https://forum.openframeworks.cc/t/network-camera-latency-using-gstreamer-in-windows/17461 about latency, but wasn’t able to figure out how to actually get to any of those settings.
Any suggestions on where to start tweaking the pipeline settings? I’m using the basic GStreamer example, i.e.:
ofPtr<ofGstVideoPlayer> gPlayer = ofPtr<ofGstVideoPlayer>(new ofGstVideoPlayer); player.setPlayer(gPlayer); player.loadAsync(MY_RTSP_URL); player.play();
So, it turns out I wasn’t understanding how to manually set a pipeline with gstreamer. This example from @sphaero shows how: https://github.com/sphaero/GentleValleyStream/tree/master/ofApps/CustomGstream/src
The basic idea is that you need to create an ofGstVideoUtils object (NOT ofGstVideoPlayer), and then create the pipeline manually, something like this:
gst.setPipeline("rtspsrc location=\"rtsp://rtspurl channel=0;stream=0;user=system;pass=system;\" latency=0 ! decodebin ! videoconvert ! queue2 max-size-buffers=2", OF_PIXELS_RGB, true, 320, 240); gst.startPipeline(); gst.play();
GStreamer is immensely complicated and weird, and I’m still trying to grok it. The above pipeline has no audio, but it does bring the latency down nicely to 0. Using the handy command-line way of starting pipelines to test, I got what I wanted like this:
gst-launch-1.0 rtspsrc location=rtsp://rtspurl latency=0 name=src src. ! decodebin ! videoconvert ! autovideosink sync=true src. ! decodebin ! audioconvert ! autoaudiosink sync=true
BUT I can’t figure out how to get a corresponding pipeline into the app: the auto sinks open a new window, when what I really want is everything running within OF. I will update this post when I figure it out (or if anyone has tips on getting audio working in OF, let me know!).