Custom GStreamer pipelines

Hi all,

Any GStreamer experts around here? I have been modding the ofGstVideoPlayer class to make it use a custom pipeline instead of playbin. It works, however I don't really like how it performs: running the same pipeline directly in GStreamer, the stream is much more stable and doesn't smear (is that the right word for H264 artefacts?). I'm not sure how OF implemented GStreamer, but I guess it's just grabbing pixels from the appsink element?

Basically what I did is change the createPipeline method to use gst_parse_launch and then link in the appsink manually at the end:

bool ofCustomGstVideoPlayer::createPipeline(string name){
    
    GstElement * gstPipeline = gst_parse_launch("udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! autovideoconvert name=decode" , NULL);
    //GstElement * gstPipeline = gst_element_factory_make("playbin","player");
    //g_object_set(G_OBJECT(gstPipeline), "port", name.c_str(), (void*)NULL);

    // create the oF appsink for video, synced to the clock
    GstElement * gstSink = gst_element_factory_make("appsink", "app_sink");

    gst_base_sink_set_sync(GST_BASE_SINK(gstSink), true);
    gst_app_sink_set_max_buffers(GST_APP_SINK(gstSink), 8);
    gst_app_sink_set_drop (GST_APP_SINK(gstSink),true);
    gst_base_sink_set_max_lateness  (GST_BASE_SINK(gstSink), -1);

    string mime="video/x-raw";

    GstCaps *caps;
    if(internalPixelFormat==OF_PIXELS_NATIVE){
        //caps = gst_caps_new_any();
        caps = gst_caps_from_string((mime + ",format={RGBA,BGRA,RGB,BGR,RGB16,GRAY8,YV12,I420,NV12,NV21,YUY2}").c_str());
        /*
        GstCapsFeatures *features = gst_caps_features_new (GST_CAPS_FEATURE_META_GST_VIDEO_GL_TEXTURE_UPLOAD_META, NULL);
        gst_caps_set_features (caps, 0, features);*/
    }else{
        string format = ofGstVideoUtils::getGstFormatName(internalPixelFormat);
        caps = gst_caps_new_simple(mime.c_str(),
                                   "format", G_TYPE_STRING, format.c_str(),
                                   NULL);
        gchar * capsStr = gst_caps_to_string(caps);
        ofLogWarning() << "caps: " << capsStr;
        g_free(capsStr); // gst_caps_to_string returns a newly allocated string
    }
   
    gst_app_sink_set_caps(GST_APP_SINK(gstSink), caps);
    gst_caps_unref(caps);

    //g_object_set (G_OBJECT(gstPipeline),"video-sink",gstSink,(void*)NULL);
    // add the appsink to the pipeline and link it to the last element
    // of the parsed pipeline (named "decode" above)
    gst_bin_add(GST_BIN(gstPipeline), gstSink);
    GstElement* decbin = gst_bin_get_by_name(GST_BIN(gstPipeline),"decode");
    gst_element_link (decbin, gstSink);
    gst_object_unref(decbin);

    return videoUtils.setPipelineWithSink(gstPipeline,gstSink,bIsStream);
}

Oh btw, I have an example app here:

you don’t need to modify the class itself to be able to use custom pipelines. you can just use ofGstVideoUtils::setCustomPipeline(...) to pass a pipeline string.

also if you are using specifically udp rtp sources you might want to take a look at https://github.com/arturoc/ofxGstRTP since it already implements lots of features and avoids the most common problems, like asking for a new key frame when the stream begins…
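
for reference, asking for a key frame boils down to sending an upstream force-key-unit event, roughly like this (untested sketch; 'depay' would stand for the rtph264depay element, and with a plain udpsrc pipeline there's no RTCP channel, you need rtpbin for the request to actually reach the sender):

    #include <gst/video/video.h>

    // ask the sender for a new key frame as soon as possible,
    // resending the SPS/PPS headers too
    GstEvent * evt = gst_video_event_new_upstream_force_key_unit(
            GST_CLOCK_TIME_NONE, TRUE, 0);
    gst_element_send_event(depay, evt);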

I’m not aware of this method nor could I find it in OF. Do you perhaps mean: ofGstUtils::setPipelineWithSink(string pipeline, string sinkname, bool isStream)?

I was thinking of using a glimagesink and its client-draw callback, which provides the GL texture. This would be way more efficient as it avoids downloading the frame from the GPU and uploading it back again (in case of GPU decoding). However I haven't had any success yet, since I need to share the GL context somehow.

Thanks for the pointer to ofxGstRTP; some of its features are similar to ZOCP, a project I'm working on.

Notice setPipeline is a function from the ofGstVideoUtils class, not ofGstUtils.
ofGstVideoPlayer uses an ofGstVideoUtils object internally; you can access it using ofGstVideoPlayer::getGstVideoUtils().

yes, also setPipeline automatically creates an appsink, so if you want to use a glimagesink you have to use one of the setPipelineWithSink methods in ofGstUtils.
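
something like this (untested sketch; the pipeline string and the "glsink" name are just examples):

    ofGstUtils gst;
    // name the sink in the pipeline string and pass that name as sinkname
    gst.setPipelineWithSink("videotestsrc ! glimagesink name=glsink",
                            "glsink", true);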

also glimagesink will always create its own window as far as i understand. in linux with gstreamer 1.4.3 you can pass it another context using:

    // release the OF context so gst can wrap it
    glXMakeCurrent (ofGetX11Display(), None, 0);
    glDisplay = (GstGLDisplay *)gst_gl_display_x11_new_with_display(ofGetX11Display());
    glContext = gst_gl_context_new_wrapped (glDisplay, (guintptr) ofGetGLXContext(),
                  GST_GL_PLATFORM_GLX, GST_GL_API_OPENGL);

    // hand the wrapped context to glimagesink, then make the OF context current again
    g_object_set (G_OBJECT (glsink), "other-context", glContext, NULL);
    glXMakeCurrent (ofGetX11Display(), ofGetX11Window(), ofGetGLXContext());

but that changes in gstreamer 1.4.5. in any case, using glimagesink you'll always get an extra window.

i've been testing uploading to gl directly from gstreamer for a while and recently managed to create an OF pipeline that does it, but i still cannot get accelerated decoding through VAAPI and then uploading to opengl. check this other thread: VideoPlayback with Gstreamer and vaapi

Ah thanks for the pointers. I assume setPipeline will automatically append the final sink. I'm trying now with an RTP H264 pipeline, like this:

ofGstVideoPlayer* player = new ofGstVideoPlayer();
fingerMovie.setPlayer(ofPtr<ofGstVideoPlayer>(player));
ofGstVideoUtils* vutil = player->getGstVideoUtils();
vutil->setPipeline("udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! decodebin ! autovideoconvert", OF_PIXELS_NATIVE, true);
fingerMovie.play();

But it doesn’t want to link the final sink. GST_DEBUG says:

0:00:00.024402489  7106      0x1536c00 INFO            GST_PIPELINE ./grammar.y:570:gst_parse_perform_link: linking decodebin0:(any) to ofappsink:(any) (0/0) with caps "(NULL)"
0:00:00.024422546  7106      0x1536c00 INFO        GST_ELEMENT_PADS gstutils.c:1545:gst_element_link_pads_full: trying to link element decodebin0:(any) to element ofappsink:(any)
0:00:00.024452016  7106      0x1536c00 INFO        GST_ELEMENT_PADS gstelement.c:892:gst_element_get_static_pad: no such pad 'src_%u' in element "decodebin0"
0:00:00.024483096  7106      0x1536c00 INFO        GST_ELEMENT_PADS gstutils.c:1125:gst_element_get_compatible_pad:<decodebin0> Could not find a compatible pad to link to ofappsink:sink
0:00:00.024499113  7106      0x1536c00 INFO                 default gstutils.c:1891:gst_element_link_pads_filtered: Could not link pads: decodebin0:(null) - ofappsink:(null)
[notice ] ofGstUtils: setPipelineWithSink(): gstreamer pipeline: udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! decodebin ! appsink name=ofappsink caps="video/x-raw,format={RGBA,BGRA,RGB,BGR,RGB16,GRAY8,YV12,I420,NV12,NV21,YUY2}"
0:00:00.024569657  7106      0x1536c00 INFO           GST_PARENTAGE gstbin.c:4148:gst_bin_get_by_name: [pipeline0]: looking up child element ofappsink

you shouldn't use an ofGstVideoPlayer but an ofGstVideoUtils directly. ofGstVideoPlayer tries to do allocation and setup specific to the player pipeline. then use an external texture and upload the pixels to it on each new frame. also in the error message you posted there doesn't seem to be any videoconvert, so that might be the cause of the error
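
e.g. the same pipeline with a videoconvert at the end (untested, but that's the idea):

    ofGstVideoUtils gst;
    gst.setPipeline("udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 "
                    "! rtph264depay ! h264parse ! decodebin ! videoconvert",
                    OF_PIXELS_NATIVE, true);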

I just noticed that; I've completely removed the ofGstVideoPlayer and just use ofGstVideoUtils now:

gst.setPipeline("videotestsrc", OF_PIXELS_NATIVE, true);
gst.play();

This works, however on update I get:

[warning] ofGstVideoUtils: update(): ofGstVideoUtils not loaded

I don’t see where the pipeline gets set to playing?

yeah you need to call startPipeline after setPipeline. it’s kind of confusing but it gives more control in cases where you need to do some extra allocation or setup between creating the pipeline and setting it to the paused state
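
i.e. the full sequence is something like:

    gst.setPipeline("videotestsrc", OF_PIXELS_NATIVE, true); // create the pipeline
    gst.startPipeline(); // set it to PAUSED, allocation happens here
    gst.play();          // set it to PLAYING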

Yep, it's working now. Had to change the pixel format to OF_PIXELS_RGB.

Now doing this with vaapi would be great. Regarding the GL context thing: I was under the impression that glimagesink's client-draw calls could be used in a GL-context-sharing setup. In 1.4 you could pass the 'other-context' parameter, which gave me the impression you could give it an existing GL context (like the one from OF) so that when it draws, it draws in the OF window. The client-draw call receives the GL texture id, width and height. In GStreamer 1.5 this changed: the client-draw call now also receives a GL context. I don't know how one would deal with that yet.
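
For reference, in 1.4 I'd expect the callback to look roughly like this (untested sketch; wrapping the texture in an ofTexture is just an idea):

    // 1.4-style client-draw callback: receives the GL texture id of the
    // decoded frame plus its size (in 1.5 it also gets a GstGLContext)
    static gboolean on_client_draw(GstElement * glsink, guint texture,
            guint width, guint height, gpointer user_data){
        // with a shared context the texture could be wrapped in an
        // ofTexture via setUseExternalTextureID() and drawn from OF
        return TRUE; // returning TRUE should skip glimagesink's own drawing
    }

    // after creating the pipeline:
    g_signal_connect(glsink, "client-draw",
            G_CALLBACK(on_client_draw), NULL);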

from what i've tested, the other-context parameter only sets a gl context to share resources with, not another window. it'll still open a new window, but textures and other resources are shared between the context that gst opens and the one you pass as other-context

Hey Arturo,

I noticed you committed these new pixel formats (e.g. I420), but how do you use them? I understand you need a programmable renderer; however setting this up has changed, it's now:

int main( ){
    ofGLWindowSettings settings;
    settings.width = 1280;
    settings.height = 720;
    settings.setGLVersion(4, 4);
    ofCreateWindow(settings);
    ofRunApp( new ofApp());
}

I assume this gives me a programmable renderer (ofIsGLProgrammableRenderer() confirms it).

So I create a Gst pipeline without a videoconvert:

gst.setPipeline("udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264", OF_PIXELS_NATIVE, true);
gst.startPipeline();
gst.play();

But how to draw it now? I was thinking:

gst.getTexture()->draw(20,20);

But that returns NULL, and copying pixel data also fails.
I guess OF_USE_GST_GL should be set during compilation of OF? But this seems to conflict with GLES:

In file included from /usr/include/gstreamer-1.0/gst/gl/gstglapi.h:62:0,
                 from /usr/include/gstreamer-1.0/gst/gl/gstgl_fwd.h:26,
                 from /usr/include/gstreamer-1.0/gst/gl/gl.h:29,
                 from ../../../libs/openFrameworks/video/ofGstUtils.h:22,
                 from ../../../libs/openFrameworks/video/ofGstVideoPlayer.h:3,
                 from ../../../libs/openFrameworks/video/ofVideoPlayer.h:9,
                 from ../../../libs/openFrameworks/video/ofVideoPlayer.cpp:1:
/usr/include/GLES2/gl2ext.h:73:15: error: expected ‘)’ before ‘*’ token
 typedef void (GL_APIENTRYP PFNGLBLENDBARRIERKHRPROC) (void);
               ^
/usr/include/GLES2/gl2ext.h:75:1: error: ‘GL_APICALL’ does not name a type
 GL_APICALL void GL_APIENTRY glBlendBarrierKHR (void);

OK, I got around the compiler errors by changing /usr/include/gstreamer-1.0/gst/gl/gstglconfig.h to not define GLES2, PLATFORM_EGL and GLEGLIMAGEOES, and by uncommenting #include <gst/gl/egl/gstgldisplay_egl.h> in ofGstUtils.cpp.

The project's config.make also needs the flags from pkg-config --libs --cflags gstreamer-gl-1.0.
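
Something like this in config.make (PROJECT_CFLAGS / PROJECT_LDFLAGS are the standard variables from the OF project template):

    # pull in the gstreamer-gl compile and link flags
    PROJECT_CFLAGS += $(shell pkg-config --cflags gstreamer-gl-1.0)
    PROJECT_LDFLAGS += $(shell pkg-config --libs gstreamer-gl-1.0)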

It compiles and runs, but no image yet. Using gst 1.4.3 (Ubuntu) btw.

have you uncommented //#define OF_USE_GST_GL in ofGstUtils.h? that's really experimental and won't work unless you modify the headers in the system, which seem to be wrong

Hi, trying your code from github, I'm trying to stream from another OF app using a pipeline with a udpsink.

I get : ofGstVideoUtils: preroll_cb(): received a preroll without allocation

and I just can't figure it out.

Could you point me in the right direction?

E

oops, forgot to add width and height to setPipeline… never mind