How to OF + WebRTC

Anybody have any idea where to get started with implementing a WebRTC video client in OF (for MacOS)?

I’m building an OF application that runs on a mac mini that needs to do some low latency video chatting via WebRTC.

There seems to be a lot of support for Android and iOS; see .

Anyone have experience doing this? This post: describes an integrated WebRTC approach by @globacore, but includes no description of how it was done…

this: can do video & audio in sync over RTP and do NAT traversal using STUN, which are the protocols WebRTC uses internally. It won't be directly compatible with WebRTC, though, so if you need to communicate with another WebRTC client it won't work; but if you just need to send video/audio between 2 OF apps it'll do.
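To illustrate the "raw RTP between two OF apps" option, here is a minimal gst-launch sketch of the kind of pipeline ofxGstreamer would run under the hood. The host address and port are placeholders, and avfvideosrc is assumed as the macOS camera source; this is plain RTP with no STUN/signaling, so it only works between two machines you control:

```shell
# Sender (machine A): capture, encode as VP8, and send RTP to machine B.
# 192.168.1.20 and port 5000 are placeholder values for this sketch.
gst-launch-1.0 avfvideosrc ! videoconvert ! vp8enc deadline=1 \
    ! rtpvp8pay ! udpsink host=192.168.1.20 port=5000

# Receiver (machine B): depayload, decode, and display the incoming stream.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=VP8,payload=96" \
    ! rtpvp8depay ! vp8dec ! videoconvert ! autovideosink
```

The same pipeline strings can be handed to ofGstUtils in an OF app instead of gst-launch-1.0.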

Interesting, thanks Arturo! We're actually already using ofxGstreamer to play back RTSP streams…

The requirement for this one is to actually integrate with a server that's managing/hosting/archiving (and potentially providing the signaling needed to set up a WebRTC connection). I am definitely considering a "roll your own" approach using GStreamer to decode WebRTC (if it's even possible), but this seems like it might be a massive rabbit hole.

Just a heads-up that GStreamer 1.14+ now has the webrtc component. I did some work on ofxGStreamer to get it compiling on macOS via the brew-tap-installed GStreamer, or via manual builds of the GStreamer components.
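A quick way to confirm your GStreamer build actually shipped the webrtc component is to inspect the element directly (the element name webrtcbin matches the pipeline below; exact brew formula names may vary by tap, so treat them as assumptions):

```shell
# Prints the element's pads, properties, and signals if the plugin is
# present; errors out with "No such element" otherwise.
gst-inspect-1.0 webrtcbin

# On macOS, webrtcbin lives in the "bad" plugin set, so that package
# needs to be installed alongside core GStreamer (names may differ
# depending on the tap/formula you use):
#   brew install gstreamer gst-plugins-bad
```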


It should be possible with GStreamer 1.14+ to run a webrtcbin pipeline via ofGstUtils. When compiling the samples, the final pipeline looks like:

# via

webrtcbin bundle-policy=max-bundle name=sendrecv stun-server=stun://
  videotestsrc is-live=true pattern=ball ! videoconvert ! queue ! vp8enc deadline=1 ! rtpvp8pay ! queue ! application/x-rtp,media=video,encoding-name=VP8,payload=96 ! sendrecv.
  audiotestsrc is-live=true wave=red-noise ! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay ! queue ! application/x-rtp,media=audio,encoding-name=OPUS,payload=97 ! sendrecv.
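Note that webrtcbin can't be driven from gst-launch alone, since the application has to handle SDP offer/answer and ICE signaling. What you can do from the command line is sanity-check the encode branches of the pipeline above before wiring up webrtcbin in code; this sketch just proves vp8enc/opusenc and the RTP payloaders are usable in your install:

```shell
# Video branch: generate 100 test frames, encode to VP8, payload as RTP,
# and discard the output. Succeeds only if all elements are available.
gst-launch-1.0 videotestsrc num-buffers=100 pattern=ball ! videoconvert \
    ! vp8enc deadline=1 ! rtpvp8pay ! fakesink

# Audio branch: same idea with red-noise test audio through Opus.
gst-launch-1.0 audiotestsrc num-buffers=100 wave=red-noise ! audioconvert \
    ! audioresample ! opusenc ! rtpopuspay ! fakesink
```

If both commands run to EOS cleanly, the remaining work is the signaling side of webrtcbin, which has to be done from application code.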