Live video streaming to the web from OF

Hi OF!

For the last couple of weeks I’ve been working on a solution to get a live video stream
from an openFrameworks application to the web. The idea was to grab a bunch of
pixels and send them to a Flash Media Server (FMS). I’ve used rtmpd as a test server
and libraries like ffmpeg/libav/gstreamer for encoding and muxing the video.

Streaming video to the web works quite well. I’ve done a 48-hour test which
streamed my webcam to the web (no memory leaks, low CPU usage, just 8 MB of memory).

In short I do this: I grab pixels from the application, convert them to YUV420 (using
libswscale, part of libav), encode them with H264 and mux this into FLV. The generated
FLV is then sent over what RTMPD calls “inboundflv”, which is just a TCP stream of FLV data.
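libswscale handles the RGB-to-YUV conversion in that pipeline, but as a rough illustration of what happens per pixel, here is a minimal full-range BT.601 sketch. This is not the actual code; the exact coefficients libswscale applies depend on the configured colorspace:

```cpp
#include <cstdint>
#include <algorithm>

// Full-range BT.601 RGB -> YUV for a single pixel (illustrative only).
static void rgbToYuv(uint8_t r, uint8_t g, uint8_t b,
                     uint8_t& y, uint8_t& u, uint8_t& v) {
    auto clamp = [](double x) {
        return static_cast<uint8_t>(std::min(255.0, std::max(0.0, x + 0.5)));
    };
    y = clamp( 0.299 * r + 0.587 * g + 0.114 * b);          // luma
    u = clamp(-0.169 * r - 0.331 * g + 0.500 * b + 128.0);  // blue chroma
    v = clamp( 0.500 * r - 0.419 * g - 0.081 * b + 128.0);  // red chroma
}
```

In YUV420 the U and V planes are additionally subsampled 2×2, so each 2×2 block of pixels keeps four Y values but only one averaged U and one averaged V.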

The next step was adding sound. I had to do way more tests and read up on audio,
as I hadn’t done anything with digital sound until now. Damian gave me a quick intro to
the world of digital audio, and it seems to be much like working with pixels :wink:

Somehow I couldn’t get the correct settings for the sound which is passed into audioIn().
I was asking for 2 channels at 44100 Hz, but when I encoded these samples the sound
had a very low pitch. Finally I got it to work by asking for only 1 channel from ofSoundStream
and telling the library (or the avconv util, see www.avconv.org) that I was using 2 channels.
Now that I had sound with a normal pitch, I had to find a way to get audio and video in sync. After
spending so many hours getting into encoding and into these libraries, I didn’t have time for another
adventure to get the sync working. It seems that learning these libraries and all
their peculiarities takes more time than writing things yourself. Basically, only three things are done:
encoding, muxing and input/output (streaming over TCP, for example).
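For what it’s worth, the low pitch is exactly what you’d expect when interleaved stereo samples get declared as mono: the encoder then counts twice as many audio frames as really exist, so playback takes twice as long and the pitch drops an octave. A small sanity check on the arithmetic (assumed numbers, not from the original code):

```cpp
// An interleaved stereo buffer holds 2 samples (L, R) per audio frame.
// If that same buffer is declared as mono, every sample counts as its own
// frame, so the encoder believes the buffer is twice as long.
static double playbackSeconds(int totalSamples, int channels, int sampleRate) {
    int frames = totalSamples / channels;  // frames = samples per channel
    return static_cast<double>(frames) / sampleRate;
}
```

One second of stereo at 44100 Hz is 88200 interleaved samples; declared as mono, those same samples play back over two seconds, halving the pitch.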


Update 1, 2012.08.08
So I’m at this point now and I’m thinking about creating a simple library which can create a
live stream to the web from raw data (RGB24, PCM audio). I want to use 100% open technologies:
no Flash, no H264. This means that either Ogg or WebM is the way to go. I’ve got a
pretty clear idea how to make this work. I want to create a simple library which stands on its
own and only uses what is necessary to create a video stream with audio to the web. This means
no huge libraries, no wrappers like ffmpeg, gstreamer, libav, vlc etc… All that’s needed is libvpx
and libvorbis.

Update 2, 2012.09.09
I’m still looking into this and writing a couple of test applications which take care of little parts
of what is needed to create this streaming library. At this point I’ve got a working example which
encodes raw RGB24 frames with libvpx and muxes them into a .webm video stream. This stream is
then sent to an Icecast server which makes the stream available in a web page. All code I wrote
for this is still part of my research to find out how to write a streaming video lib. The next part will be
looking into audio encoding with http://www.theora.org/downloads/, http://www.opus-codec.org/, mp3 and speex.

Update 3, 2012.10.19
Just wanted to mention that I’ve got a proof of concept working which uses only 2 libraries: x264 + libspeex. I’m muxing the encoded data into FLV and sending that to a Flash Media Server. Video + audio is nicely in sync! It needs a bit more work and testing.
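Keeping audio and video in sync in FLV mostly comes down to stamping both streams on the same millisecond clock (FLV tag timestamps are in milliseconds). A sketch of how the per-packet timestamps could be derived; this is illustrative only, and the actual proof-of-concept code may do it differently:

```cpp
#include <cstdint>

// FLV tag timestamps live on one shared millisecond clock for both streams.
static uint32_t videoPtsMs(uint64_t frameIndex, double fps) {
    return static_cast<uint32_t>(frameIndex * 1000.0 / fps);  // ms since start
}

static uint32_t audioPtsMs(uint64_t samplesEncoded, int sampleRate) {
    return static_cast<uint32_t>(samplesEncoded * 1000.0 / sampleRate);
}
```

As long as both counters advance from the same start point, the player can interleave the tags correctly: e.g. the 10th frame at 25 fps lands at 400 ms, and 44100 encoded samples at 44100 Hz land at 1000 ms.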

I know people here are interested in something like this. If you’re interested in this and
want to help, you’re more than welcome :slight_smile: Please reply on this post if you want to make this work
and/or have ideas about how to approach this.

Greetz
roxlu


I’m not exactly sure what you’re trying to accomplish. Are you trying to stream what your application displays, or are you trying to stream some video stream through OF? I would have hacked my way through just using plain ffmpeg for doing that. However gstreamer should be able to do that as well and would be the way to go IMHO, i.e. http://blog.abourget.net/2009/6/14/gstreamer-rtp-and-live-streaming/

Rg,

Arnaud

@sphaero I’ve updated what my goals are. Hope it’s clearer now.

Definitely clearer, and yes, I’m interested in this too. Although I’m quite happy using gstreamer and the like, creating a lean and mean lib for doing this gets a +1 from me.

I would help if I had some time, but I first want to finish my blender importer.

Hey roxlu,

this sounds really interesting and super useful. happy to lend a hand w/ anything I can.

Does the non-open / flash version of your streamer code exist online anywhere? It definitely would make a great addon and I’d love to give it a try.

  • zach

Hey Zach,

You can see the code here:

https://github.com/hellicarandlewis/caravideo

I agree, I think this would be super useful. Imagine all the brilliant broadcasts we could make, and feedback loops between different spaces. The chaps at Influxis have been very generous with server space, so if you’d like some bandwidth to test with I am sure I could arrange it.

Cheers,

Joel

Hey Zach,

I’ve been working on two different ways to stream generated RGB frames to a Flash Media Server (or similar: Wowza, Red5, etc…). Streaming video images (with a gstreamer appsrc, or just libav) works great and doesn’t take much code. But after spending many, many hours getting the audio in sync, I found it’s hard to debug without completely going through all the libav/gstreamer code.

What I want is a completely open solution with only the necessary libraries to do streaming. This means no x264, as it has license issues. I’ve been on holiday for the last few weeks, but before I went to the sunny beaches of Spain I spoke with someone who created an HTML5/WebM audio/video stream (also with a library like gstreamer/libav), which is the way I want to go too.

These are the steps which I’m working on:

  • transform RGB into YUV using libyuv/libswscale (done)
  • encode YUV with libvpx/vp8 (80% done)
  • encode audio with libvorbis (need to implement this)
  • mux video+audio into webm/matroska (70% done)
  • implement I/O code (basic sockets first I think)
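As a side note on the first two steps: an I420 frame handed to libvpx is one full-resolution Y plane plus quarter-resolution U and V planes, i.e. width × height × 3/2 bytes. A small sketch of the buffer math (assuming even dimensions, as required by the 2×2 chroma subsampling):

```cpp
#include <cstddef>

// Byte size of a planar YUV420 (I420) frame; width and height must be even.
static size_t i420FrameSize(size_t width, size_t height) {
    size_t ySize  = width * height;              // full-resolution luma plane
    size_t uvSize = (width / 2) * (height / 2);  // one 2x2-subsampled chroma plane
    return ySize + 2 * uvSize;                   // Y + U + V
}
```

So a 640×480 frame needs 460800 bytes, half of what the raw RGB24 input (921600 bytes) takes.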

Best
roxlu

WebM doesn’t work with iOS as far as I can tell. Is there one standard to rule them all?

J

@JGL for mobile devices we need h264/mp4. iPhone/iPad uses pseudo streaming. We need to create a library which can switch between two things: h264/mp4/mpeg-ts over HTTP for mobile devices, and libvpx/webm over HTTP for PCs.

@roxlu, I’m looking for exactly this. I’d like to use it under Windows; do you have any working version available?
Thanks!!

@chuckleplant I’ve got a streaming library that we used a couple of months ago. It was running on Mac, but I’m sure it will run on Windows too; it only needs some work. The library was somewhat tweaked for our project, with things like multiple streams and per-stream quality. You can see the code here: https://github.com/roxlu/video_streamer


Hi, wow, nice! I am just looking for a way to do this too. :slight_smile:

I would like to get the video from the webcam and (after overlaying some text or data) broadcast the signal to a streaming server, like YouTube Live Stream or something like that. In that case I don’t need audio.

Is there any news about this?

Just this addon: https://github.com/roxlu/video_streamer ? I’ll check it.

EDIT: is this addon for Windows only? Or am I missing something?

Sorry for asking again, but I can’t figure out if there is an addon to live stream from OF to the web, to YouTube or other services. Anyone know?


Is it possible to stream a video from the web using ofxiOSVideoPlayer?