How to stream video from one device to another?

Hi, I would like to stream video from my iPad to my iPhone, but I have no idea what's required to do such a thing.

Any pointers would be much appreciated, thanks

I think somewhere on the forum there is an addon that claims to be able to do this (stream video)…

I once naively tried to “stream” video via ofxOsc (by sending each frame as an unsigned char*), but obviously ofxOsc didn’t allow it because of size limitations… it didn’t send the message.

You could try sending the images as a raw bitstream. An example of a TCP/IP client/server connection is in this addon: https://github.com/patriciogonzalezvivo/ofxKinectStreamer

Using the same method, you could also send a compressed stream, e.g. JPEG (using the turbo-jpeg addon), which would (I suppose) reduce the latency and bandwidth requirements.
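
For the raw / JPEG route, something along these lines could be a starting point - a rough sketch using ofxNetwork plus ofSaveImage instead of the turbo-jpeg addon, with the receiver IP, port, and 4-byte length prefix being my own placeholder choices (method names assume a recent openFrameworks; older versions use getPixelsRef()/getBinaryBuffer()):

// Sender sketch: grab a frame, JPEG-compress it into an ofBuffer,
// and send it over TCP with a 4-byte length prefix so the receiver
// knows how many bytes belong to each frame.
#include "ofMain.h"
#include "ofxNetwork.h"

class ofApp : public ofBaseApp {
public:
    ofVideoGrabber grabber;
    ofxTCPClient tcp;

    void setup(){
        grabber.setup(320, 240);
        tcp.setup("192.168.1.10", 11999);   // receiver ip/port - placeholders
    }

    void update(){
        grabber.update();
        if(grabber.isFrameNew() && tcp.isConnected()){
            ofBuffer jpeg;
            ofSaveImage(grabber.getPixels(), jpeg, OF_IMAGE_FORMAT_JPEG, OF_IMAGE_QUALITY_MEDIUM);
            int size = jpeg.size();
            tcp.sendRawBytes((char*)&size, sizeof(int));   // length prefix
            tcp.sendRawBytes(jpeg.getData(), size);        // jpeg payload
        }
    }
};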

cheers,
kj

Hi

You can try this addon, “ofxRemoteCamera”; it uses the turbo-jpeg library to compress the frames. Patricio’s addon is based on it, but removes the compression to preserve the Kinect depth map.

When we were looking at “ofxRemoteCamera”, I seem to remember this user had made a fork to use it precisely on iOS: https://github.com/armadillu/ofxRemoteCamera

Hope that helps

Thanks @pandereto, I tried the addon on iOS but I got a few errors.

Any ideas how to solve them?

Hi nardove.
Try including “libturbojpeg.a” in your project.

Just did, and got the same errors. I dragged the file into the target’s Linked Binary Libraries. Any ideas?

Hey Nardove,

I’ll throw in my 2 cents, as I’ve spent a lot of time digging through black holes of despair trying to figure out P2P live camera streaming for iOS, albeit without much success…

For the ofxRemoteCamera example, I had to recompile the libturbojpeg libraries for iOS. Here’s a basic example with the compiled libs: http://trentbrooks.com/files/iosRemoteCameraClient.zip. This example shows how to receive video, so you may need to adjust it for sending. It works when requesting a small 120x90 greyscale image with NO compression, but once you set the compression level (touch the screen to decrease it), all the pixels get messed up - something to do with libturbojpeg.
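
In case it helps with the messed-up pixels: the plain libjpeg-turbo compress call is quite picky about the pitch and pixel-format arguments, and a mismatch there scrambles the output in exactly that way. Roughly (generic libjpeg-turbo usage, not the addon’s actual code):

// Rough sketch of a plain libjpeg-turbo compression call (not ofxRemoteCamera's code).
// A pixel-format / pitch mismatch here is a common cause of scrambled output.
#include <turbojpeg.h>

unsigned char* compressFrame(unsigned char* rgb, int width, int height,
                             int quality, unsigned long* jpegSize){
    tjhandle handle = tjInitCompress();
    unsigned char* jpegBuf = NULL;          // tjCompress2 allocates this for us
    tjCompress2(handle, rgb, width,
                width * 3,                  // pitch: bytes per row of the source image
                height, TJPF_RGB,           // must match how the pixels are actually laid out
                &jpegBuf, jpegSize,
                TJSAMP_420, quality, TJFLAG_FASTDCT);
    tjDestroy(handle);
    return jpegBuf;                         // free later with tjFree()
}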

There are also a few examples on the forums here that work on the principle of sending rows of pixels in a threaded, blocking way. I assembled bits and pieces together here: http://trentbrooks.com/files/TCPPixels.zip, but I still could not get perfectly synced video; frames sometimes come through jumbled, and you need to throttle the speed depending on the connection - see TCPPixels.zip if you want to have a look.
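
The basic pattern those examples use looks roughly like this - a sketch with ofThread and ofxNetwork, not the exact TCPPixels code; the row-sized chunks and the sleep are the knobs you end up throttling:

// Rough sketch of the "send rows of pixels from a thread" pattern
// (not the exact TCPPixels code).
#include "ofMain.h"
#include "ofxNetwork.h"

class PixelSender : public ofThread {
public:
    ofxTCPClient tcp;
    ofPixels pixels;    // latest frame, set from the main thread

    void threadedFunction(){
        while(isThreadRunning() && tcp.isConnected()){
            lock();
            ofPixels frame = pixels;   // copy so the main thread can keep updating
            unlock();
            int rowBytes = frame.getWidth() * frame.getNumChannels();
            char* data = (char*)frame.getData();   // getPixels() on older OF versions
            for(int y = 0; y < frame.getHeight(); y++){
                tcp.sendRawBytes(data + y * rowBytes, rowBytes);   // one row at a time, blocking
            }
            sleep(10);   // crude throttle - too fast and frames arrive jumbled
        }
    }
};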

I think the main issue with sending video pixels is getting a decent framerate + size, which requires a compression/decompression library that works on iOS. If you can figure this part out I’d love to know.

I ended up going with a 3rd party solution in the end: http://www.tokbox.com/opentok/api, which is free for single P2P with the WebRTC branch.

Good luck,
Trent

I was wondering if any of you have made some progress on this?

Hi guys

I’ve been working on a couple of different streaming solutions lately. I did some tests with encoding h264, vpx, theora, vorbis, mp3 and muxing them into flv, webm or ogg. I want to create a couple of easy wrappers to stream audio/video to the web or to other applications. I’ve been streaming FLV to the web through an (open source) Flash Media Server.

Streaming video from one app to another is quite doable with either libx264 (patented) or libvpx (open source, not patented, from Google). I created a test client<>server which streams just video from one app to another.
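
For anyone wondering what the VP8 side involves, the encode loop is roughly this (generic libvpx usage, not the code from the examples below; width, height, fps and bitrate are placeholders):

// Rough sketch of a realtime VP8 encode loop with libvpx
// (generic libvpx usage, not the vpx-client/vpx-server code).
#include <vpx/vpx_encoder.h>
#include <vpx/vp8cx.h>

vpx_codec_ctx_t codec;
vpx_codec_enc_cfg_t cfg;
vpx_image_t raw;

void setupEncoder(int w, int h, int fps, int kbps){
    vpx_codec_enc_config_default(vpx_codec_vp8_cx(), &cfg, 0);
    cfg.g_w = w;
    cfg.g_h = h;
    cfg.g_timebase.num = 1;
    cfg.g_timebase.den = fps;
    cfg.rc_target_bitrate = kbps;              // kilobits per second
    vpx_codec_enc_init(&codec, vpx_codec_vp8_cx(), &cfg, 0);
    vpx_img_alloc(&raw, VPX_IMG_FMT_I420, w, h, 1);
}

void encodeFrame(int frameIndex){
    // raw.planes[VPX_PLANE_Y/U/V] must already hold the frame converted to I420
    vpx_codec_encode(&codec, &raw, frameIndex, 1, 0, VPX_DL_REALTIME);

    vpx_codec_iter_t iter = NULL;
    const vpx_codec_cx_pkt_t* pkt;
    while((pkt = vpx_codec_get_cx_data(&codec, &iter)) != NULL){
        if(pkt->kind == VPX_CODEC_CX_FRAME_PKT){
            // pkt->data.frame.buf / pkt->data.frame.sz is the compressed frame -
            // this is what goes out over the socket
        }
    }
}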

The code is in my (ever changing) repository on github: https://github.com/roxlu/roxlu/tree/master/apps/examples/vpx-client
https://github.com/roxlu/roxlu/tree/master/apps/examples/vpx-server

You can build/run both examples on Mac OSX 10.8 (I need to update the libraries for Linux/Windows) by executing these two shell scripts (for both apps):

$ apps/examples/vpx_client/build/cmake/build_release.sh
$ apps/examples/vpx_client/build/cmake/run_release.sh

$ apps/examples/vpx_server/build/cmake/build_release.sh
$ apps/examples/vpx_server/build/cmake/run_release.sh

The FLV addon (which I’m using in production, so it’s stable) can be found here:
https://github.com/roxlu/roxlu/tree/master/addons/FLV

Best
roxlu

Hey Roxlu,

I spent a couple of hours trying to compile your examples, but cmake errors just kept piling up so I gave up in the end. Debugging shell scripts is pretty rough. I am wondering though…

1/ What framerate and image dimensions are achievable with your examples?
2/ Would your examples work on iOS, and if so, how much of your personal repo would we have to include when creating a new OF project?

I have also tested a couple of other implementations of video/image streaming from the ofxaddons site: https://github.com/bakercp/ofxIpVideoServer & https://github.com/toyoshim/ofxRemoteKinect (just the RGB part). They both compress images using ofSaveImage(ofPixels &pix, ofBuffer &buffer) before sending. This method works pretty well for video streaming between 2 iOS devices on wifi - I get a decent 30fps, but only up to 320x240. I’m definitely interested in what kind of results you’re getting.
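
In case it’s useful to anyone, the receiving end of that ofSaveImage/ofBuffer approach looks roughly like this - a sketch assuming the sender writes a 4-byte length prefix before each JPEG frame (my own convention, not necessarily what those addons do):

// Rough sketch of the receive side (not the exact code from ofxIpVideoServer /
// ofxRemoteKinect). Assumes a 4-byte length prefix before each JPEG frame.
#include "ofMain.h"
#include "ofxNetwork.h"

ofxTCPClient tcp;
ofImage frame;

void receiveFrame(){
    int size = 0;
    if(tcp.receiveRawBytes((char*)&size, sizeof(int)) == sizeof(int) && size > 0){
        vector<char> data(size);
        int received = 0;
        while(received < size){   // keep reading until the whole frame is in
            int n = tcp.receiveRawBytes(&data[received], size - received);
            if(n <= 0) break;
            received += n;
        }
        if(received == size){
            ofBuffer jpeg(&data[0], size);
            ofPixels pix;
            if(ofLoadImage(pix, jpeg)){      // decompress the JPEG back into pixels
                frame.setFromPixels(pix);    // ready to draw
            }
        }
    }
}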

Trent

Hi trentbrooks, let’s do a Skype chat. I tested it on 3 different Macs last week and it runs fine.

Do you think you could post a version that has already been through CMake, for those of us who don’t know much about CMake?

thx.

Hi,

Do you think that porting to Windows systems will be easy?

thanks

There are many methods to transfer video, music and photos from an iPhone to an iPad. You can use wifi to transfer them, but that method is too slow; when you have a lot of files it will waste much of your time. As for me, I use an iPhone-to-computer transfer app to copy files from the iPhone to a computer first, and then sync to the iPad with iTunes. Here is the resource: How-to-transfer-music-from-iPhone-4s/5/to-Mac, if you’re a Mac user.

So I started with trentbrooks’s work and tried to finish the compression. However, for the life of me, I was unable to make libturbojpeg work for iOS. I found that there was another compression library already included, called zlib. I switched to that and have managed to get it to the point where the apps no longer crash when I try to add compression (which is the only outcome I could achieve using libturbojpeg).
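
In case anyone else goes down the zlib route, this is roughly how its one-shot API compresses a pixel buffer (generic zlib usage, not my actual project code). Keep in mind zlib is lossless and not image-aware, so the ratio on camera frames is usually far worse than JPEG:

// Rough sketch of compressing a raw pixel buffer with zlib's one-shot API.
#include <zlib.h>
#include <vector>

std::vector<unsigned char> compressPixels(const unsigned char* pixels, unsigned long numBytes){
    uLongf compressedSize = compressBound(numBytes);          // worst-case output size
    std::vector<unsigned char> compressed(compressedSize);
    if(compress2(compressed.data(), &compressedSize, pixels, numBytes, Z_BEST_SPEED) != Z_OK){
        return std::vector<unsigned char>();                  // compression failed
    }
    compressed.resize(compressedSize);                        // shrink to the real size
    return compressed;
}

// On the receiving end, uncompress() needs to know the original size,
// so send it along with the compressed data:
//   uLongf originalSize = width * height * channels;
//   uncompress(dest, &originalSize, compressed.data(), compressedSize);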

Has anyone else tried solving the problems or got it working? Or does anyone have any experience with zlib?

@trentbrooks
I have been looking at TokBox. It looks very interesting.
How did you integrate it with OF?
What’s your experience with it?

thanks, stephan.

@stephanschulz hey, I used one of the early WebRTC SDKs for TokBox and it was pretty easy to integrate into OF. But looking at their website now, they seem to be transitioning to a paid-only service, with a $50 base monthly fee. I can’t see the free single P2P module either, so I don’t think I would recommend this service anymore.