Streaming audio and video between 2 locations

I have had a search and there are a few people talking about streaming with OF, but I wanted to see if anyone has anything more up to date for streaming video.

I have a project that requires long-distance video transmission. It is about 3km across a built-up city area, so so far the only real option seems to be satellite, which is way too expensive. I have to run an SD stream for 8 days with no breaks, so I need something pretty reliable. Resource-wise I have access to good computers at either end and a very fast internet connection (supplied by a university campus at both ends). I need to send fairly high quality video and audio in sync, as it is then rebroadcast via Ustream and over an analogue transmission simultaneously.

I would really like to achieve this from within an OF app, but if anyone has experience of doing this in any way I would love to know how.

Cheers

i've been working on a bunch of addons to do video, audio, depth and osc streaming in sync from OF:

https://github.com/arturoc/ofxGstRTP

it only works on osx and linux for now, and on osx you'll need to install gstreamer, but that's usually pretty straightforward; there's some documentation in the readme file

it also lets you do NAT traversal if you use ofxNice and ofxXMPP, which means you can discover the other side using google talk or any other jabber server and you don't need to open any ports in your routers

Wow, thanks a lot. I have everything going, but I get a few errors depending on which branches I am using. I have switched to your “videoframe working for gstreamer” branch of openFrameworks and tried both the release and develop branches of ofxGstRTP.

I still get 2 errors, if you have any clues on how to solve them. In ofVideoPlayer I get an error ("allocating an object of abstract class type 'ofQTKitPlayer'") on:

setPlayer( ofPtr<OF_VID_PLAYER_TYPE>(new OF_VID_PLAYER_TYPE) );

I don't get this error if I use the master branch.

Also, with either branch of openFrameworks I get an error in snappy-stubs-internal.h on line 362:

assert(value == 1);

use of undeclared identifier assert

I think I might be using the wrong combination of branches, but I'm not sure.

Cheers

you should be using the master branch of everything. the assert error comes from a file you need to delete in the gstreamer install; it's explained in the readme

Crap sorry I did not scroll down that far, my bad.

Works a treat so far, thanks.

Fred

great. btw, i recently added some events to get a notification when a new call is received and some methods to accept or refuse it. the only example that uses them right now is example-all, so the rest won't establish the connection as they are.

mainly you need to listen for the newCallReceived event and call acceptCall() when you get it. i'll update the rest of the examples soon
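roughly, a minimal sketch of that pattern (i'm using callReceived as the event name here, like in example-all, so adjust the name to whatever your branch exposes):

// minimal auto-accept sketch: listen for incoming calls and accept them.
// assumes rtp is an ofxGstXMPPRTP member, as in the examples
void testApp::setup(){
	// ... connectXMPP(), addSendVideoChannel(), addSendAudioChannel() as usual ...
	ofAddListener(rtp.callReceived, this, &testApp::onCallReceived);
}

void testApp::onCallReceived(string & from){
	// a real app would probably ask the user before accepting
	rtp.acceptCall();
}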

I have everything working fine, but I don't get any communication between the apps apart from the jabber/google part. I get a connection but no information is passed through. I don't have Kinects, so I made a video/audio version with the event notification, but maybe I missed something:

  
#include "testApp.h"  
#include "ofxGstRTPUtils.h"  
//#define USE_16BIT_DEPTH  
  
  
  
//--------------------------------------------------------------  
void testApp::setup(){  
	ofXml settings;  
	settings.load("settings.xml");  
	string server = settings.getValue("server");  
	string user = settings.getValue("user");  
	string pwd = settings.getValue("pwd");  
       
	grabber.initGrabber(640,480);  
	remoteVideo.allocate(640,480,GL_RGB);  
       
	rtp.setup(0);  
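	// advertise our capability, log in to the xmpp/jabber server and declare the video and audio channels we want to send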
	rtp.getXMPP().setCapabilities("telekinect");  
	rtp.connectXMPP(server,user,pwd);  
	rtp.addSendVideoChannel(640,480,30,300);  
	rtp.addSendAudioChannel();  
	calling = -1;  
       
	ofBackground(255);  
  
	ofAddListener(rtp.callReceived,this,&testApp::onCallReceived);  
	ofAddListener(rtp.callFinished,this,&testApp::onCallFinished);  
	ofAddListener(rtp.callAccepted,this,&testApp::onCallAccepted);  
  
	callingState = Disconnected;  
  
	ring.loadSound("ring.wav",false);  
	lastRing = 0;  
}  
  
void testApp::onCallReceived(string & from){  
	callFrom = ofSplitString(from,"/")[0];  
	callingState = ReceivingCall;  
}  
  
void testApp::onCallAccepted(string & from){  
	if(callingState == Calling){  
		callingState = InCall;  
	}  
}  
  
void testApp::onCallFinished(ofxXMPPTerminateReason & reason){  
	if(callingState==Calling){  
		ofSystemAlertDialog("Call declined");  
	}  
	callingState = Disconnected;  
  
	/*rtp.setup(200);  
	rtp.addSendVideoChannel(640,480,30,300);  
	rtp.addSendDepthChannel(640,480,30,300);  
	rtp.addSendOscChannel();  
	rtp.addSendAudioChannel();*/  
}  
  
void testApp::exit(){  
}  
  
  
//--------------------------------------------------------------  
void testApp::update(){  
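	// push new local camera frames to the RTP sender and read back decoded remote frames into the texture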
	grabber.update();  
	if(grabber.isFrameNew()){  
		rtp.getServer().newFrame(grabber.getPixelsRef());  
	}  
       
	rtp.getClient().update();  
	if(rtp.getClient().isFrameNewVideo()){  
		remoteVideo.loadData(rtp.getClient().getPixelsVideo());  
	}  
       
	if(callingState==ReceivingCall || callingState==Calling){
		unsigned long long now = ofGetElapsedTimeMillis();  
		if(now - lastRing>2000){  
			lastRing = now;  
			ring.play();  
		}  
	}  
  
}  
  
//--------------------------------------------------------------  
void testApp::draw(){  
	const vector<ofxXMPPUser> & friends = rtp.getXMPP().getFriends();
       
	for(size_t i=0;i<friends.size();i++){  
		ofSetColor(0);  
		if(calling==i){  
			if(rtp.getXMPP().getJingleState()==ofxXMPP::SessionAccepted){  
				ofSetColor(127);  
			}else{  
				ofSetColor(ofMap(sin(ofGetElapsedTimef()*2),-1,1,50,127));  
			}  
			ofRect(ofGetWidth()-300,calling*20+5,300,20);  
			ofSetColor(255);  
		}  
		ofDrawBitmapString(friends[i].userName,ofGetWidth()-250,20+20*i);  
		if(friends[i].show==ofxXMPPShowAvailable){  
			ofSetColor(ofColor::green);  
		}else{  
			ofSetColor(ofColor::orange);  
		}  
		ofCircle(ofGetWidth()-270,20+20*i-5,3);  
		//cout << friends[i].userName << endl;  
		for(size_t j=0;j<friends[i].capabilities.size();j++){  
			if(friends[i].capabilities[j]=="telekinect"){  
				ofNoFill();  
				ofCircle(ofGetWidth()-270,20+20*i-5,5);  
				ofFill();  
				break;  
			}  
		}  
	}  
       
	ofSetColor(255);  
	remoteVideo.draw(0,0);  
	grabber.draw(400,300,240,180);  
       
	if(callingState==ReceivingCall){
		ofSetColor(30,30,30,170);  
		ofRect(0,0,ofGetWidth(),ofGetHeight());  
		ofSetColor(255,255,255);  
		ofRectangle popup(ofGetWidth()*.5-200,ofGetHeight()*.5-100,400,200);  
		ofRect(popup);  
		ofSetColor(0);  
		ofDrawBitmapString("Receiving call from " + callFrom,popup.x+30,popup.y+30);  
            
		ok.set(popup.x+popup.getWidth()*.25-50,popup.getCenter().y+20,100,30);  
		cancel.set(popup.x+popup.getWidth()*.75-50,popup.getCenter().y+20,100,30);  
		ofRect(ok);  
		ofRect(cancel);  
            
		ofSetColor(255);  
		ofDrawBitmapString("Ok",ok.x+30,ok.y+20);  
		ofDrawBitmapString("Decline",cancel.x+30,cancel.y+20);  
	}
  
}  
  
//--------------------------------------------------------------  
void testApp::keyPressed(int key){  
  
}  
  
//--------------------------------------------------------------  
void testApp::keyReleased(int key){  
  
}  
  
//--------------------------------------------------------------  
void testApp::mouseMoved(int x, int y ){  
  
}  
  
//--------------------------------------------------------------  
void testApp::mouseDragged(int x, int y, int button){  
  
}  
  
//--------------------------------------------------------------  
void testApp::mousePressed(int x, int y, int button){  
	ofVec2f mouse(x,y);
	if(callingState==Disconnected){  
		ofRectangle friendsRect(ofGetWidth()-300,0,300,rtp.getXMPP().getFriends().size()*20);  
		if(friendsRect.inside(mouse)){  
			calling = mouse.y/20;  
			rtp.call(rtp.getXMPP().getFriends()[calling]);  
			callingState = Calling;  
		}  
	}else if(callingState == ReceivingCall){  
		if(ok.inside(mouse)){  
			rtp.acceptCall();  
			callingState = InCall;  
		}else if(cancel.inside(mouse)){  
			rtp.refuseCall();  
			callingState = Disconnected;  
		}  
	}  
  
}  
  
//--------------------------------------------------------------  
void testApp::mouseReleased(int x, int y, int button){  
  
}  
  
//--------------------------------------------------------------  
void testApp::windowResized(int w, int h){  
  
}  
  
//--------------------------------------------------------------  
void testApp::gotMessage(ofMessage msg){  
  
}  
  
//--------------------------------------------------------------  
void testApp::dragEvent(ofDragInfo dragInfo){   
  
}  
  

the logic looks good, do you get the call on the other side?

also, the h264 encoder doesn't emit keyframes, so sometimes you have to wave your hand really close to the camera so it creates a keyframe and the other side can start to decode. try checking with audio only, just in case
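for an audio-only test, something like this in setup() should be enough (just a sketch: the same API as in your example, minus the video channel):

rtp.setup(0);
rtp.getXMPP().setCapabilities("telekinect");
rtp.connectXMPP(server, user, pwd);
// no addSendVideoChannel(): nothing here depends on h264 keyframes,
// so if audio comes through, the problem is most likely the missing keyframe
rtp.addSendAudioChannel();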

Ok, I had some success but have a few questions about your experience.

I have the OSC-only version working fine, and only once did I get video working (audio didn't, although I can open a pipeline). I don't understand how to select an audio device and add the samples to the stream locally.

I get this far on the console:

  
fredaudio@gmail.com/A39D60D6: has jingle, state: WaitingSessionAccept jingle_action: session-accept  
xmpp received content with mediavideo  
xmpp received content with mediaaudio  
session accepted setting remote candidates  
[notice ] video: creating candidate for component 1  
[notice ] video: creating candidate for component 1  
[notice ] video: creating candidate for component 2  
[notice ] video: creating candidate for component 2  
[notice ] video: creating candidate for component 3  
[notice ] video: creating candidate for component 3  
[notice ] video: adding remote candidates for component 1 0x252f2c8  
[notice ] video: state changed for component 1 connecting  
[notice ] video: adding remote candidates for component 2 0x252f2d8  
[notice ] video: state changed for component 2 connecting  
[notice ] video: adding remote candidates for component 3 0x3ebdc80  
[notice ] video: state changed for component 3 connecting  
[notice ] audio: creating candidate for component 1  
[notice ] audio: creating candidate for component 1  
[notice ] audio: creating candidate for component 2  
[notice ] audio: creating candidate for component 2  
[notice ] audio: creating candidate for component 3  
[notice ] audio: creating candidate for component 3  
[notice ] audio: adding remote candidates for component 1 0x3ebe860  
[notice ] audio: state changed for component 1 connecting  
[notice ] audio: adding remote candidates for component 2 0x3ebf708  
[notice ] audio: state changed for component 2 connecting  
[notice ] audio: adding remote candidates for component 3 0x3ebf718  
[notice ] audio: state changed for component 3 connecting  
conn DEBUG SENT: <iq sid="" type="result" to="fredaudio@gmail.com/3B67698A"/>  
fredaudio@gmail.com/A39D60D6 to state SessionAccepted  

But no amount of waving seems to reproduce the connection that I got once.

I have been testing using the same credentials on both machines and also different credentials on each, but I don't seem to get different results.

Let me know if there are any other tricks to get it going.

Cheers

for audio you don’t need to do anything, gstreamer will open the default microphone and grab audio.

i'm developing this addon for a peer-to-peer application, so right now you need to have the same channels on both sides. are you running the same app on both computers?

Yes, I compiled the app and copied it to the other machine (GStreamer is installed on both). I tried using different users and passwords (both have each other in their contact lists) and also the same credentials on both. They connect fine (I can even run 2 copies of the OSC-only version on one machine and it works). I cannot get audio to work, but I had video working briefly. Without changing anything I cannot get the video to work again; it took a while, maybe 30 seconds, for the video stream to show, and I could only get it in one direction.

I will try to get rid of the bidirectional part tomorrow and make one server and one client to see if I have better luck. Currently I have pretty fast internet, but not the best.

I would also like to specify my audio input, so I will dig through the code and see if I can change it to let me do this.

Thanks for the help and the code, I'll keep messing around and see where I get.

Cheers

Just wanted to let you know I was trying to feed a video stream of different dimensions into the video pipeline, so I have added dimension parameters to the call method:

ofxGstXMPPRTP::call(const ofxXMPPUser & user, int videoWidth, int videoHeight)
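On my side the channel setup and the call now look roughly like this (my own modification, not part of the addon; I am assuming the addSendVideoChannel() parameters are width, height, fps and bitrate, as in the examples, and 1280x720 is just an illustrative size):

// hypothetical usage of the modified call(): the send channel dimensions
// have to match the ones passed to call()
rtp.addSendVideoChannel(1280, 720, 30, 600);   // width, height, fps, bitrate (assumed order)

// later, when dialling a contact:
rtp.call(rtp.getXMPP().getFriends()[calling], 1280, 720);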

I get pretty good video and can even go to a very high bitrate and get low latency; this is pretty amazing!

I am having trouble with audio: it seems to mess up my video stream completely and sometimes causes it to fail, and this changes when I change my default audio input device.

Thanks again for the help; no need to reply, I just wanted to add this information.

Cheers again

great that it's working. and yes, i never added the possibility to change the default size since we've been working with kinects, where the only possible resolution is 640x480. did you get it to work with different resolutions? the default is hardcoded in some places, so you would need to change that too.

right now i'm adding some code for echo cancellation and noise suppression, so the code is a little bit messed up, but i'll fix the resolution as soon as it's more stable

about the audio messing up the video stream, it's probably related to the timestamping of the video frames not being the same as the audio ones. i'll let you know if i find something

Do you get a video stream in both directions? I am using the same code on both computers, and it does not seem to matter which one initiates the call: the same machine always provides a video stream and the second only receives (this was also the case before I messed with the code). It does not matter whether I use the same or different credentials at both ends.

yes, i get a full duplex stream without problem, and you can use the same credentials on both computers. actually, because of how the network sockets work, if one of them is receiving it should be sending too

just to let you know that i've updated the addons. there are several bugfixes, including sending a keyframe at the beginning, and you can now set any resolution for the video

Thanks so much, this is quite amazing for me, it adds so many possibilities.

I have one question. I am playing with a LAN version with 4 bidirectional clients to and from a single server. It is super stable and solid (although I used Apple's AUNetSend and AUNetReceive Audio Units, as there is too much latency in the audio; did you also get badly out-of-sync audio over LAN?).

When I start the app, even if I have saved the GUI settings, the default bitrate is the lowest; my GUI slider is sitting where I saved it, but I have to move it to get the settings to apply.

Thanks again

you mean that the video and audio are not in sync? i've done several tests and the current version should be in sync. if you mean just that there's latency, you can change that on the client side.

about the sliders, you need to load the settings after the pipeline has started, or else it won't have been created yet and you'll be applying the properties to a non-existent object. i'll look into fixing this so that the values you have in the parameters are applied even if they change before the pipeline is created
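until then, a workaround is to (re)load the gui values once the call is up, for example (just a sketch, assuming an ofxGui panel named gui and that the pipeline exists once the call has been accepted):

void testApp::onCallAccepted(string & from){
	if(callingState == Calling){
		callingState = InCall;
	}
	// by now the pipeline exists, so properties like the bitrate get applied
	// to a real element instead of being silently dropped
	gui.loadFromFile("gui.xml");   // or whatever file your panel saves to
}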

Yes, over LAN my audio is quite late, but over the internet it seems better? I needed to matrix audio between 4 clients and a server, so I used Audio Units for this anyway.

Is there a way to know if a stream has dropped and restart it? At the moment, if I drop a connection it will only restart in one direction, and I have to restart both apps to get it going again.

Fred

well, there's really no connection since it's udp; the server side is only sending packets to one address and the client is just listening on a port, so even if some packets are lost it should auto-recover at some point.

but yes, right now you can't create a new connection from within the application. i need to change the classes a bit so they allow closing a call and creating a new one.
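in the meantime you could at least detect a stalled stream yourself by timing the last received frame, something like this in update() (just a sketch built on the client API from your example; lastRemoteFrame would be a new unsigned long long member, initialised in setup(), and the 5 second threshold is arbitrary):

rtp.getClient().update();
if(rtp.getClient().isFrameNewVideo()){
	remoteVideo.loadData(rtp.getClient().getPixelsVideo());
	// remember when the last remote frame arrived
	lastRemoteFrame = ofGetElapsedTimeMillis();
}
// if nothing has arrived for a while, treat the stream as stalled
bool remoteStalled = ofGetElapsedTimeMillis() - lastRemoteFrame > 5000;
// for now remoteStalled can only be used to warn the user, since restarting
// the call from inside the app isn't possible yet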