networkUDP for video streaming?

Please take a look at my problem; I haven’t slept for two nights…

I am trying to send video data from a webcam to a server using the networkUDP example in ofxNetwork. I thought I was doing the right thing, but it doesn’t work. Could any of the network gurus kindly check my code and give me some suggestions? Thank you very much in advance!

  1. In the UDP sender, I grab video from the webcam, store it in unsigned char *videoRGB, append every byte that videoRGB points to into a string for each frame, and finally send the string over the socket. The main code is below:
  
  
//--------------------------------------------------------------  
void testApp::setup(){	   
	// we don't want to be running too fast  
	//ofSetVerticalSync(true);  
	//ofSetFrameRate(60);  
  
	camWidth 		= 320;	// try to grab at this size.   
	camHeight 		= 240;  
	  
	vidGrabber.setVerbose(true);  
	vidGrabber.initGrabber(camWidth,camHeight);	  
	videoRGB 	= new unsigned char[camWidth*camHeight*3];  
	videoTexture.allocate(camWidth,camHeight, GL_RGB);  
	  
	/////////////////network stuff//////////////////////  
        //create the socket and set to send to 127.0.0.1:11999  
	udpConnection.Create();  
	udpConnection.Connect("127.0.0.1",11999);  
	udpConnection.SetNonBlocking(true);  
}  
  
//--------------------------------------------------------------  
void testApp::update(){  
	  
	ofBackground(100,100,100);  
  
	vidGrabber.grabFrame();	  
	if (vidGrabber.isFrameNew()){  
		videoRGB = vidGrabber.getPixels();  
		videoTexture.loadData(videoRGB, camWidth,camHeight, GL_RGB);  
	}  
  
	/////////////////network stuff//////////////////////  
	message = "";  
	  
	if(videoRGB != NULL){  
		for(int i=0; i<camWidth*camHeight*3; i++){  
			message+=*(videoRGB+i);  
		}  
	}  
}  
  
//--------------------------------------------------------------  
void testApp::draw(){  
  
	ofSetColor(0xffffff);  
	videoTexture.draw(20,20,camWidth,camHeight);  
	videoTexture_testing.draw(camWidth+40,20,camWidth,camHeight);  
	  
        /////////////////network stuff//////////////////////  
	int sent = udpConnection.Send(message.c_str(),message.length());  
}  
  

  2. In the UDP receiver, I receive the string sent by the sender application, assign each character back into the array (unsigned char *videoRGB), load videoRGB into an ofTexture, and try to render it on screen, but it doesn’t work.

I am sure the translation between array pointers and strings is correct, as I’ve tested it in a single application and it works perfectly. So there might be something wrong with the way I am using the UDP functions.

  
  
void testApp::setup(){  
	//we run at 60 fps!  
	ofSetVerticalSync(true);  
	ofSetFrameRate(60);  
  
	//load our type  
	mono.loadFont("type/mono.ttf", 9);  
	monosm.loadFont("type/mono.ttf", 8);  
  
        //create the socket and bind to port 11999  
	udpConnection.Create();  
	udpConnection.Bind(11999);  
	udpConnection.SetNonBlocking(true);  
  
	ofBackground(255, 255, 255);  
	ofSetBackgroundAuto(false);  
  
	//video  
	camWidth 		= 320;	// try to grab at this size.   
	camHeight 		= 240;  
	  
	videoRGB 	= new unsigned char[camWidth*camHeight*3];  
	videoTexture.allocate(camWidth,camHeight, GL_RGB);  
}  
  
//--------------------------------------------------------------  
void testApp::update(){  
	ofBackground(100,100,100);  
  
	char udpMessage[100000];  
	udpConnection.Receive(udpMessage,100000);  
	string message=udpMessage;  
	  
	if(message!=""){  
		  
		for(int i=0; i<camWidth*camHeight*3; i++){  
			*(videoRGB+i)= NULL;  
		}  
		for(int i=0; i<camWidth*camHeight*3; i++){  
			*(videoRGB+i)= message[i];  
		}  
	}  
	  
  
	videoTexture.loadData(videoRGB, camWidth,camHeight, GL_RGB);  
  
}  
//--------------------------------------------------------------  
void testApp::draw(){  
	ofSetColor(0xffffff);  
	videoTexture.draw(20,20,camWidth,camHeight);  
  
}  
  

Hi,

one mistake is:

  
  
char udpMessage[100000];  
udpConnection.Receive(udpMessage,100000);  
  

the buffer is too small to hold one frame of 320 x 240 x 3 bytes (one frame consists of 230400 bytes).

And a better description of “it doesn’t work” might help in narrowing down your problems.

cheers,
Stephan

Hey Stephan,
Thank you very much for taking a look at my code. Indeed, I tried both udpMessage[230400] and udpMessage[1000000] before; neither worked.

Let me explain what “didn’t work” means: I tested the data transfer in two ways:

  1. When I tested like this, the server received the message correctly and printed out “test”:
    Client side:
  
  
	//network stuff  
	message = "test";  
  

Server side

  
  
    //network stuff  
    if(message!=""){  
        cout << message << endl;  
    }  
  

  2. When I tested the code like this (I added the part that appends the pixel bytes to the message string), the connection sometimes failed; other times it stayed up, but the received string length was only 48, which is totally wrong.

I think the connection is still there, so could this be caused by the way I am sending 1 million characters/bytes at once? Is there a good way to package webcam video data for streaming?

Client side:

  
  
	//network stuff  
	message = "test";  
	if(videoRGB != NULL){  
		for(int i=0; i<camWidth*camHeight*3; i++){  
			message+=*(videoRGB+i);  
		}  
		//cout << (int)*videoRGB << endl;  
	}  
  

Server side

  
  
    //network stuff  
    if(message!=""){  
        cout << "connected" << endl;  
        cout << message.length() << endl;  
    }  
  

Hi,

I think you’ll have to split your data into smaller packets, as there is a size limit for UDP packets (see http://en.wikipedia.org/wiki/User-Datagram-Protocol for more info).

HTH,
Stephan