Video streaming between two installations

Hi,
I was wondering if any of you have had success streaming video between Mac and Windows (XP or Vista) machines?

Here in the forum it looks like video streaming has only been successful with GStreamer, which apparently will not work on Windows, so unfortunately that is not an option.

I am working on a project that involves two interactive installations sending live video feeds and events to each other, possibly over the internet, but I would be happy just to make it work over LAN. For starters I’m trying to get the video streaming to work (much like a video conferencing app).

I have tried sending video from a webcam between two openFrameworks apps by modifying the TCP and UDP examples from the ofxNetwork addon. TCP runs acceptably at very low resolution (100 x 75); I would have to implement throttling for bigger images and test it. UDP is better suited for my installation, though, since speed can become an issue.

Oddly, I can’t get UDP to work at the same resolution (100 x 75): for some reason I can’t receive UDP messages longer than somewhere around 7500 bytes, and the length needed is 100 x 75 x 3 = 22500 bytes (much more when stepping up to bigger resolutions). Any ideas why UDP won’t work? Here is my UDP client and server code:

Client:

  
  
void testApp::update(){  
	camGrabber.grabFrame();  
	if (camGrabber.isFrameNew()){  
		const char* pixels = (const char*)camGrabber.getPixels();  
		//a full frame would be camGrabber.getWidth() * camGrabber.getHeight() * 3 = 22500 bytes,  
		//but sends longer than ~7500 bytes never arrive, so the size is capped here  
		connection.Send(pixels, 7500);  
		img.setFromPixels((unsigned char*)pixels, camGrabber.getWidth(), camGrabber.getHeight());  
	}  
}  
  

Server:

  
  
void testApp::update(){  
	char udpMessage[CAM_WIDTH * CAM_HEIGHT * 3];  
	//attempt to receive one full frame's worth of bytes into the buffer  
	connection.Receive(udpMessage, CAM_WIDTH * CAM_HEIGHT * 3);  
	receivedImg.setFromPixels((unsigned char*)udpMessage, CAM_WIDTH, CAM_HEIGHT);  
}  
  

The server does not receive anything if the client sends a message longer than roughly 7500 bytes.

Also, I would like to know how you would solve this problem (video streaming/conferencing) in openFrameworks. I’m open to all suggestions and don’t mind exploring additional addons or libraries.


hi,
can you please attach an example of a working program? I am stuck on the same problem.

thank you

I can post the code I have, but as I mentioned, nothing is really working. Do you want to look at the code that only sends very low-resolution video over TCP? The UDP version is not working: it is very laggy and can only send a small part of the video. Or would you like to see both?

hi,

yeah, UDP packets have quite a small maximum size. it’s really up to the individual network components/computer system what gets allowed through, but generally you want to keep each packet under 4k-6k.

for video streaming: your principle is sound, you just need a way of breaking up large data packets into smaller sections, numbering them properly, and reassembling them on the receiving end yourself back into the large data packets to be drawn; you also need to deal with dropped packets (unlike TCP, UDP doesn’t guarantee transmission, so you have to be able to handle incomplete data). the trickiest part could be plugging in the video compression code so you can get your frames down small enough. if bandwidth isn’t a problem, try doing in-memory jpeg compression (eg http://forum.openframeworks.cc/t/jpeg-from-memory-to-ofimage–oftexture/3462/0; or make a ram drive and ‘save’/‘load’ to/from that) then stream the jpegs one by one for a kind of DIY motion-JPEG codec.
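to make the chunking concrete, here’s a rough, untested sketch along those lines, using the ofxUDPManager from ofxNetwork that’s already used above (the 4k payload size and the frameId/chunkIndex header layout are just illustrative assumptions):

#include "ofxNetwork.h"  
#include <cstring>   //memcpy  
#include <algorithm> //std::min  
  
const int CHUNK_DATA_SIZE = 4096; //payload bytes per packet, kept well under the practical limit  
const int HEADER_SIZE = 8;        //4 bytes frameId + 4 bytes chunkIndex  
  
//split one frame into numbered UDP chunks; "connection" is assumed to be  
//an ofxUDPManager already set up as a sender  
void sendFrameChunked(ofxUDPManager& connection, const char* pixels, int frameSize, int frameId) {  
	char packet[HEADER_SIZE + CHUNK_DATA_SIZE];  
	int chunkIndex = 0;  
	for (int offset = 0; offset < frameSize; offset += CHUNK_DATA_SIZE) {  
		int dataLen = std::min(CHUNK_DATA_SIZE, frameSize - offset);  
		memcpy(packet, &frameId, 4);        //which frame this chunk belongs to  
		memcpy(packet + 4, &chunkIndex, 4); //where the chunk goes within the frame  
		memcpy(packet + HEADER_SIZE, pixels + offset, dataLen);  
		connection.Send(packet, HEADER_SIZE + dataLen);  
		chunkIndex++;  
	}  
}  

the receiver then uses frameId/chunkIndex to copy each payload to the right offset of a frame buffer, and throws a frame away if any of its chunks never arrive.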

or, perhaps you could look at integrating VLC source somehow. VLC was originally intended to be a networked video player like you need…

good luck!
d

I got a low-res one working pretty nicely… like Damian mentioned, UDP packets are meant to be very small, and I really only needed the low-res one.

Another optimization I made was degrading the image to black and white.

Yet another optimization that could possibly be taken is sending index numbers into a limited palette. So if you had a palette of 256 colours, you would only need to send 1 byte per pixel - same as with a black and white image, except you get a colour image (although limited to a 256-colour palette). Or for 2 bytes/pixel (still less than full RGB), you get 65,536 colours.
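A rough sketch of how that palette encoding could look - the brute-force nearest-colour search and the shared fixed palette are just for illustration:

#include <climits> //INT_MAX  
  
//find the index of the palette entry closest to one RGB pixel  
unsigned char nearestPaletteIndex(const unsigned char* rgb, const unsigned char palette[256][3]) {  
	int best = 0;  
	int bestDist = INT_MAX;  
	for (int i = 0; i < 256; i++) {  
		int dr = rgb[0] - palette[i][0];  
		int dg = rgb[1] - palette[i][1];  
		int db = rgb[2] - palette[i][2];  
		int dist = dr * dr + dg * dg + db * db;  
		if (dist < bestDist) { bestDist = dist; best = i; }  
	}  
	return (unsigned char)best;  
}  
  
//encode a whole RGB frame as palette indices: one byte per pixel, a third  
//of the original payload (sender and receiver must share the same palette)  
void encodeFrame(const unsigned char* pixels, int numPixels, const unsigned char palette[256][3], unsigned char* out) {  
	for (int i = 0; i < numPixels; i++) {  
		out[i] = nearestPaletteIndex(pixels + i * 3, palette);  
	}  
}  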

Beyond that, I would say to take Damian’s suggestion about breaking each frame into a few packets, which of course introduces the challenge of potentially dropped packets, since you need to make sure you have the entire frame - and know which frame you’re on, etc. UDP is meant for sending data quickly, but has no built-in data integrity checking, so there is a strong chance of dropped packets… Orrrr just use one of them fancy video streaming libraries :stuck_out_tongue:

I guess it all depends what kind of quality you are after and how much work you’re ready to put into it. Probably getting one of those video streaming libraries working is the best choice though.

I got this to work. I’m streaming an image from one computer to another on the local network. I start by JPEG-compressing the pixel array I want to send, then I split it up into packets with a max size of 8000 bytes (just a wild guess).
The packets all start with an imageID (a frame count, so the receiver knows when it’s starting on a new image), a packet number and the total amount of bytes, followed by all the payload bytes - 9 bytes of header in total. The UDP packets are sent with a little delay (3 ms) after one another, otherwise they get dropped.

The receiver then builds a buffer that it writes all the image data into, with help from the packet number and the total number of bytes. When it has received all the packets it creates the ofImage and sends a ‘go’ back to the sender, so it can send a new image.

The framerate is pretty good when sending a 640x480 image over wifi, and there is almost no latency!

The code I’m using is written in Objective-C, so it won’t work out of the box in OF, but I can share it if you want.
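In C++ terms the header could look something like the struct below - the exact field widths aren’t stated above, so the 1 + 4 + 4 = 9 byte split is an assumption:

#pragma pack(push, 1) //make sure the struct is exactly 9 bytes, no padding  
struct PacketHeader {  
	unsigned char imageID;     //frame counter, so the receiver can spot a new image  
	unsigned int packetNumber; //position of this packet within the frame  
	unsigned int totalBytes;   //total compressed size, so the receiver knows when the frame is complete  
};  
#pragma pack(pop)  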

hey, if you use UDP you’re going to need to send IDs per frame and per packet, take care of lost/unsent packets, and control the speed of sending (as halfdanj says, you need to wait 3 ms)

all this is already implemented in TCP, so unless you do something super optimized it’s better to directly use TCP and get rid of some problems. if you use ofxTCPServer/Client, be careful to use sendRawBytes/receiveRawBytes instead of send/receive, since the latter are meant for sending simple string messages.
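for example, a minimal sketch assuming a connected ofxTCPClient called "client" and a raw RGB pixel buffer:

//send() wraps data as a delimited string message, so binary pixel data  
//must go through the raw calls  
void sendFrame(ofxTCPClient& client, const unsigned char* pixels, int camWidth, int camHeight) {  
	int numBytes = camWidth * camHeight * 3; //3 bytes per RGB pixel  
	client.sendRawBytes((const char*)pixels, numBytes); //binary-safe  
	//client.send("hello"); //send() is only for simple string messages  
}  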

hey guys, thanks for the feedback.

right now I’m trying to get ffmpeg to work with Xcode; after that I will try to get it to work with openFrameworks and Xcode. The idea is that ffmpeg can then handle compression and streaming. I do have linking problems in Xcode, but I don’t think this is the right forum for that issue.

Has anyone had any experience compiling ffmpeg with openFrameworks in Xcode?

I’ll give ffmpeg a couple more days, and if I can’t get it to work I will just code a TCP client/server setup for the streaming and skip compression the first time around. The problem is that I would like as good quality as possible; I’m aiming for 800x600.

If I get anything to work I will post the solution here.

okay, I just can’t get ffmpeg to work with openFrameworks. The reason is that ffmpeg will only compile for the x86_64 architecture on Snow Leopard (10.6.2), while openFrameworks for Mac is 32-bit, i.e. the i386 architecture.

If anyone has had success with ffmpeg I would really like to know how they did it.

I’ll just code my own streaming classes then, following your suggestions.

I have been trying to stream video from VLC on one computer to an openFrameworks application on the other side, and treat the stream as an ofVideoPlayer object by using the HTTP protocol and doing something like: loadMovie("http://18.133.7.81:8080");

This was the quick-and-dirty approach, without any luck. This thread has kind of a sad ending. Did anyone find a solution that might work on Mac OS 10.6? I’d love to do some installations with a remote camera from multiple locations.

I managed to send video frames between two openframeworks apps using the ofxNetwork addon and the ofxTCPServer and ofxTCPClient classes.

I set the server and client to blocking and put each in a thread (Poco::Activity), and then used read-write locks (Poco::RWLock) to update and read the frames being sent and received. This way I have two wrapper classes around ofxTCPServer and ofxTCPClient, which I called FrameSender and FrameReceiver.

Then for each frame being sent I implemented throttling, on both the server and the client side, because the image frames are too big to be sent in a single packet.

I haven’t had time to implement compression yet, which is much needed if you want to send resolutions bigger than 160x120 over wireless LAN, which is what I need.

Server sending function:

  
  
void FrameSender::runActivity() {  
	while (!activity.isStopped()) {  
		  
		//only the first connected client will be considered: id=0  
		if (server.isClientConnected(0)) {  
			if (rwlock.tryReadLock()) {  
				  
				//Send frame by throttling, ie splitting frame up into chunks (rows) and sending them sequentially  
				const char* index = (const char*)bufferFrame.getPixels(); //start at beginning of pixel array  
				int length = frameWidth * 3; //length of one row of pixels in the image  
				int pixelCount = 0; //counts bytes sent so far (3 bytes per pixel)  
				while (pixelCount < frameSize) {  
					server.sendRawBytes(0, index, length); //send the current row of the image  
					index += length; //advance the pointer so that it points to the next image row  
					pixelCount += length; //increase the byte count by one row  
				}  
				}  
				  
				rwlock.unlock();  
			}  
		}  
		  
		ofSleepMillis(100);  
	}  
}  
  

Client receiving function:

  
  
void FrameReceiver::runActivity() {  
	while (!activity.isStopped()) {  
		if (client.isConnected()) {  
			if (rwlock.tryWriteLock()) {  
				  
				//Receive one row at a time and write it into the memory allocated for the frame  
				//Receiving loop that must ensure a frame is received as a whole  
				char* receivePos = pixels;  
				int length = frameWidth * 3;  
				int totalReceivedBytes = 0;  
				while (totalReceivedBytes < frameSize) {  
					int receivedBytes = client.receiveRawBytes(receivePos, length); //returns number of bytes received  
					if (receivedBytes <= 0) break; //connection error or no data: bail out instead of looping forever  
					totalReceivedBytes += receivedBytes;  
					receivePos += receivedBytes;  
				}  
				  
				rwlock.unlock();  
			}  
		}  
  
		ofSleepMillis(50);  
	}  
}  
  

For the receiver it is important that the pixels array gets allocated properly:

  
  
this->frameSize = frameWidth * frameHeight * 3; //Size of one image frame in bytes  
pixels = new char[frameSize];  
  

Then the received char array gets turned into an ofxCvColorImage like this:

void FrameReceiver::readFrame(ofxCvColorImage* frame) {  
	if (frame->getWidth() < frameWidth || frame->getHeight() < frameHeight) {  
		throw length_error("FrameReceiver::readFrame: The destination frame is smaller than the buffered frame.");  
	}  
  
	if (rwlock.tryReadLock()) {  
		frame->setFromPixels((unsigned char*)pixels, frameWidth, frameHeight);  
		rwlock.unlock();  
	}  
}  

Updating the frame that the server has to send is done in the following function:

void FrameSender::updateFrame(ofxCvColorImage* frame) throw (length_error) {  
	if (frame->getWidth() > frameWidth || frame->getHeight() > frameHeight) {  
		throw length_error("FrameSender::updateFrame: Frame size is bigger than the size of the buffer.");  
	}  
  
	if (rwlock.tryWriteLock()) {  
		bufferFrame.setFromPixels(frame->getPixels(), frame->getWidth(), frame->getHeight());  
		rwlock.unlock();  
	}  
}  

This works locally and over the net between two Macs running OS X 10.6.3. There is a weird issue when sending between Mac and Windows: in this case Windows has to run the server and the Mac the client, otherwise the client messes up the received frames. I haven’t had time to dig into why that could be.

If any of you guys have ideas, tips or hints to how I could implement compression it would be much appreciated.

Though I still wish ffmpeg would work with openframeworks…

Hi rocknrolldk.

This looks interesting.
Do you think it is possible to get your example files?

I tried including it in my code but got a bunch of errors.

Thanks,
Stephan.

Hey Stephan,

the classes I wrote were part of a bigger project, and you would need two networked computers, each with a camera, to run the whole thing.

I can send you the code if you want to have a look at it, but that will be after the 29th of December, as I’m not home until then and I have the project saved on an external disk.

Tell me if you’re interested and then I’ll post a link once I get home.

Cheers.

hey rocknrolldk.

yeah, some code would be nice.

thanks and,
happy holidays.

http://dl.dropbox.com/u/565607/VideoStreamClient.zip

http://dl.dropbox.com/u/565607/VideoStreamClient2.zip

here you go. These are the Xcode projects I was working on. You have to run the two projects at the same time and they will send video frames to each other. For some reason only the first one executed of the two will receive video, while the other will only send. If you run each project on its own machine and have the two machines connected on a LAN (remember to set the IP addresses in testApp.h), they should both send and receive video. I’m writing “should” as I am not able to test it at the moment.

you might need to change the video path in testApp::setup(), and sorry about the Beavis and Butthead testing video :smiley:

Hope my code is useful somehow; feel free to write me if you have any issues. I would like to hear if you get video streaming to work.

Cheers

Hi chrish.

thanks for the code.

i changed the resolution to 640 x 480 and it works fine when running only on my laptop.
when running receiver and sender on two different machines, the received video frames are jumbled up. my guess is that old and new frame data are getting mixed together.

any ideas what i could change? maybe decrease the fps sent by the sender?

thanks again,
stephan.

hi, is there a chance to get the source code for the example up again? i can’t find a way to convert the raw rgb data into jpeg data to send over the network. thanks!

I’ve had more requests for the video streaming code, so I made a permanent link in case anyone else needs it.

http://www.chrishjorth.com/files/ofvideostream.zip

Hi stephan, I had the same issues and never really had time to fix them. I got it to work with a Mac and a Windows machine, but with all the other machines I tried, the behavior was weird. Have you found a solution?

I didn’t use compression in my code; I’m guessing any JPEG compression library could do the trick, as long as you can split the compressed image into chunks. I would like to hear if you find a good solution for adding compression.
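(A pointer for anyone attempting the compression: in newer openFrameworks versions, ofSaveImage/ofLoadImage can target an ofBuffer, which gives in-memory JPEG compression without any extra library. A minimal sketch, assuming that newer API rather than the OF version used above:)

#include "ofMain.h"  
  
//compress a frame to JPEG bytes in RAM, ready to be chunked and sent  
ofBuffer compressFrame(const ofPixels& pixels) {  
	ofBuffer jpegBytes;  
	ofSaveImage(pixels, jpegBytes, OF_IMAGE_FORMAT_JPEG, OF_IMAGE_QUALITY_MEDIUM);  
	return jpegBytes;  
}  
  
//on the receiving side, decode the reassembled JPEG bytes back into pixels  
bool decompressFrame(const ofBuffer& jpegBytes, ofPixels& pixels) {  
	return ofLoadImage(pixels, jpegBytes);  
}  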

rocknrolldk, thanks for the updates and the permanently located code. I got your example up and running quickly and was able to stream video between 2 apps running on the same machine.

I’m wondering how to take this further, streaming across computers on different networks. Basically, setting up a streaming server (from a webcam) that can be viewed from any location. There are many similar webcams that stream online, and also VLC, etc… Any thoughts on how to get this working?

Thanks in advance,
-Matt