Receiving a streaming video in OF via websockets

I have a JS application/server where I am currently grabbing my webcam and streaming the video to a Node server via Binary.js, like this:

            var client = new BinaryClient(settings.socketSrv);
            var stream;

            var imageFrame = receiverContext.getImageData(0, 0, settings.canvasWidth, settings.canvasHeight);
            var userMedia = Modernizr.prefixed('getUserMedia', navigator);

            senderEl.width = settings.canvasWidth;
            senderEl.height = settings.canvasHeight;

            receiverEl.width = settings.canvasWidth;
            receiverEl.height = settings.canvasHeight;

            videoEl.width = settings.canvasWidth;
            videoEl.height = settings.canvasHeight;

            if (!userMedia) {
                // damn, old browser :-(
                return alert('your browser is not supported');
            }

            document.getElementById('message').innerHTML = 'Sending: ' + transferRate + ' KB / Sec<br />';
            document.getElementById('message').innerHTML += 'Receiving: ' + transferRate + ' KB / Sec';

            // the stream is ready

            client.on('open', function (s) {

                stream = client.createStream(s, 'toserver');

                // data coming from the server...
                // we will draw it into the receiver canvas
            });

            // client.on('stream', function (s, meta) {
            //     if (meta === 'fromserver') {
            //         s.on('data', function (data) {
            //
            //             // data is of type 'ArrayBuffer';
            //             // we need to build a Uint8Array out of it
            //             // to be able to access the actual data
            //             var dataArr = new Uint8Array(data);
            //
            //             for (var i = 0, len = dataArr.length; i < len; i++) {
            //                 imageFrame.data[receiverPos] = dataArr[i];
            //                 receiverPos++;
            //                 if (receiverPos % receiverDataLength === 0) {
            //                     receiverPos = 0;
            //                     receiverContext.putImageData(imageFrame, 0, 0);
            //                 }
            //             }
            //         });
            //     }
            // });

            // gets called at a certain interval and grabs the current video frame
            // and draws it into a canvas

            var grabLoop = function () {
                try {
                    senderContext.drawImage(videoEl, 0, 0, settings.canvasWidth, settings.canvasHeight);
                } catch (e) {}

                var imageData = senderContext.getImageData(0, 0, settings.canvasWidth, settings.canvasHeight);
                if (typeof stream !== 'undefined') {
                    stream.write(imageData.data);
                }
                setTimeout(grabLoop, settings.grabRate);
            };

Basically I want to replace the commented-out code with an openFrameworks application which reads in the binary stream and displays the video. I’m looking into accomplishing this with ofxLibwebsockets, but I’m not sure it has the functionality I need. Does anyone know if this addon would work and/or have any suggestions for how to read in the video data?

ofxLibwebsockets should be able to do this – check the examples for a variety of server/client samples for streaming images or video as jpeg binary blobs. Note that the examples that have “blob” in the title also require ofxTurboJpeg.
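For orientation, the server side of those examples boils down to roughly the sketch below. This is from memory rather than the literal example code, so check the addon’s examples for the exact API; the port and the buff / needToLoad members are just placeholders here.

    // ofApp.h (sketch) – requires the ofxLibwebsockets addon
    #pragma once
    #include "ofMain.h"
    #include "ofxLibwebsockets.h"

    class ofApp : public ofBaseApp {
    public:
        void setup();
        void update();
        void draw();

        // listener callbacks that ofxLibwebsockets expects once you call addListener(this)
        void onConnect(ofxLibwebsockets::Event& args){}
        void onOpen(ofxLibwebsockets::Event& args){}
        void onClose(ofxLibwebsockets::Event& args){}
        void onIdle(ofxLibwebsockets::Event& args){}
        void onMessage(ofxLibwebsockets::Event& args);
        void onBroadcast(ofxLibwebsockets::Event& args){}

        ofxLibwebsockets::Server server;
        ofBuffer buff;           // latest jpeg blob received from the browser
        bool needToLoad = false; // set in onMessage, consumed in update()
    };

    // ofApp.cpp (setup only; onMessage would stash the incoming jpeg bytes into buff)
    void ofApp::setup(){
        ofxLibwebsockets::ServerOptions options = ofxLibwebsockets::defaultServerOptions();
        options.port = 9093;       // whatever port your JS client connects to
        server.setup(options);
        server.addListener(this);  // route websocket events to the callbacks above
    }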

I have been exploring the ofxLibwebsockets examples. I found the blob server example, but I didn’t find a JavaScript client for posting an image.

In my case, I am posting a snapshot from my webcam. This is my js client code:

    var ws = new WebSocket("ws://192.168.1.130:9093");
    ws.binaryType = "arraybuffer";

    var ctx = canvas.getContext('2d');
    ctx.drawImage(video, 0, 0);

    var image = ctx.getImageData(0, 0, canvas.width, canvas.height);
    var buffer = new ArrayBuffer(image.data.length);
    var bytes = new Uint8Array(buffer);

    for (var i = 0; i < bytes.length; i++) {
        bytes[i] = image.data[i];
    }

    ws.send(buffer);

On the openFrameworks server side, I’m using the code from the “example_server_blob” example. But when I post the image, I get the following error:

Error in tjDecompressHeader2():
Not a JPEG file: starts with 0xa5 0xb6

What am I doing wrong?

Hi @leefel. It looks like what you’re sending on the client side is raw/uncompressed pixel data (ie think an ofPixels object), but the server is expecting something that has been compressed as a jpeg. You can use the canvas.toDataURL() method to compress the pixel data to a variety of formats (jpeg, png, …). Here’s some more info:

Thank you for your fast response. I actually tried this alternative too:

    var ws = new WebSocket("ws://192.168.1.130:9093");
    ws.binaryType = "arraybuffer";

    var ctx = canvas.getContext('2d');
    ctx.drawImage(video, 0, 0);

    var image = canvas.toDataURL("image/png");
    ws.send(image);

In this case, on the server side, in the onMessage event handler:

    void ofApp::onMessage(ofxLibwebsockets::Event& args) {
        ...
        if (args.isBinary) {
            buff.clear();
            buff.set(args.data.getData(), args.data.size());
            locked = true;
            needToLoad = true;
        }
    }

The message is not recognized as binary, so I tried this, which is probably very wrong:

buff.clear();
buff.set(args.message);

locked = true;
needToLoad = true;

And in this case I get a similar error in openFrameworks:

Error in tjDecompressHeader2():
Not a JPEG file: starts with 0x64 0x61

Hrm, not sure why the isBinary flag isn’t properly set on the server side. BUT you’ve got “image/png” instead of “image/jpeg” on the client side! Try changing that to “image/jpeg” and you might be good to go?
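If it helps to see what’s actually arriving, something along these lines in onMessage should show whether each frame comes in as text or binary and what its first bytes are. This is just a quick debugging sketch using the Event fields already shown in your snippets above:

    void ofApp::onMessage(ofxLibwebsockets::Event& args) {
        if (args.isBinary) {
            // binary frame: a jpeg should start with 0xff 0xd8
            ofLogNotice("ws") << "binary frame, " << args.data.size() << " bytes";
            if (args.data.size() >= 2) {
                const unsigned char* d = (const unsigned char*) args.data.getData();
                ofLogNotice("ws") << "first bytes: " << (int) d[0] << " " << (int) d[1];
            }
        } else {
            // text frame: a data URL from toDataURL() lands here as a string
            ofLogNotice("ws") << "text frame, starts with: " << args.message.substr(0, 20);
        }
    }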

I changed it to image/jpeg:

    var ws = new WebSocket("ws://192.168.1.130:9093");
    ws.binaryType = "arraybuffer";

    var ctx = canvas.getContext('2d');
    ctx.drawImage(video, 0, 0);

    var image = canvas.toDataURL("image/jpeg");
    ws.send(image);

But still, on the server side, in:

    void ofApp::onMessage(ofxLibwebsockets::Event& args) {
        ...
        if (args.isBinary) {

isBinary is false and args.data.size() is 0. The jpeg image is placed in args.message, and I don’t know why. But even if I try to assign the information from message to the buffer:

buff.clear();
buff.set(args.message);

locked = true;
needToLoad = true;

I still have the same error:

Error in tjDecompressHeader2():
Not a JPEG file: starts with 0x64 0x61

In onMessage, I tried returning the same message to the client:

args.conn.send(args.message);

And I assigned it to an image on the page, and it works properly, so it looks like the information sent is correct:

    ws.onmessage = function(e) {
        console.log("Receiving message");
        document.getElementById("streamed").src = e.data;
    }

So, right now the question is why the image is arriving in the message attribute of the event.

Thanks to the response by @robotconscience at https://github.com/robotconscience/ofxLibwebsockets/issues/69#issuecomment-227734567:

The canvas.toBlob() method was the correct way to send the image so that ofxTurboJpeg can read it. canvas.toDataURL() returns a base64 data-URL string, so the browser sends it as a text websocket frame and it ends up in args.message (which is also why the errors above show the data starting with 0x64 0x61 – the ASCII codes for “da” in “data:image/…”). canvas.toBlob() produces actual binary data, so it arrives as a binary frame.

So, the JS client side would be:

    var ws = new WebSocket("ws://192.168.1.130:9093");
    ws.binaryType = "blob";

    var ctx = canvas.getContext('2d');
    ctx.drawImage(video, 0, 0);

    canvas.toBlob(function(blob) {
        ws.send(blob);
    }, "image/jpeg", 1);

And on the server side, it works perfectly with the code from the “example_server_blob” example in ofxLibwebsockets.
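For completeness, the receiving side reduces to roughly the sketch below. This is a condensed outline rather than the literal example_server_blob code: the example decodes with ofxTurboJpeg, while this sketch uses openFrameworks’ own ofLoadImage() on the buffer, and buff, needToLoad and tex are assumed to be ofBuffer / bool / ofTexture members of ofApp.

    // onMessage: just stash the incoming jpeg blob and flag it for the main thread
    // (the real example also uses a 'locked' flag, since onMessage can be called
    // from the websocket thread; omitted here for brevity)
    void ofApp::onMessage(ofxLibwebsockets::Event& args) {
        if (args.isBinary) {
            buff.set(args.data.getData(), args.data.size());
            needToLoad = true;
        }
    }

    void ofApp::update() {
        if (needToLoad) {
            needToLoad = false;
            ofPixels pix;
            // decode the jpeg bytes sent by canvas.toBlob()
            if (ofLoadImage(pix, buff)) {
                tex.loadData(pix);
            }
        }
    }

    void ofApp::draw() {
        if (tex.isAllocated()) {
            tex.draw(0, 0);
        }
    }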


Please send me a copy of this program at noahwalugembe@gmail.com