How to add a new method (isSeeking) to ofxEmscriptenVideoPlayer [solved]?

I made a videoplayer with Emscripten and already changed some code in library_html5video.js for drag and drop and a file browser: Pure-Data-Ofelia-Emscripten/library_html5video.js at main · Jonathhhan/Pure-Data-Ofelia-Emscripten · GitHub

Here is the player: http://videoplayer.handmadeproductions.de/

My issue is that every time the player seeks to a new frame I get a black screen (I want to keep the last frame instead).
With the onseeking and onseeked events I can get the information needed to avoid that, but I do not know how to read that value from ofxEmscriptenVideoPlayer and the main app (I guess I need to add a new method to the addon)…

(I need that value to update the video player conditionally, only once seeking a new frame has finished…)

I think in library_html5video.js it would look like this (it actually prints the messages and returns the value; the problem is that ofVideoPlayer.isSeeking() is not recognized):

    isSeeking = false;
    html5video_player_is_seeking: function(id){
        var player = VIDEO.players[id];
        player.onseeking = (event) => {
            isSeeking = true;
            console.log('Video is seeking a new position.');
        };
        player.onseeked = (event) => {
            isSeeking = false;
            console.log('Video found the playback position it was looking for.');
        };
        return isSeeking;
    },
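One way to avoid re-registering the handlers on every query is to install them once (for example when the player is created) and have the query function only read a flag. A minimal sketch in plain JavaScript; the names `trackSeekState` and `html5videoPlayerIsSeeking` are illustrative, not part of the library, and the C++ side would still need a matching extern declaration plus a wrapper method in ofxEmscriptenVideoPlayer:

```javascript
// Install the seek handlers once, e.g. when the player is created,
// instead of re-registering them on every query.
function trackSeekState(player) {
    player.isSeeking = false;
    player.onseeking = function () {
        player.isSeeking = true;
    };
    player.onseeked = function () {
        player.isSeeking = false;
    };
}

// The query function then just reads the flag.
function html5videoPlayerIsSeeking(player) {
    return !!player.isSeeking;
}
```

That way the returned value is always the current state, rather than whatever the flag happened to be when the handlers were (re)attached.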

Hi, just modify the file that contains that class. It is an addon, so it should be in <your_openframeworks_folder>/addons/ofxEmscriptenVideoPlayer/src/. Save and compile again.

Hi, thank you. I already tried to modify ofxEmscriptenVideoPlayer.cpp and ofxEmscriptenVideoPlayer.h; maybe I was doing something wrong (actually it is part of ofxEmscripten: openFrameworks/addons/ofxEmscripten/src at master · openframeworks/openFrameworks · GitHub). Maybe my issue is that the ofxEmscriptenVideoPlayer methods are called through ofVideoPlayer when using Emscripten (I also tried to add the method to ofBaseTypes and ofVideoPlayer)? Perhaps I should add that I use openFrameworks with Pure Data / Ofelia (GitHub - cuinjune/Ofelia: A real-time cross-platform creative coding tool for multimedia development), which could make it more difficult to add additional methods.

No success with adding the additional method so far (I already added a simpler addon, ofxVolumetrics, to Ofelia, so in theory it works). I guess what makes it more complicated is that ofxEmscriptenVideoPlayer inherits from ofBaseVideoPlayer (sorry for the imprecise terminology):
class ofxEmscriptenVideoPlayer: public ofBaseVideoPlayer{ public: some methods here... private: };
and because isSeeking() is not part of it, videoPlayer:isSeeking() always returns nil/null? And if I try to use ofxEmscriptenVideoPlayer directly, draw() and update() do not work anymore. Maybe there is a simpler way to keep the last frame until the new frame is found, and I do not need to add a new method at all?

Hey @Jona, I’ve been following this thread a bit. Yeah you should be able to keep the old frame around until you get a new one. Maybe something like this:

// in ofApp.h
// the usual stuff, then:
    ofxEmscriptenVideoPlayer player;
    ofTexture texture;

// in ofApp::update()
    player.update();
    // store the texture if the frame is new
    if(player.isFrameNew())
    {
        // dereference the pointer returned by .getTexture()
        // I like a copy here, in case the player overwrites the texture before ofApp.draw()
        texture = *(player.getTexture());
    }

// in ofApp::draw(), draw the same frame stored in the texture until player.isFrameNew() returns true
    texture.draw(0, 0);

I tried to test this, but I don’t have the right ofxEmscripten header files on linux. But I’m thinking this should work great if dereferencing the texture pointer works with assignment.
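The idea boils down to a tiny bit of state: keep whatever frame was drawn last and overwrite it only when a new one arrives. Here is that logic as a plain JavaScript sketch (names are just illustrative, independent of the oF API):

```javascript
// Cache that holds on to the last delivered frame. update() only
// replaces the stored frame when a new one is available; draw()
// always returns the most recent frame it was given.
function makeFrameCache() {
    var lastFrame = null;
    return {
        update: function (isFrameNew, frame) {
            if (isFrameNew) {
                lastFrame = frame; // keep a copy of the new frame
            }
        },
        draw: function () {
            return lastFrame; // old frame survives while seeking
        }
    };
}
```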

Hey @TimChi, thank you very much. Your very simple solution solved my issue, and now it works as expected (and since I was already drawing the video player into an fbo, I just had to move that into the update condition):

if videoPlayer:getPosition() ~= lastPosition and videoPlayer:isLoaded() == true then;
lastPosition = videoPlayer:getPosition();
videoPlayer:update();
fbo:beginFbo();
ofClear(255, 255, 255, 0);
videoPlayer:draw(0, 0, 800, 600);
fbo:endFbo();
end;

https://videoplayer.handmadeproductions.de/


Another issue is that I need to choose a fixed size (800x600 in my case) to avoid crashes if I play a video that is larger than the last one (memory out of bounds error).
It looks like this:

VIDEO.players[player_id].width = 800;
VIDEO.players[player_id].height = 600;
var videoImage = document.createElement( 'canvas' );
videoImage.width = 800;
videoImage.height = 600;
var videoImageContext = videoImage.getContext( '2d' );
// background color if no video present
videoImageContext.fillStyle = '#000000';
videoImageContext.fillRect( 0, 0, videoImage.width, videoImage.height );

instead of (part of the source code: openFrameworks/library_html5video.js at master · openframeworks/openFrameworks · GitHub):

VIDEO.players[player_id].width = this.videoWidth;
VIDEO.players[player_id].height = this.videoHeight;
var videoImage = document.createElement( 'canvas' );
videoImage.width = this.videoWidth;
videoImage.height = this.videoHeight;
var videoImageContext = videoImage.getContext( '2d' );
// background color if no video present
videoImageContext.fillStyle = '#000000';
videoImageContext.fillRect( 0, 0, videoImage.width, videoImage.height );

The disadvantage is that I cannot set the correct aspect ratio of the video (if I do not know beforehand what it is). It would be nice to use the variable video size without errors (or to use a fixed size, but pass this.videoWidth and this.videoHeight to openFrameworks for scaling the video).
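For the scaling idea, the letterbox math is simple enough to keep on either side of the bridge. A small illustrative sketch (not part of the addon): given the real video dimensions and a fixed target, compute the largest draw size that preserves the aspect ratio, plus offsets for centering:

```javascript
// Fit a video of (videoWidth x videoHeight) inside a fixed target,
// preserving aspect ratio (letterbox/pillarbox).
function fitInside(videoWidth, videoHeight, targetWidth, targetHeight) {
    var scale = Math.min(targetWidth / videoWidth, targetHeight / videoHeight);
    var w = videoWidth * scale;
    var h = videoHeight * scale;
    return {
        width: w,
        height: h,
        // offsets to center the scaled video inside the target
        x: (targetWidth - w) / 2,
        y: (targetHeight - h) / 2
    };
}
```

The same computation could be done in oF with the values returned by getWidth()/getHeight(), which is essentially what the fbo:draw() call with the aspect-ratio division does further down in this thread.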

I’ll try and help a bit with this, though I haven’t worked with emscripten at all.

But if you can get each frame into an ofTexture, you can create an ofImage from that and then resize and/or crop the ofImage to whatever you want it to be. So, maybe something like this (in pseudocode):

ofTexture texture;
ofImage image;
ofPixels pixels;
// get the video frame into the texture, then read it back into pixels
texture.readToPixels(pixels);
image.setFromPixels(pixels);
// crop and/or resize the image
image.update();

If the ofTexture is a class variable and if the video size changes, you may need to clear and allocate it to the proper size, which (I’m thinking) you should be able to get from the video player.

@TimChi Thank you again. I will try that this evening. A drawback could be that it is slower, because it passes data from the GPU to memory every frame? Another idea is to read this.videoWidth and this.videoHeight from OF without changing the video player's size (I guess I would need those values with your suggestion too)… What I have to add is that it does not crash when I load videos with different sizes, as long as I do not start to play the video (it still draws the first frame). Maybe it starts to buffer parts of the video once it plays (which is the case with the HTML5 video player, on which ofxEmscriptenVideoPlayer is based)?

Hey you could also try a shader and keep it all on the gpu. Maybe a fragment shader could sample the texture from the video player and set the colors in another texture with a different size. Setting up an oF app to use shaders is kinda handy too, especially if you want to do some post-processing work (blur, sharpen, contrast, bloom, color enhancement, etc).

Hey @TimChi, I already use shaders with the videoPlayer texture. I guess OF should only update the videoPlayer once the new size is available, and not as soon as the new video source is connected (which happens slightly before).
Here is the main code (it looks a bit different because it is written with Pure Data / Ofelia; ofelia.bang() is like draw() in this case):

if type(window) ~= "userdata" then;
window = ofWindow();
end;
;
local canvas = ofCanvas(this);
local clock = ofClock(this, "setup");
local shaderDir = canvas:getDir() .. "/data/";
local videoPlayer = ofVideoPlayer();
local shader = ofShader();
local fbo = ofFbo();
local outputList = {};
;
function ofelia.new();
ofWindow.addListener("setup", this);
ofWindow.addListener("update", this);
ofWindow.addListener("draw", this);
ofWindow.addListener("exit", this);
window:setPosition(30, 100);
window:setSize(840, 860);
if ofWindow.exists then;
clock:delay(0);
else;
window:create();
end;
end;
;
function ofelia.free();
window:destroy();
ofWindow.removeListener("setup", this);
ofWindow.removeListener("update", this);
ofWindow.removeListener("draw", this);
ofWindow.removeListener("exit", this);
end;
;
function ofelia.setup();
ofBackground(0);
ofSetWindowTitle("Video Player");
videoPlayer:setLoopState(OF_LOOP_NORMAL);
videoPlayer:load("dummy");
videoPlayer:setUseTexture(true);
fbo:allocate(800, 600);
ofSetFrameRate(25);
end;
;
function ofelia.glsl(s);
shader:load(shaderDir .. s);
end;
;
function ofelia.setPosition(f);
videoPlayer:setPosition(f);
end;
;
function ofelia.setTempo(f);
videoPlayer:setSpeed(f / 100);
end;
;
function ofelia.stop();
videoPlayer:stop();
videoPlayer:setPosition(0);
end;
;
function ofelia.start(f);
if f == 1 then;
videoPlayer:play();
else;
videoPlayer:stop();
end;
end;
;
function ofelia.setVolume(f);
videoPlayer:setVolume(f / 100);
end;
;
function ofelia.loadVideo();
videoPlayer:load("dummy");
end;
;
function ofelia.update();
if videoPlayer:isLoaded() == true then;
videoPlayer:update();
end;
end;
;
function ofelia.bang();
local mouseX = ofGetMouseX();
local mouseY = ofGetMouseY();
shader:beginShader();
if videoPlayer:getTexture():isAllocated() == true then;
shader:setUniformTexture("Tex0", videoPlayer:getTexture(), 0);
end;
shader:setUniform1f("time", ofGetElapsedTimeMillis());
shader:setUniform2f("resolution", 800, 600);
shader:setUniform2f("center", 800 / 2, 600 / 2);
shader:setUniform2f("mouse", mouseX, mouseY);
;
shader:setUniform1f("Bleach_Opacity", ofelia.Bleach_Opacity);
shader:setUniform1f("Bokeh_Bias", ofelia.Bokeh_Bias / 100);
shader:setUniform1f("Bokeh_Focus", ofelia.Bokeh_Focus / 100);
shader:setUniform1f("Contrast_Contrast", ofelia.Contrast_Contrast / 100);
shader:setUniform1f("Contrast_Multiple", ofelia.Contrast_Multiple / 100);
shader:setUniform1f("Contrast_Brightness", ofelia.Contrast_Brightness / 100);
shader:setUniform1f("CircleWarp_Rotation", ofelia.CircleWarp_Rotation / 100);
shader:setUniform1f("CircleWarp_Radius", ofelia.CircleWarp_Radius / 200);
shader:setUniform1f("GaussianBlur_Radius", ofelia.GaussianBlur_Radius / 100);
shader:setUniform1f("Grain_Intensity", ofelia.Grain_Intensity);
shader:setUniform1f("GreyScale_Factor", ofelia.GreyScale_Factor / 100);
shader:setUniform1f("HSBShift_Hue", ofelia.HSBShift_Hue / 100);
shader:setUniform1f("HSBShift_Brightness", ofelia.HSBShift_Brightness / 100);
shader:setUniform1f("HSBShift_Saturation", ofelia.HSBShift_Saturation / 100);
shader:setUniform1f("Kaleidoscope_Segments", ofelia.Kaleidoscope_Segments);
shader:setUniform1f("Kaleidoscope2_Sides", ofelia.Kaleidoscope2_Sides);
shader:setUniform1f("Kaleidoscope2_Angle", ofelia.Kaleidoscope2_Angle / 100);
shader:setUniform1f("Kaleidoscope2_SlideX", ofelia.Kaleidoscope2_SlideX / 100);
shader:setUniform1f("Kaleidoscope2_SlideY", ofelia.Kaleidoscope2_SlideY / 100);
shader:setUniform1f("MetaImage_MosaicNumber", ofelia.MetaImage_MosaicNumber);
shader:setUniform1f("MetaImage_ColorFade", ofelia.MetaImage_ColorFade / 100);
shader:setUniform1f("MosaicColor_MosaicNumber", ofelia.MosaicColor_MosaicNumber);
shader:setUniform1f("NoiseWarp_Frequency", ofelia.NoiseWarp_Frequency);
shader:setUniform1f("NoiseWarp_Amplitude", ofelia.NoiseWarp_Amplitude / 100);
shader:setUniform1f("NoiseWarp_Speed", ofelia.NoiseWarp_Speed / 10000);
shader:setUniform1f("Pixelate_XPixels", ofelia.Pixelate_XPixels);
shader:setUniform1f("Pixelate_YPixels", ofelia.Pixelate_YPixels);
shader:setUniform1f("PositionShift_RateX", ofelia.PositionShift_RateX / 100);
shader:setUniform1f("PositionShift_RateY", ofelia.PositionShift_RateY / 100);
shader:setUniform1f("RGBShift_Angle", ofelia.RGBShift_Angle / 100);
shader:setUniform1f("RGBShift_Amount", ofelia.RGBShift_Amount / 100);
shader:setUniform1f("VertlTiltShift_V", ofelia.VertlTiltShift_V / 100);
shader:setUniform1f("VertlTiltShift_R", ofelia.VertlTiltShift_R / 100);
fbo:beginFbo();
ofClear(0);
ofDrawRectangle(0, 0, 800, 600);
fbo:endFbo();
shader:endShader();
fbo:draw(20, 20 + (600 - 800 / (videoPlayer:getWidth() / videoPlayer:getHeight())) / 2, 800, 800 / (videoPlayer:getWidth() / videoPlayer:getHeight()) );
outputList[1] = videoPlayer:getPosition();
outputList[2] = videoPlayer:getDuration();
return outputList;
end;
;
function ofelia.exit();
videoPlayer:close();
shader:unload();
fbo:clear();
end;

Here is the version that crashes sometimes (if a video is loaded that is smaller than the last one): https://videoplayerchrome.handmadeproductions.de/


You may already know this, but one thing you can do with the texture in the shader is to .draw() it directly in the fbo instead of the rectangle. The shader will get the texture as tex0 by default. Just .draw() one texture (tex0) and send any others in with .setUniformTexture(). Calling .draw() would let you use any relevant arguments (like width, height, position, etc) and should also provide texcoords from the texture. I’ll often center and scale something in an fbo when I draw it.

Like I think you’ve found, there are probably some nuances about updating the video player before it plays a new video. If you need to, you can always create a new one for each video. Just don’t forget to delete them when you’re done with them.

This looks like a fun project! Lots of interesting stuff happening in the shader from the looks of it!

@TimChi thank you. I adapted those shaders for Pure Data, so basically they are not my own work, but they helped me a lot to learn about shaders. My two main ideas for this project are: to make it possible to compile Pure Data / Ofelia patches easily for the web (WebMIDI is another thing that I tried to embed), and (specific to this patch) to do browser-based algorithmic film / video editing. For now, I try to figure out the basics.


I found a solution for the aspect ratio of the video (not sure if it is a good one, but it works). I use a fixed texture size (800x600) and pass the aspect ratio values separately. For that I had to change some things in ofxEmscriptenVideoPlayer.cpp:

void ofxEmscriptenVideoPlayer::update(){
	gotFirstFrame = pixels.isAllocated();
	if(html5video_player_update(id,pixels.isAllocated() && usePixels,pixels.getData())){
		if(texture.texData.width!=800 || texture.texData.height!=600){
			texture.texData.width = 800;
			texture.texData.height =  600;
			texture.texData.tex_w = texture.texData.width;
			texture.texData.tex_h = texture.texData.height;
			switch(getPixelFormat()){
			case OF_PIXELS_RGBA:
				pixels.allocate(texture.texData.width,texture.texData.height,4);
				break;
			case OF_PIXELS_RGB:
				pixels.allocate(texture.texData.width,texture.texData.height,3);
				break;
			case OF_PIXELS_MONO:
				pixels.allocate(texture.texData.width,texture.texData.height,1);
				break;
			default:
				ofLogError() << "unknown pixel format, can't allocate texture";
				break;
			}
		}
		if(texture.texData.textureID!=html5video_player_texture_id(id)){
			texture.texData.bFlipTexture = false;
			switch(getPixelFormat()){
			case OF_PIXELS_RGBA:
				texture.texData.glInternalFormat = GL_RGBA;
				break;
			case OF_PIXELS_RGB:
				texture.texData.glInternalFormat = GL_RGB;
				break;
			case OF_PIXELS_MONO:
				texture.texData.glInternalFormat = GL_LUMINANCE;
				break;
			default:
				ofLogError() << "unknown pixel format, can't allocate texture";
				break;
			}
			texture.texData.tex_u = 1;
			texture.texData.tex_t = 1;
			texture.texData.textureTarget = GL_TEXTURE_2D;
			texture.texData.bAllocated = true;
			texture.setUseExternalTextureID(html5video_player_texture_id(id));
		}
	}
}
float ofxEmscriptenVideoPlayer::getWidth() const{
	return html5video_player_width(id);
}

float ofxEmscriptenVideoPlayer::getHeight() const{
	return html5video_player_height(id);
}

and in library_html5video.js:

var LibraryHTML5Video = {
    $VIDEO: {
        players: [],
        playersContexts: [],
        playersCounter: 0,

        getNewPlayerId: function() {
          var ret = VIDEO.playersCounter++;
          return ret;
        },

        grabbers: [],
        grabbersContexts: [],
        grabbersCounter: 0,

        getNewGrabberId: function() {
          var ret = VIDEO.grabbersCounter++;
          return ret;
        },

        getUserMedia: function(){
        	return navigator.mediaDevices.getUserMedia ||
        	    navigator.mediaDevices.webkitGetUserMedia ||
        	    navigator.mediaDevices.mozGetUserMedia ||
        	    navigator.mediaDevices.msGetUserMedia;
        },

        update: function(updatePixels, video, context, dstPixels){
        	if((updatePixels || video.pixelFormat!="RGBA") && video.width!=0 && video.height!=0 && dstPixels!=0){
        		try {
	            	context.drawImage( video, 0, 0, video.width, video.height );
	            	imageData = context.getImageData(0,0,video.width,video.height);
	            	srcPixels = imageData.data;
	            	if (video.pixelFormat=="RGBA"){		            	
		            	array = Module.HEAPU8.subarray(dstPixels, dstPixels+(video.width*video.height*4));
				array.set(imageData.data);
	            	}else if(video.pixelFormat=="RGB"){
		            	array = Module.HEAPU8.subarray(dstPixels, dstPixels+(video.width*video.height*3));
		            	for(var i=0, j=0; i<array.length; ){
		            		array[i++] = srcPixels[j++];
		            		array[i++] = srcPixels[j++];
		            		array[i++] = srcPixels[j++];
		            		++j;
		            	}
		                GLctx.bindTexture(GLctx.TEXTURE_2D, GL.textures[video.textureId]);
		                GLctx.texImage2D(GLctx.TEXTURE_2D, 0, GLctx.RGB, video.width, video.height, 0, GLctx.RGB, GLctx.UNSIGNED_BYTE, array);
		                GLctx.bindTexture(GLctx.TEXTURE_2D, null);
	            	}else if(video.pixelFormat=="GRAY"){
		            	array = Module.HEAPU8.subarray(dstPixels, dstPixels+(video.width*video.height));
		            	for(var i=0, j=0; i<array.length; ){
		            		array[i++] = (((srcPixels[j++]|0) << 1) + ((srcPixels[j]|0) << 2) + (srcPixels[j++]|0) + (srcPixels[j++]|0)) >> 3;
		            		++j;
		            	}

		                GLctx.bindTexture(GLctx.TEXTURE_2D, GL.textures[video.textureId]);
		                GLctx.texImage2D(GLctx.TEXTURE_2D, 0, GLctx.LUMINANCE, video.width, video.height, 0, GLctx.LUMINANCE, GLctx.UNSIGNED_BYTE, array);
		                GLctx.bindTexture(GLctx.TEXTURE_2D, null);
	            	}
        		}catch(e){console.log(e);}
        	}
        	if (video.pixelFormat=="RGBA"){
                GLctx.bindTexture(GLctx.TEXTURE_2D, GL.textures[video.textureId]);
                GLctx.texImage2D(GLctx.TEXTURE_2D, 0, GLctx.RGBA, GLctx.RGBA, GLctx.UNSIGNED_BYTE, video);
                GLctx.bindTexture(GLctx.TEXTURE_2D, null);
        	}
        }
    },

    html5video_player_create: function(){
        var video = document.createElement('video');
        var player_id = VIDEO.getNewPlayerId();
        VIDEO.players[player_id] = video;
	var texId = GL.getNewId(GL.textures);
	var texture = GLctx.createTexture();
	texture.name = texId;
	GL.textures[texId] = texture;
	GLctx.bindTexture(GLctx.TEXTURE_2D, texture);
	GLctx.texParameteri(GLctx.TEXTURE_2D, GLctx.TEXTURE_MAG_FILTER, GLctx.LINEAR);
	GLctx.texParameteri(GLctx.TEXTURE_2D, GLctx.TEXTURE_MIN_FILTER, GLctx.LINEAR);
	GLctx.texParameteri(GLctx.TEXTURE_2D, GLctx.TEXTURE_WRAP_S, GLctx.CLAMP_TO_EDGE);
	GLctx.texParameteri(GLctx.TEXTURE_2D, GLctx.TEXTURE_WRAP_T, GLctx.CLAMP_TO_EDGE);
	VIDEO.players[player_id].textureId = texId;
		
	window.ondragover = function(e) {
	e.preventDefault();
        }
	window.ondrop = function(e) {
	e.preventDefault();
        console.log("Reading...");
        var length = e.dataTransfer.items.length;
        if (length > 1) {
            console.log("Please only drop 1 file.")
        } else {
            upload(e.dataTransfer.files[0])
        }
    }

    function upload(file) {
        if (file.type.match(/video\/*/)) {
            URL.revokeObjectURL(VIDEO.players[player_id].src)
            VIDEO.players[player_id].src = URL.createObjectURL(file);
            console.log("This file seems to be a video.")
            } else {
                console.log("This file does not seem to be a video.")
            }
        }

        video.onloadedmetadata = function (e){
            console.log(this.videoWidth + 'x' + this.videoHeight);
            VIDEO.players[player_id].returnWidth = this.videoWidth;
	    VIDEO.players[player_id].returnHeight = this.videoHeight;
            VIDEO.players[player_id].width = 800;
            VIDEO.players[player_id].height = 600;
	    var videoImage = document.createElement( 'canvas' );
	    videoImage.width = 800;
	    videoImage.height = 600;
	    var videoImageContext = videoImage.getContext( '2d' );
	    // background color if no video present
	    videoImageContext.fillStyle = '#000000';
	    videoImageContext.fillRect( 0, 0, videoImage.width, videoImage.height );
	    VIDEO.playersContexts[player_id] = videoImageContext;
	    VIDEO.players[player_id].currentTime = 0;
	    }

	return player_id;
    },

    html5video_player_delete: function(id){
    	VIDEO.players[id] = null;
    },
    html5video_player_load: function(id,src){     
	var input = document.createElement("input");
	input.type = "file";
	input.onchange = function (e){
        var file = e.target.files[0];
        if (file.type.match(/video\/*/)) {
            URL.revokeObjectURL(VIDEO.players[id].src)
            VIDEO.players[id].src = URL.createObjectURL(file);
            console.log("This file seems to be a video.")
        } else {
            console.log("This file does not seem to be a video.")
        }
    }
    input.click()
    },

    html5video_player_pixel_format: function(id){
        return allocate(intArrayFromString(VIDEO.players[id].pixelFormat), 'i8', ALLOC_STACK);
    },

    html5video_player_set_pixel_format: function(id, format){
        VIDEO.players[id].pixelFormat = UTF8ToString(format);
    },

    html5video_player_update__deps: ['$GL'],
    html5video_player_update: function(id,update_pixels,pixels){
        var player = VIDEO.players[id];
        var array;
        var imageData;
        var data;
        if ( player.readyState === player.HAVE_ENOUGH_DATA ) {
        	VIDEO.update(update_pixels, player, VIDEO.playersContexts[id], pixels);
            return true;
        }else{
        	return false;
        }
    },

    html5video_player_texture_id: function(id){
        return VIDEO.players[id].textureId;
    },

    html5video_player_width: function(id){
        return VIDEO.players[id].returnWidth;
    },

    html5video_player_height: function(id){
        return VIDEO.players[id].returnHeight;
    },

    html5video_player_play: function(id){
        VIDEO.players[id].play();
    },

    html5video_player_pause: function(id){
        VIDEO.players[id].pause();
    },

    html5video_player_stop: function(id){
        VIDEO.players[id].pause();
    },

    html5video_player_is_paused: function(id){
        return VIDEO.players[id].paused;
    },

    html5video_player_ready_state: function(id){
        return VIDEO.players[id].readyState;
    },

    html5video_player_duration: function(id){
        return VIDEO.players[id].duration;
    },

    html5video_player_current_time: function(id){
        return VIDEO.players[id].currentTime;
    },

    html5video_player_set_current_time: function(id, time) {
        if ( VIDEO.players[id].readyState === VIDEO.players[id].HAVE_ENOUGH_DATA ) {
            VIDEO.players[id].currentTime = time;
        } 
    },

    html5video_player_ended: function(id){
        return VIDEO.players[id].ended;
    },

    html5video_player_playback_rate: function(id){
        return VIDEO.players[id].playbackRate;
    },

    html5video_player_set_playback_rate: function(id,rate){
        VIDEO.players[id].playbackRate = rate;
    },

    html5video_player_volume: function(id){
        return VIDEO.players[id].volume;
    },

    html5video_player_set_volume: function(id,volume){
        VIDEO.players[id].volume = volume;
    },

    html5video_player_set_loop: function(id,loop){
        VIDEO.players[id].loop = loop;
    },

    html5video_player_loop: function(id){
        return VIDEO.players[id].loop;
    },

    html5video_grabber_create: function(){

	        var video = document.createElement('video');
			video.autoplay=true;
			video.pixelFormat = "RGB";

	        var grabber_id = VIDEO.getNewGrabberId();
	        VIDEO.grabbers[grabber_id] = video;
	        return grabber_id;

    },

    html5video_grabber_init__deps: ['$GL'],
    html5video_grabber_init: function(id, w, h, framerate){
    	if(id!=-1){
        	VIDEO.grabbers[id].width = w;
        	VIDEO.grabbers[id].height = h;

    	    var videoImage = document.createElement( 'canvas' );
    	    videoImage.width = w;
    	    videoImage.height = h;

    	    var videoImageContext = videoImage.getContext( '2d' );
    	    // background color if no video present
    	    videoImageContext.fillStyle = '#000000';
    	    videoImageContext.fillRect( 0, 0, w, h );

    	    VIDEO.grabbersContexts[id] = videoImageContext;

    		var errorCallback = function(e) {
    			console.log('Couldn\'t init grabber!', e);
    		};

    		if(framerate==-1){
    			var constraints = {
	    			video: {
		    		    mandatory: {
		    		        maxWidth: w,
		    		        maxHeight: h
		    		    }
	    		    }
    			};
    		}else{
    			var constraints = {
	    			video: {
		    		    mandatory: {
		    		        maxWidth: w,
		    		        maxHeight: h,
		    		    },
    					optional: [
    					    { minFrameRate: framerate }
		    		    ]
	    		    }
    			};
    		}

        navigator.mediaDevices.getUserMedia(constraints)
        .then(function(stream) {
          window.stream = stream;
          VIDEO.grabbers[id].srcObject = stream
          VIDEO.grabbers[id].onloadedmetadata = function (e){
            VIDEO.grabbers[id].play();
          }
        })
        .catch(function(err) {
          console.log(err);
        });

    	}
    },

    html5video_grabber_pixel_format: function(id){
        return allocate(intArrayFromString(VIDEO.grabbers[id].pixelFormat), 'i8', ALLOC_STACK);
    },

    html5video_grabber_set_pixel_format: function(id, format){
        VIDEO.grabbers[id].pixelFormat = UTF8ToString(format);
    },

    html5video_grabber_update__deps: ['$GL'],
    html5video_grabber_update: function(id,update_pixels,pixels){
        var grabber = VIDEO.grabbers[id];
        if ( grabber.readyState >= grabber.HAVE_METADATA ) {
        	VIDEO.update(update_pixels, grabber, VIDEO.grabbersContexts[id], pixels);
            return true;
        }else{
        	return false;
        }
    },

    html5video_grabber_texture_id: function(id){
        return VIDEO.grabbers[id].textureId;
    },

    html5video_grabber_width: function(id){
        return VIDEO.grabbers[id].width;
    },

    html5video_grabber_height: function(id){
        return VIDEO.grabbers[id].height;
    },

    html5video_grabber_ready_state: function(id){
        return VIDEO.grabbers[id].readyState;
    },


}


autoAddDeps(LibraryHTML5Video, '$VIDEO');
mergeInto(LibraryManager.library, LibraryHTML5Video);
