video green-screen

I’ve been adding a green-screen/chroma-key feature to video and live-cam content in my little projection mapping tool (see http://www.hv-a.com/lpmt).
The green-screen color can be adjusted by the user while the program is running, and there is an adjustable threshold as well.
I’ve implemented it like this:

this code is run in the update() function for each frame:

  
// video greenscreen stuff
if (videoGreenscreen) {
    // get the video frame's pixel array (RGB, 3 bytes per pixel)
    videoPixels = video.getPixels();
    // check each pixel for a greenscreen color match
    for (int i = 0; i < videoWidth*videoHeight; i++) {
        int deltaR = abs(videoPixels[i*3+0] - colorGreenscreen.r*255);
        int deltaG = abs(videoPixels[i*3+1] - colorGreenscreen.g*255);
        int deltaB = abs(videoPixels[i*3+2] - colorGreenscreen.b*255);
        if (deltaR <= thresholdGreenscreen && deltaG <= thresholdGreenscreen && deltaB <= thresholdGreenscreen) {
            videoAlphaPixels[i*4+3] = 0;
        } else {
            videoAlphaPixels[i*4+3] = 255;
        }
        // RGB data is copied untouched
        videoAlphaPixels[i*4+0] = videoPixels[i*3+0];
        videoAlphaPixels[i*4+1] = videoPixels[i*3+1];
        videoAlphaPixels[i*4+2] = videoPixels[i*3+2];
    }
    videoTex.loadData(videoAlphaPixels, videoWidth, videoHeight, GL_RGBA);
}

Basically, for each frame I extract the pixel array and cycle through it, checking whether each pixel matches the current user-configured greenscreen color (adjustable through a slider in the GUI), taking the user-configured threshold into account as well (the abs(diff) trick is the easiest way I could think of).
If a pixel falls within the range, its alpha value is set to zero, and the pixels with alpha are then loaded into a new texture in GL_RGBA mode.
In the draw() function, if the greenscreen bool is set, the GL_RGBA texture is drawn; otherwise the original GL_RGB video texture is used.
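The per-pixel match in update() can be factored into a small standalone helper (the function name and signature below are illustrative, not part of LPMT), which keeps the threshold logic in one place and makes it easy to test outside of openFrameworks:

```cpp
#include <cstdlib>  // std::abs

// Alpha for one pixel: 0 (transparent) if every channel is within
// `threshold` of the key color, 255 (opaque) otherwise.
// All values are 0-255 ints; a float key color would be scaled by 255 first.
inline int chromaKeyAlpha(int r, int g, int b,
                          int keyR, int keyG, int keyB,
                          int threshold) {
    return (std::abs(r - keyR) <= threshold &&
            std::abs(g - keyG) <= threshold &&
            std::abs(b - keyB) <= threshold) ? 0 : 255;
}
```

The inner loop body would then reduce to `videoAlphaPixels[i*4+3] = chromaKeyAlpha(...)` plus the three RGB copies.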

This works reasonably well with a certain number of simultaneous medium-sized videos, but being rather CPU-intensive, I guess it could cause slowdowns in the case of hi-res videos.
So I was wondering if there is some OpenGL-fu, or any other more efficient solution, that could do this, and I would love any tip or help about it.
It should allow the greenscreen color and the threshold to be adjusted while the program is running, and it should affect a single video texture separately, not the whole OpenGL window.
I’m a newbie with C++ and OF, and I’m rather confused even by OpenGL basics, so any help will be warmly welcome! :slight_smile:
thanks in advance

[quote author=“hv_francesco”]
So I was wondering if there is some OpenGL-fu, or any other more efficient solution, that could do this, and I would love any tip or help about it.
thanks in advance[/quote]

A simple shader seems like a straightforward improvement and lets you move your logic to the GPU; that’s the first thing I would do to make it more efficient.

[quote author=“boba”]
A simple shader seems like a straightforward improvement and lets you move your logic to the GPU; that’s the first thing I would do to make it more efficient.[/quote]

Good tip! I’ve been looking into it and it seems it’s not as difficult as I thought.
There are nice examples, and I have a first working proof of concept doing chromakey with a shader.
I’m a little concerned about instability of the FBO and shader addons as they get integrated into the main OF source (if I haven’t got that wrong), but I have it working nicely, at least on 062.

thanks for the suggestion.

Glad it helped. Did you use ofxShader? What is the efficiency gain like?

Thanks.

[quote author=“boba”]
Glad it helped. Did you use ofxShader? What is the efficiency gain like?

Thanks.[/quote]

From a first quick test, on my system it’s between 1.8 and 2.0 times faster, nice!
:smiley:

It seems to work well on my desktop with an NVIDIA GPU, but not on my laptop with an Intel board.
The Intel chip is supposed to support shaders, but they don’t compile on it: the program runs just fine and the FBO texture is used, but there’s no chromakey.
This is the fragment shader I’m using:

  
  
#extension GL_ARB_texture_rectangle : enable

uniform sampler2DRect src_tex_unit0;
uniform float greenscreenR;
uniform float greenscreenG;
uniform float greenscreenB;
uniform float greenscreenT;

void main( void )
{
    vec2 st = gl_TexCoord[0].st;
    vec4 sample = texture2DRect(src_tex_unit0, st);
    gl_FragColor = sample;
    if ((abs(sample.r - greenscreenR) < greenscreenT) &&
        (abs(sample.g - greenscreenG) < greenscreenT) &&
        (abs(sample.b - greenscreenB) < greenscreenT)) {
        gl_FragColor.a = 0.0;
    }
}

2 quick thoughts:

The Intel integrated card might not support ARB in shaders.

  
vec2 st = gl_TexCoord[0].st;  

this should probably be

  
vec2 st = gl_TexCoord[0].xy;  

ARB does xy i.e. pixel coords, not st, i.e. normalized.
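One caveat on the wording above: in GLSL, `.st` and `.xy` are interchangeable swizzle names for the same two components, so the swizzle itself is not the issue. The real distinction is the coordinate range the sampler expects: `sampler2DRect` takes pixel coordinates (0..width, 0..height), while `sampler2D` takes normalized coordinates (0..1). Converting between the two conventions is just a scale; a hypothetical helper, not tied to any OF API:

```cpp
// Texture coordinate conversion between the two conventions:
// rectangle (ARB) textures sample with pixel coordinates,
// ordinary 2D textures with normalized 0..1 coordinates.
struct TexCoord { float s, t; };

inline TexCoord toNormalized(TexCoord pixel, float w, float h) {
    return { pixel.s / w, pixel.t / h };
}

inline TexCoord toPixel(TexCoord norm, float w, float h) {
    return { norm.s * w, norm.t * h };
}
```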

[quote author=“joshuajnoble”]2 quick thoughts:

The Intel integrated card might not support ARB in shaders.
[/quote]

I guess you’re right… I should maybe rewrite things without ARB…
That means having to go with normalized coords, right?
And OF uses ARB textures by default, I think…
I fear it would probably be rather over-complicated for my present skills… Damn :slight_smile: I guess I won’t have GPU-based green-screening on the laptop.
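For what it’s worth, the non-ARB route may be less work than it sounds. Here is a sketch (untested) of the same shader against an ordinary sampler2D, assuming ofDisableArbTex() is called before the texture is allocated so that texture coordinates come through normalized:

```glsl
uniform sampler2D src_tex_unit0;
uniform float greenscreenR;
uniform float greenscreenG;
uniform float greenscreenB;
uniform float greenscreenT;

void main(void)
{
    // with sampler2D, texture coordinates are already normalized (0..1)
    vec2 st = gl_TexCoord[0].st;
    vec4 sample = texture2D(src_tex_unit0, st);
    gl_FragColor = sample;
    if (abs(sample.r - greenscreenR) < greenscreenT &&
        abs(sample.g - greenscreenG) < greenscreenT &&
        abs(sample.b - greenscreenB) < greenscreenT) {
        gl_FragColor.a = 0.0;
    }
}
```

The only changes from the ARB version are the sampler type, the texture2D() call, and dropping the extension directive.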

[quote author=“joshuajnoble”]

  
vec2 st = gl_TexCoord[0].st;  

this should probably be

  
vec2 st = gl_TexCoord[0].xy;  

ARB does xy i.e. pixel coords, not st, i.e. normalized.[/quote]

Fixed that, thanks!

If you want to use non-ARB textures just call ofDisableArbTex(). Does it not work with the xy texture coords?