ofxShader examples

Here are a couple of shader examples that people might find useful. I thought this thread could be used as a place to post shader example projects.

Both of these examples are fragment shaders, i.e. shaders that operate on texture/pixel data.
They both use FBOs to render something to a texture, which then has the shader applied.

**Note: these both use the non-power-of-two ARB rectangle textures that OF uses by default. This means that texture coordinates are in pixels, not the 0.0 to 1.0 range.**

1) Blur Shader

The first is a blur shader. It does the blur in two passes as described here:
http://www.gamerendering.com/2008/10/11-…-er-shader/

It also ping-pongs the shader - which is another way of saying that it runs the shader multiple times, rendering from FBO1 → FBO2 → FBO1, etc. By ping-ponging the shader you get much nicer results.

Using it is as simple as:

  
	  
blur.setBlurParams(4, (float)mouseX / 100.0);
blur.beginRender();

    //DRAW SOME STUFF HERE

blur.endRender();
blur.draw(0, 0, 640, 480, true);
  

To get a sort of glow effect, try playing with the averaging of the pixel values in the .frag files:
changing color /= 25.0; to color /= 23.0;, for example, gives things a slight bloom / glow effect.
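For reference, here is a sketch of what the horizontal blur pass could look like. This is a reconstruction, not the exact .frag shipped with the example: the 1-2-3-4-5-4-3-2-1 tap weights are inferred from the color /= 25.0 divisor (they sum to 25), and the uniform names follow the ones that appear elsewhere in this thread.

```glsl
#extension GL_ARB_texture_rectangle : enable

uniform sampler2DRect src_tex_unit0;
uniform float blurAmnt; // tap spacing in pixels (ARB rect texcoords are in pixels)

void main()
{
	vec2 st = gl_TexCoord[0].st;
	vec4 color = vec4(0.0);

	// 9 taps, weighted 1-2-3-4-5-4-3-2-1; the weights sum to 25,
	// which is where the color /= 25.0 comes from
	color += 1.0 * texture2DRect(src_tex_unit0, st + vec2(-4.0 * blurAmnt, 0.0));
	color += 2.0 * texture2DRect(src_tex_unit0, st + vec2(-3.0 * blurAmnt, 0.0));
	color += 3.0 * texture2DRect(src_tex_unit0, st + vec2(-2.0 * blurAmnt, 0.0));
	color += 4.0 * texture2DRect(src_tex_unit0, st + vec2(-1.0 * blurAmnt, 0.0));
	color += 5.0 * texture2DRect(src_tex_unit0, st);
	color += 4.0 * texture2DRect(src_tex_unit0, st + vec2( 1.0 * blurAmnt, 0.0));
	color += 3.0 * texture2DRect(src_tex_unit0, st + vec2( 2.0 * blurAmnt, 0.0));
	color += 2.0 * texture2DRect(src_tex_unit0, st + vec2( 3.0 * blurAmnt, 0.0));
	color += 1.0 * texture2DRect(src_tex_unit0, st + vec2( 4.0 * blurAmnt, 0.0));

	color /= 25.0;
	gl_FragColor = color;
}
```

The vertical pass would be identical except the offsets go into the y component of the vec2.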

2) Zoom / Magnify shader

This one is a little less involved. You render some graphics into an FBO and then specify a circle somewhere on the screen with some zoom parameters. The shader then operates only on the area in the circle and transforms those pixels with a magnifying glass sort of effect.
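Roughly, the idea in GLSL looks something like this (the uniform names circlePos, circleRadius, and zoomAmnt here are illustrative, not necessarily the names the example actually uses):

```glsl
#extension GL_ARB_texture_rectangle : enable

uniform sampler2DRect src_tex_unit0;
uniform vec2  circlePos;     // circle center, in pixels (illustrative name)
uniform float circleRadius;  // circle radius, in pixels (illustrative name)
uniform float zoomAmnt;      // e.g. 2.0 = 2x magnification (illustrative name)

void main()
{
	vec2 st = gl_TexCoord[0].st;
	vec2 d  = st - circlePos;

	// inside the circle, sample closer to the center -> magnified image
	if (length(d) < circleRadius) {
		st = circlePos + d / zoomAmnt;
	}

	gl_FragColor = texture2DRect(src_tex_unit0, st);
}
```

Pixels outside the circle fall through to a plain texture lookup, so only the circular region gets the magnifying-glass effect.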

Here are 006 Xcode projects for both examples. Should be easy to hook up for Code::Blocks or VS too.

shaderBlurExample.zip

shaderZoomExample.zip

Really helpful shader tutorial,
thanks a bunch :slight_smile:

Thank you very much

but it appears sampler2DRect requires “#140
I am on a PC with an NVIDIA GeForce 9500 GT

sampler2DRect is really old as far as I know, so it should easily work on a decent card.

It should totally work on an NVIDIA 9500GT - works on my NVIDIA 9400 which is a cheaper card.

You might not have your NVIDIA drivers installed.

I personally like getting my nvidia drivers from laptopvideo2go.com - http://laptopvideo2go.com/drivers

But you could also grab them from the nvidia site.

Theo

that zoom shader is awesome!!!

nice work theo!

thanks Theo for sharing your shader examples.

It has allowed me to quickly put together some code that uses a light scattering shader.
All the info on how to do it, and the shader itself, comes from here.
This example is mixed with some OpenGL lighting code from the OpenGL blue book.

It’s working, but I cannot run it faster than 30fps (on a MacBook Pro - NVIDIA GeForce 9600M GT).
What would be the possible optimizations?

Here is the 006 xcode project. (I was not able to attach it)

Nice example!
Thanks for posting it.

It seems like the slow part is having each pixel in your texture look at 100 other pixels to get its color:

  
for (int i = 0; i < NUM_SAMPLES; i++)
{
	textCoo -= deltaTextCoord;
	// ... sample the texture at textCoo and accumulate into the color ...
}

Dropping NUM_SAMPLES down to 20 makes the fps shoot up.
Maybe there could be a way to do it where you ping-pong the FBO / effect and do more passes, but with fewer NUM_SAMPLES per pass?

Hrm, tried compiling these examples for Windows / Code::Blocks and got a few strange results.

For the blur shader, I got a really dark red image (not blurred).
For the zoom shader, I appear to get a normal, or perhaps slightly tinted, webcam image.

The app reports that it’s compiling and running the shader correctly, but obviously it’s not.

OS is Windows Vista. Hardware is an ATI X1950 Pro.

Anyone have any ideas, or has anyone found similar results?

I’ve got the same problem. I’ve tested the blur program on my laptop (integrated Intel) and on a desktop PC with an NVIDIA 7600. I see the webcam but it’s very dark and red. And no blur at all.

Hi, I tried to use it on Windows but it crashes with a memory error. Can I use it on Windows, or is it only for Mac? :smiley:

Got the fix.

You have to comment out the “return” in ofxShader, as follows:
in glGetInfoLogARB(vertexShader …
and
glGetInfoLogARB(fragmentShader …

In fact, only the vertex shader was loaded:

glGetInfoLogARB(vertexShader, 999, &infobufferlen, infobuffer);
if (infobufferlen != 0){
	infobuffer[infobufferlen] = 0;
	printf("vertexShader reports: %s \n", infobuffer);
	// return;
}

Thanks to the writer of the original code!

I commented this out, but it does the same: the zoom doesn’t work, and the blur appears as a red image…

Any ideas?
I’m working with Windows and Code::Blocks…

thx

Hi all

I read about the problem of

[quote author=“fishkingsin”]Thank you very much
but it appears sampler2DRect requires “#140
I am on a PC with an NVIDIA GeForce 9500 GT[/quote]

I have the same problem, and it looks like

[quote author=“drosen”]Hrm, tried compiling these examples for Windows / Code::Blocks and got a few strange results.

For the blur shader, I got a really dark red image (not blurred).
For the zoom shader, I appear to get a normal, or perhaps slightly tinted, webcam image.

The app reports that it’s compiling and running the shader correctly, but obviously it’s not.

OS is Windows Vista. Hardware is an ATI X1950 Pro.

Anyone have any ideas, or has anyone found similar results?[/quote][quote author=“Netich”]I’ve got the same problem. I’ve tested the blur program on my laptop (integrated Intel) and on a desktop PC with an NVIDIA 7600. I see the webcam but it’s very dark and red. And no blur at all.[/quote]

and found that the solution is simple.

On Windows you need to put

  
#extension GL_ARB_texture_rectangle : enable  

before

  
uniform sampler2DRect src_tex_unit0;  

in all the .frag files (in Theo’s examples: simpleBlurHorizontal.frag, simpleBlurVertical.frag, and zoom.frag), and everything works OK.

REMEMBER

  
#extension GL_ARB_texture_rectangle : enable  
uniform sampler2DRect src_tex_unit0;  

Theo, can you make this change on Mac and tell me if it works? I only work on Windows, but I read something about this for Mac.

thx for all :wink:

Hey thanks for the fix!
Works on Mac - attached are the two updated examples (for 0.06).

_shaderBlurExampleUpdated.zip

_shaderZoomExampleUpdated.zip

hey theo, hey Kbronsito,

thanks for the nice examples.

I have the same problem under Windows, even with the new shader files:
just a dark red circle and nothing blurred.

any ideas?

edit: after reading this thread once again, another question:
is an NVIDIA card required? I just have Intel onboard graphics.

greetz,
fxlange

yes, if you have an onboard Intel card you most likely don’t have shader support - but I could be wrong, I haven’t followed Intel onboard chips in a while.

you can check out this OpenGL extension / capabilities viewer (there are several apps like this for Windows), which should give you a sense of what your card can do:

http://www.ozone3d.net/gpu-caps-viewer/

hope that helps…

take care,
zach

thanks zach,

yes, that helps. Useful tool.

I will try it tomorrow on another PC (with an NVIDIA card).

greetz

Thanks for the examples!

I tried to get the blur shader example to work with openFrameworks 0.061 on Ubuntu 9.10 64-bit, using the circle (not the camera).

I think this might be a tiny bug in the blur shader (simpleBlurVertical.frag, line 19):

  
  
color += 5.0 * texture2DRect(src_tex_unit0, st + vec2(0.0, blurAmnt) );  
  

Should be:

  
  
color += 5.0 * texture2DRect(src_tex_unit0, st + vec2(0.0, 0.0) );  
  

Also, for me, I needed to add the following, or the pixel wouldn’t show if its alpha value was under 1.0:

  
  
color.a = 1.0;  
  

Hi,

I’m having some trouble getting the blur example to compile with 0.061 on Snow Leopard. I’m getting a bunch of errors saying it can’t find fmod.h, which is odd because I have that file in my directory and, besides, I can’t tell where this example even uses it. Any ideas?

Thanks,
Zach