Bokeh (camera lens blur) shader

Hi, I just came up with a shader that simulates bokeh blur, where brighter spots produce circles in an unfocused image:


// vertex shader
void main(void) {
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_TexCoord[0] = gl_MultiTexCoord0;
}


// fragment shader
uniform sampler2DRect tex;
uniform float max_radius;

void main(void) {
    vec4 finalColor = vec4(0.0, 0.0, 0.0, 1.0);
    float weight = 0.0;
    int radius = int(max_radius);
    for (int x = -radius; x < radius; x++) {
        for (int y = -radius; y < radius; y++) {
            vec2 coord = gl_TexCoord[0].xy + vec2(x, y);
            // only sample inside a disc, so highlights bloom into circles
            if (distance(coord, gl_TexCoord[0].xy) < float(radius)) {
                vec4 texel = texture2DRect(tex, coord);
                // weight each sample by its brightness (+0.1 so dark texels still count)
                float w = length(texel.rgb) + 0.1;
                finalColor += texel * w;
                weight += w;
            }
        }
    }
    gl_FragColor = finalColor / weight;
}

It could probably be optimized a bit. I bet this could be combined with a Kinect depth image to control the blur amount and produce a nice depth of field too.

![](shader.png)


Nice, but this is applied to the entire texture, right? We'll be following the development of your idea!

Yes, you specify the texture to use as a uniform. In this case I used a videoGrabber’s texture.

I don’t know if it can be done without a texture, as you need to sample the surrounding pixels.
This is how you would use it in draw():

    testShader.setUniformTexture("tex", grabber, 0);  
    testShader.setUniform1f("max_radius", 7.0);  
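For context, here is what the full `draw()` might look like, as a sketch only: it assumes an `ofShader` named `testShader` and an `ofVideoGrabber` named `grabber` (as in the thread), and the usual `begin()`/`end()` bracketing around the draw call.

```cpp
// Hypothetical usage sketch, not from the original post.
void ofApp::draw() {
    testShader.begin();
    testShader.setUniformTexture("tex", grabber.getTexture(), 0);
    testShader.setUniform1f("max_radius", 7.0);
    grabber.draw(0, 0);   // the shader samples "tex" while this draws
    testShader.end();
}
```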

Thanks for posting the shader. I implemented it as a core image filter in Quartz Composer to play around with, if anyone is interested:

Can you please post a sample project (possibly in Xcode)? I’m getting pretty tired of searching for how to get a bokeh effect in my camera project for iOS 7.

Thank you

I believe a number of addons have pulled this code in. ofxFX by @patriciogonzalezvivo is one of them and should have an example you can use (example-filters):

Sorry for reviving an old topic.

How can this code work? Isn’t the gl_TexCoord range [0, 1]? If so, adding vec2(x, y) to it makes the resulting coord always point to the edge of the texture.

In adapting this code for an OpenGL 2.2 app, I had to use:

highp vec2 coord = texcoordVarying + vec2(x,y) * 1.0/(512.0);

instead of

highp vec2 coord = texcoordVarying + vec2(x,y);

for it to look reasonable.

This shader is written assuming that the texture coordinates are not normalized (0…1) but instead represent actual pixel coordinates (0…width and 0…height). OF is set up to use non-power-of-two textures by default and to send non-normalized texture coordinates, which is why it was written this way.

If you want to use normalized texture coordinates, you are correct that you would need to scale the x and y values from the loop to be normalized as well; if your texture is 512×512, then your code would work perfectly. You could also send the texture dimensions in as a uniform vec2 and divide by that if you want to support various texture sizes.
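That last suggestion could be sketched like this (hedged: `texSize` and `texcoordVarying` are illustrative names, not from the original shader, and the sampler becomes a regular `sampler2D` since coordinates are normalized):

```glsl
uniform sampler2D tex;   // normalized coordinates, so sampler2D instead of sampler2DRect
uniform vec2 texSize;    // texture dimensions, e.g. set via setUniform2f("texSize", w, h)

// inside the sampling loop:
vec2 coord = texcoordVarying + vec2(x, y) / texSize;
vec4 texel = texture2D(tex, coord);
```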

Thanks for the thorough explanation, all is clear. And thanks for sharing the shader!