Filter Shader not working :(


I didn’t know where to open this topic, so… well… the question is strange: I’ve been working on this for two days and it’s not working, even though it’s something simple.
I’m using a shader to do a Laplacian filter. The general code is like this:

#version 120

uniform float offsetX;
uniform float offsetY;

varying vec2 Pxy;
uniform sampler2DRect sImage1;


void main(){

  vec2 tc0 = Pxy + vec2(-offsetX, -offsetY);
  vec2 tc1 = Pxy + vec2(     0.0, -offsetY);
  vec2 tc2 = Pxy + vec2(+offsetX, -offsetY);
  vec2 tc3 = Pxy + vec2(-offsetX,      0.0);
  vec2 tc4 = Pxy + vec2(     0.0,      0.0);
  vec2 tc5 = Pxy + vec2(+offsetX,      0.0);
  vec2 tc6 = Pxy + vec2(-offsetX, +offsetY);
  vec2 tc7 = Pxy + vec2(     0.0, +offsetY);
  vec2 tc8 = Pxy + vec2(+offsetX, +offsetY);
  vec3 col0 = texture2DRect(sImage1, tc0).rgb;
  vec3 col1 = texture2DRect(sImage1, tc1).rgb;
  vec3 col2 = texture2DRect(sImage1, tc2).rgb;
  vec3 col3 = texture2DRect(sImage1, tc3).rgb;
  vec3 col4 = texture2DRect(sImage1, tc4).rgb;
  vec3 col5 = texture2DRect(sImage1, tc5).rgb;
  vec3 col6 = texture2DRect(sImage1, tc6).rgb;
  vec3 col7 = texture2DRect(sImage1, tc7).rgb;
  vec3 col8 = texture2DRect(sImage1, tc8).rgb;

  float k0 = -1.0;
  float k1 = -1.0;
  float k2 = -1.0;

  float k3 = -1.0;
  float k4 = 8.0;
  float k5 = -1.0;

  float k6 = -1.0;
  float k7 = -1.0;
  float k8 = -1.0;

  vec3 sum = k0*col0 + k1*col1 + k2*col2 + k3*col3 + k4*col4 + k5*col5 + k6*col6 + k7*col7 + k8*col8;
  gl_FragColor = vec4(sum, 1.0);
}
This is basically based on the Processing shaders tutorial. I only get black images/frames. I’m not sure what is happening with this shader… :frowning:

Hey @sgbzona - can you post your vertex shader and ofApp code?

Hi, @mikewesthad , This is the Vertex Shader:

#version 120
varying vec2 Pxy;
void main(){
	Pxy = gl_MultiTexCoord0.xy;
	gl_Position = ftransform();
}
For the Calling of the shader I do:


void testApp::doShaderLaplace(ofTexture &PI1, float newCamH, float newCamW){
	shaderLap.begin();
	shaderLap.setUniform1f("offsetX", newCamW);
	shaderLap.setUniform1f("offsetY", newCamH);
	shaderLap.setUniformTexture("sImage1", PI1, 1); // texture
	outPutImage.draw(0, 0);
	shaderLap.end();
}

and then on the draw(), vidG1 is the texture coming from the camera:

float ch = 1.0/(float)camHeight;
float cw = 1.0/(float)camWidth;

doShaderLaplace(vidG1, ch, cw);

@sgbzona, so when I tried to run your shader, I ran into a bunch of warnings and errors. I’d suggest taking a look at the openFrameworks shaders tutorial if you haven’t already.

Make sure your main looks something like this:

#include "ofMain.h"
#include "ofApp.h"
// import the fancy new renderer
#include "ofGLProgrammableRenderer.h"
int main( ){
    // say that we're going to *use* the fancy new renderer
    ofSetCurrentRenderer(ofGLProgrammableRenderer::TYPE);
    ofSetupOpenGL(1024,768, OF_WINDOW);         // <-------- setup the GL context
    ofRunApp(new ofApp());
}

Since openFrameworks uses openGL 3.2, I like to use the corresponding GLSL version, which is 1.50 (#version 150). Some of the things you use were deprecated by GLSL 1.50, so these would be the updated versions. I added some comments to note what is deprecated and which variables are provided by openFrameworks. Try making these changes incrementally in order to isolate exactly what was causing your errors. Hope this helps.

Vertex shader:

#version 150

uniform mat4 modelViewProjectionMatrix; // This is provided by openFrameworks
in vec2 texcoord; // This is provided by openFrameworks
in vec4 position; // "attribute" is deprecated; use "in"
out vec2 vertexTexCoord; // "varying" is deprecated; use "out"

void main(){
    vertexTexCoord = texcoord;
    gl_Position = modelViewProjectionMatrix * position; // ftransform() is deprecated
}

Fragment Shader:

#version 150

uniform float offsetX;
uniform float offsetY;

in vec2 vertexTexCoord;

uniform sampler2DRect tex0; // Provided by openFrameworks when we bind the first texture

out vec4 outputColor;


void main(){

  vec2 tc0 = vertexTexCoord + vec2(-offsetX, -offsetY);
  vec2 tc1 = vertexTexCoord + vec2(     0.0, -offsetY);
  vec2 tc2 = vertexTexCoord + vec2(+offsetX, -offsetY);
  vec2 tc3 = vertexTexCoord + vec2(-offsetX,      0.0);
  vec2 tc4 = vertexTexCoord + vec2(     0.0,      0.0);
  vec2 tc5 = vertexTexCoord + vec2(+offsetX,      0.0);
  vec2 tc6 = vertexTexCoord + vec2(-offsetX, +offsetY);
  vec2 tc7 = vertexTexCoord + vec2(     0.0, +offsetY);
  vec2 tc8 = vertexTexCoord + vec2(+offsetX, +offsetY);

  vec3 col0 = texture(tex0, tc0).rgb; // Used texture instead of texture2DRect
  vec3 col1 = texture(tex0, tc1).rgb;
  vec3 col2 = texture(tex0, tc2).rgb;
  vec3 col3 = texture(tex0, tc3).rgb;
  vec3 col4 = texture(tex0, tc4).rgb;
  vec3 col5 = texture(tex0, tc5).rgb;
  vec3 col6 = texture(tex0, tc6).rgb;
  vec3 col7 = texture(tex0, tc7).rgb;
  vec3 col8 = texture(tex0, tc8).rgb;

  float k0 = -1.0;
  float k1 = -1.0;
  float k2 = -1.0;

  float k3 = -1.0;
  float k4 = 8.0;
  float k5 = -1.0;

  float k6 = -1.0;
  float k7 = -1.0;
  float k8 = -1.0;

  vec3 sum = k0*col0 + k1*col1 + k2*col2 + k3*col3 + k4*col4 + k5*col5 + k6*col6 + k7*col7 + k8*col8;

  outputColor = vec4(sum, 1.0); // gl_FragColor is deprecated
}

Here’s how I was using it in testApp:

ofShader shaderLap;
ofImage img;
ofPlanePrimitive plane;

void testApp::setup(){
    shaderLap.load("shader"); // loads "shader.vert" / "shader.frag" (names illustrative)
    img.resize(ofGetWidth(), ofGetHeight());
    // You don't have to do it this way, but this was 
    // the code I had lying around.  I'm creating a plane
    // and then mapping its texture coordinates to the image
    int planeWidth = img.width;
    int planeHeight = img.height;
    int planeGridSize = 20;
    int planeColumns = planeWidth / planeGridSize;
    int planeRows = planeHeight / planeGridSize;
    plane.set(planeWidth, planeHeight, planeColumns, planeRows, OF_PRIMITIVE_TRIANGLES);
    plane.mapTexCoordsFromTexture(img.getTextureReference());
}

void testApp::draw(){
    shaderLap.begin();
    shaderLap.setUniform1f("offsetX", ofMap(mouseX, 0, ofGetWidth(), 1, 10));
    shaderLap.setUniform1f("offsetY", ofMap(mouseX, 0, ofGetWidth(), 1, 10));
    shaderLap.setUniformTexture("tex0", img.getTextureReference(), 0);
    ofTranslate(ofGetWidth()/2, ofGetHeight()/2);
    plane.draw();
    shaderLap.end();
}

Hi, this looks interesting, but it’s not working… When I add the includes for the fancy renderer, I get the following error:

2014-04-13 09:01:08.594 emptyExampleDebug[18935:a0f] invalid pixel format attribute
[ error ] ofAppGLFWWindow: 65544: NSGL: Failed to create OpenGL pixel format
[ error ] ofAppGLFWWindow: couldn't create GLFW window
[ error ] ofAppGLFWWindow: couldn't create window

Ah, okay. So you weren’t using the programmable renderer before. Good to know. You need to use it for your shaders to work.

The error you are getting might suggest that your graphics card doesn’t support openGL 3.2. Can you download the openGl Extensions Viewer, run it and tell me what openGL version it shows your card supports? You’ll get a screen that looks something like this:

Other questions that might be relevant: what platform are you on, is your graphics card’s software up-to-date, and does your computer have a dual graphics card setup (one integrated card and one dedicated card)?

Hello, well, I’m on an old MacBook Pro from 2010.

This is what the OpenGL Extensions Viewer said:

At the moment I can’t update the Mac to a newer OS.
I was planning to do this filter in #version 120… As far as I can see that’s possible, because I have tested it in WebGL, but it doesn’t work in OF… I’m a bit confused… :frowning:

Okay, so as far as I understand your setup:

  • You have a single nvidia card in your computer. I asked about this because your issue could have had to do with apple’s switching algorithm for controlling which graphics card to use: dedicated or integrated. The integrated card (at least on windows) usually has less openGL support than the dedicated one.
  • You have OSX 10.6.8, which only has support for openGL 2.1. openGL updates on mac are tied to software updates (I didn’t know that before), so you are stuck using openGL 2.1 at the moment.

You are right to be confused. Your computer can run shaders written for openGL 2.1 and lower. But openFrameworks has a series of openGL commands that fail on versions lower than 3.2 (though they run on 3.2 and above). That’s the heart of the problem. You can try digging around in the source to alter those commands in order to get your shader working. Have a look at this thread where I was trying to do just that - though I wasn’t successful.

EDIT: What I probably should have said was that openFrameworks uses GLFW for handling windowing, and those GLFW commands are what are breaking for you (and what broke for me in that thread). You can try to change those GLFW commands to allow you to target openGL 2.0. Still, I don’t know enough about openGL 2.0 vs 3.0 to tell you whether or not you will also have to change some of the openGL commands that openFrameworks uses to set up shaders.
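For what it’s worth, the GLFW change in question is small. Inside openFrameworks’ window setup (ofAppGLFWWindow), the context-version hints could be lowered to request a 2.1 context. This is a sketch under that assumption, not a drop-in patch; the exact file and line vary by OF version:

```cpp
// Hypothetical edit inside ofAppGLFWWindow's setup code:
// request an openGL 2.1 context instead of 3.2.
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 2);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 1);
// The 3.2-only hints would also have to be removed,
// since they are invalid for a 2.1 context:
// glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
// glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
```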

Hello, I have tested this shader on a Windows machine using the same #version 120 as in the original example, and it does not work. :frowning:

I don’t get it; I think there is a problem with OF 0.8.0, where just a black image shows up. I tried the same with OF 0.8.1 and now there is another error:

  "_BeginUpdate", referenced from:
      SeqGrabberModalFilterUPP(OpaqueDialogPtr*, EventRecord const*, short*, long)in openFrameworksDebug.a(ofQtUtils.o)

  "_EndUpdate", referenced from:
      SeqGrabberModalFilterUPP(OpaqueDialogPtr*, EventRecord const*, short*, long)in openFrameworksDebug.a(ofQtUtils.o)

I have tried this code as a WebGL shader, where it works well… Any ideas?

That’s not very much information to go on from my end. Was an error printed out in the console? Were you using the ofGLProgrammableRenderer line?

My guess would be that the windows machine you tested on did not have support for openGL 3.2. Like I said, there isn’t necessarily anything wrong with your shader; it’s that openFrameworks commands will fail if you don’t have support for openGL 3.2.

If there is an error in your openFrameworks code, it would be in your doShaderLaplace(...) method. Maybe your texture ID should be 0. Maybe your UV coordinates aren’t set up properly. Take a look at the openGL 2.0 shaders from the openFrameworks tutorials (link to the blur shader code).

Hello @mikewesthad, thanks for your help. :smile: Now the shader is working using #version 120. The blur shader gave me some ideas for solving it.

But… now I have another issue, when I tried to run this on OF 0.8.1 there are these errors:

  "_BeginUpdate", referenced from:
      SeqGrabberModalFilterUPP(OpaqueDialogPtr*, EventRecord const*, short*, long)in openFrameworksDebug.a(ofQtUtils.o)

  "_EndUpdate", referenced from:
      SeqGrabberModalFilterUPP(OpaqueDialogPtr*, EventRecord const*, short*, long)in openFrameworksDebug.a(ofQtUtils.o)
ld: symbol(s) not found

There is a problem with QT, for OSX 10.6.8. I Tried to run the videoGrabber example, and it does not work, the same two errors… any idea?

Just to reiterate: you are still not going to be able to run the openGL 2.0 shader without modifying the openFrameworks source. Your original code was breaking on GLFW commands. GLFW couldn’t provide a graphical window for the openFrameworks app, so it exited. (If you aren’t familiar with what GLFW is, check out this FAQ.) You are going to run into the same issue in 0.8.1 (as far as I know).

The windows machine could have failed on GLFW as well, if it didn’t have openGL 3.2 support. If it did have openGL 3.2 support, then it could have also failed because of errors in your openFrameworks code. That’s why I linked the tutorials.

As for your QT error, I would start a thread and give detailed information on when it comes up.

hello @sgbzona, have you been able to solve this? i’m running into the same error, also on 10.6.8.

nevermind. i solved it by adding the Carbon framework to “of_v0.8.1_osx_release//libs/openFrameworksCompiled/project/osx/”



hmmm… ok… I did that, but I had to manually add Carbon.framework under System Frameworks.

now it works… :wink: