Lighting with shaders

I’ve made a simple displacement map shader, and now I’m looking to shade it. From what I can tell, I can’t use the typical ofEasyCam and ofLight combination without actually calculating the shading myself.

I would imagine I can at least use the position of the ofLight, as well as the position of the ofEasyCam, and calculate my lighting using these two. This comes with a few new questions for me though:

  1. Should I be writing the lighting calculations in my displacement map shader, or should I nest them? I’ve never tried nesting shaders, so forgive me if that is misinformed. For reference, this is how I currently bind the displacement map:

            shader_displacement.setUniformTexture("displacementMap", video.getTexture(), 1);

  2. How do I retrieve the relevant normals and positions from the camera and from the light?

I’ve been reading this OpenGL shading tutorial, but between GLM and the differences in built-in uniform variables, I’ve become overwhelmed trying to figure out what is right in oF.

ofEasyCam and ofLight both inherit from ofNode, so you can get the position of either with getGlobalPosition().

Once you have them, you can pass these two vectors to your shader as uniforms:

shader.setUniform3f("uLightPosition", lightPos);
shader.setUniform3f("uCameraPosition", cameraPos);

then in your shaders you do the calculation. The basic shading tutorial you linked is the right place to look. OF already defines some uniforms for you, for example uniform mat4 modelViewProjectionMatrix;

also you can’t nest shaders at all, there can only be one shader active at any time. If you are using OF from master or the nightly builds, you have the option to “inject” some custom uniforms and code after the lighting and material have been calculated, using the new setup method, which is extensively documented in the header:

Thank you, maybe this is my chance to start using OF from master. @arturo, by that did you mean you can’t nest custom shaders? In which case, I will probably just have to build a new function to re-calculate normals, and add that to my one shader, first displacing vertices on the plane, then calculating the updated normals, and then applying lighting to it.

Also, @edapx, where should I be looking to find the uniforms that OF defines? I’ve wondered that for a while; it seems like everywhere I use GLSL there are always some pre-defined uniforms, but nowhere to find a list.

I do not know if there is a place that lists all the variables that are available to a custom shader. Maybe this is a good place to look, but I think @arturo has better suggestions about it.

yes there’s not much documentation about that apart from the examples but it’s:

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform mat4 textureMatrix;
uniform vec4 globalColor;

I’m still having some misunderstandings with it all. I’ve reduced my goal at this point to just getting an ofSpherePrimitive to display properly using my own shader. This shader is absolutely bare: the fragment shader just sets the output color to globalColor, and the vertex shader just multiplies the position by the modelViewProjectionMatrix.

// vertex shader
#version 150

uniform mat4 modelViewProjectionMatrix;

in vec4 position;
in vec2 texcoord;

out vec2 texCoordVarying;

void main(){
    texCoordVarying = texcoord;
    vec4 pos = position * modelViewProjectionMatrix;

    gl_Position = pos;
}

// fragment shader
#version 150

uniform vec4 globalColor;

in vec2 texCoordVarying;

out vec4 outputColor;

void main(){
    vec2 uv = texCoordVarying;
    outputColor = globalColor;
}

Now, I know a lot of my issues come from inexperience handling 3D within GLSL; I haven’t run into this many problems when just doing 2D texture effects. Right now, in a bare project, I just have a sphere drawn between shader.begin() and shader.end(), and it is currently displaying as a plane, in the color of my ofSetColor().

Am I misunderstanding the interaction between OF’s 3D objects, like ofSpherePrimitive, and custom shaders? Past this, I feel like I’m actually getting a solid understanding of the logic of the lighting, but I’m starting to think my confusion has more to do with openFrameworks and its interaction with custom shaders. If I had to guess, I think I’m overestimating how much objects in OF can interact with custom shaders. I’m not telling my vertex shader that it needs to create a sphere; I’m just making an inappropriate assumption that it will take a sphere as input, because I was always able to make that assumption when it came to 2D textures.

This is basically how I’m thinking right now:

  1. First, I want to understand why the ofSpherePrimitive isn’t simply displaying.
  2. Then, I want to return to my original plan of using a displacement map on an ofPlanePrimitive, which in and of itself works just fine.
  3. Calculate normals.
  4. Calculate and apply lighting.

Matrix multiplication is not commutative. Where you have:

vec4 pos = position * modelViewProjectionMatrix;

Should be:

vec4 pos = modelViewProjectionMatrix * position;

yeah that’s the problem. This is somewhat confusing in OF, where the matrix multiplication order is reversed compared to GLSL. With the next release we will migrate OF to glm, a math library with a syntax much closer to GLSL, so both OF and GLSL will use the same multiplication order.

Thank you two, I’ve gotten a lot further in my learning over these last few days. I have a few new questions now; I am constantly getting this:

Which, from what I can tell, might have more to do with the calculation of the vertex normal and how I’m multiplying it with the modelViewMatrix:

#version 150

uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 normalMatrix;
uniform mat4 viewMatrix;
uniform vec3 lightPosition_worldspace;

in vec4 position;
in vec2 texcoord;
in vec3 normal;

out vec2 texCoordVarying;
out vec3 Normal_cameraspace;
out vec3 LightDirection_cameraspace;
out mat3 m3;

void main(){
    gl_Position =  modelViewProjectionMatrix * position;
    vec4 Position_worldspace = modelViewMatrix * position;
    Position_worldspace = inverse(viewMatrix) * Position_worldspace;
    vec3 vertexPosition_cameraspace = ( modelViewMatrix * position).xyz;
    vec3 EyeDirection_cameraspace = vec3(0,0,0) - vertexPosition_cameraspace;

//    vec3 LightPosition_cameraspace = ( V * vec4(LightPosition_worldspace,1)).xyz;
    //w = 0 is the only way it seems to be directional
        vec4 l_dir = vec4(.6131,.511,.17,0.0);
        LightDirection_cameraspace = (modelViewMatrix * l_dir).xyz;
    //LightDirection_cameraspace = LightPosition_cameraspace + EyeDirection_cameraspace;

    Normal_cameraspace = ( inverse(transpose(modelViewMatrix)) * vec4(normal,0)).xyz;
}

Do you have any insight on how I might be approaching this incorrectly? Culling front faces appears to work as a temporary solution, but I still can’t tell if it’s indicative of something incorrect in my vertex shader.

ofxShaderFx has some good working examples of lighting shaders; try taking a look at its classes.

I also recently started a collection of shaders for OpenGL 3.2+ and openFrameworks. It gathers different addons and examples available on the web. Take a look at it and see if it can help:


hey @lzmmrman, thanks for the links! very useful.

Do you know if some of this can be used with iOS?

I don’t know, I haven’t used OF on iOS in a while. If shader functionality works on iOS (I assume it does), it should work.