Light using GLSL 150

Hi, I’m studying shaders and I have a little problem with lighting.
I’m trying to follow this tutorial for my shader:

http://www.lighthouse3d.com/tutorials/glsl-core-tutorial/directional-lights/

But I don’t know how to get the normal matrix. I tried looking inside the ofShader code, but I didn’t find anything for a normalMatrix. How can I get this?
My shaders are these:

vertex:

#version 150

uniform mat4 modelViewProjectionMatrix;
uniform mat3 m_normal;

in vec4 position;
in vec3 normal;
in vec4 color;
out vec4 colorOut;

vec3 l_dir = vec3(.6131,.511,.17);

void main(){
	mat3 m3 = mat3( modelViewProjectionMatrix[0].xyz, modelViewProjectionMatrix[1].xyz, modelViewProjectionMatrix[2].xyz);
	vec3 n = normalize(m3 * normal); // normalize(m_normal * normal);
	float intensity = max(dot(n, l_dir), 0.0);
	colorOut = vec4(1,1,0,1) * intensity;
	gl_Position = modelViewProjectionMatrix * position;
}

fragment:

#version 150

out vec4 outputColor;
in  vec4 colorOut;

void main()
{
	outputColor = colorOut;
}

If I try it with a sphere, I see a strange effect; sometimes it seems right, sometimes not:

Any idea? Thanks!

Well, first you have to decide in which space you want to do the lighting calculations. I am going to assume camera space. In that case the normal matrix is the inverse transpose of the modelViewMatrix.
So your shader code should look like this:

mat3 m3 = mat3(modelViewMatrix);
m3 = transpose(inverse(m3));
vec3 n = normalize(m3 * normal);
float intensity = max(dot(n, l_dir), 0.0);
colorOut = vec4(1,1,0,1) * intensity;
gl_Position = modelViewProjectionMatrix * position;

Thanks Ahbee. I tried it, but I still have the same problem, and now my frame rate has also gone down from 59 to 9. :-(
About the frame rate… maybe it’s because the transpose and inverse calculations are too heavy?
Maybe I could pass the m3 matrix as a uniform, so I tried to get the modelView matrix like this:

ofMatrixStack matrixStack(*ofGetWindowPtr());
ofMatrix4x4 modelViewMatrix = matrixStack.getModelViewMatrix();

but I don’t know how to convert it to an ofMatrix3x3. Inside ofMatrix4x4 I have seen this method:

ofVec3f ofMatrix4x4::transform3x3(const ofVec3f& v, const ofMatrix4x4& m)

but anyway, when I get my new ofMatrix3x3 using this method, I don’t know how to transpose and invert that matrix.
(maybe I’m a little confused about matrices :-))

Can I see your shader code, as well as your draw method? Yes, you should calculate and pass the normal matrix on the CPU side, but it should not drop the frame rate by that much. You can upload code by using a three-backtick block.

I cleaned out all the unnecessary code for this test, so this is my testApp, with the same problems:

#include "testApp.h"

//--------------------------------------------------------------
void testApp::setup(){
  ofSetFrameRate(60);
  ofSetVerticalSync(true);
  ofBackground(0);
  ofEnableAlphaBlending();
  ofSetSmoothLighting(true);
  initSphere();
  shader.load( "shaders/shaderVert.c", "shaders/shaderFrag.c");
}


void testApp::update()
{
}

void testApp::initSphere()
{
  sphere.set(100, 100);
  ofSetSphereResolution(24);
}

//--------------------------------------------------------------
void testApp::draw(){
  cam.begin();
  ofSetColor(255);
  shader.begin();
  ofEnableDepthTest();
  ofEnableLighting();
  sphere.draw();
  ofDisableDepthTest();
  ofDisableLighting();
  shader.end();
  cam.end();
  ofDrawBitmapString(ofToString(ofGetFrameRate()), ofPoint(ofGetWindowWidth() - 70, 20));
}

vertex:

#version 150

uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;

in vec4 position;
in vec3 normal;
in vec4 color;
out vec4 colorOut;

vec3 l_dir = vec3(.6131,.511,.17);

void main(){
	mat3 m3 = mat3(modelViewMatrix);
	m3 = transpose(inverse(m3));
	vec3 n = normalize(m3 * normal);
	float intensity = max(dot(n, l_dir), 0.0);
	colorOut = vec4(1,1,0,1) * intensity;
	gl_Position = modelViewProjectionMatrix * position;
}

and fragment:

#version 150

out vec4 outputColor;
in  vec4 colorOut;

void main()
{
	outputColor = colorOut;
}

You have alpha blending on; either disable it or in your shader set colorOut.w = 1.0.
And about calculating the normal matrix on the CPU side: I submitted a GitHub issue here. So for now you are going to have to write the extra functions yourself.

Oh yeah, you don’t need ofEnableLighting(), ofDisableLighting(), or ofSetSmoothLighting(true);
if you are using a custom shader, those lines don’t do anything.

Yep, you’re right, I don’t need the oF lighting options.
Anyway, now I added colorOut.w = 1, passed the normalMatrix in a somewhat rough way, and it seems to work. Here is the complete code:

void testApp::setup(){
  ofSetFrameRate(60);
  ofSetVerticalSync(true);
  ofBackground(0);
  ofEnableAlphaBlending();
  initSphere();
  shader.load( "shaders/shaderVert.c", "shaders/shaderFrag.c");
}

void testApp::update()
{
  shader.load( "shaders/shaderVert.c", "shaders/shaderFrag.c");
}

void testApp::initSphere()
{
  sphere.set(100, 100);
  ofSetSphereResolution(24);
}

ofMatrix3x3 testApp::mat4ToMat3(ofMatrix4x4 mat4)
{
  return ofMatrix3x3(mat4._mat[0][0], mat4._mat[0][1], mat4._mat[0][2], mat4._mat[1][0], mat4._mat[1][1], mat4._mat[1][2], mat4._mat[2][0], mat4._mat[2][1], mat4._mat[2][2]);
}

//--------------------------------------------------------------
void testApp::draw(){
  cam.begin();
  ofSetColor(255);
  shader.begin();
  
  ofMatrixStack matrixStack(*ofGetWindowPtr());
  ofMatrix4x4 modelViewMatrix = matrixStack.getModelViewMatrix();
  
  ofMatrix3x3 newMatrix3x3 = mat4ToMat3(modelViewMatrix);
  
  newMatrix3x3.invert();
  newMatrix3x3.transpose();

  /**
   * [ a b c ]
   * [ d e f ]
   * [ g h i ]
   */
  
  shader.setUniform3f("firstRow", newMatrix3x3.a, newMatrix3x3.b, newMatrix3x3.c);
  shader.setUniform3f("secondRow", newMatrix3x3.d, newMatrix3x3.e, newMatrix3x3.f);
  shader.setUniform3f("thirdRow", newMatrix3x3.g, newMatrix3x3.h, newMatrix3x3.i);
  
  ofEnableDepthTest();

  sphere.draw();
  ofDisableDepthTest();
  shader.end();
  cam.end();
  ofDrawBitmapString(ofToString(ofGetFrameRate()), ofPoint(ofGetWindowWidth() - 70, 20));
}

vertex:

#version 150

uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;

in vec4 position;
in vec3 normal;
in vec4 color;
out vec4 colorOut;

uniform vec3 firstRow;
uniform vec3 secondRow;
uniform vec3 thirdRow;

vec4 diffuse;
vec3 l_dir = vec3(.131,.1511,.1617);

void main(){
	mat3 m3;
	m3[0][0] = firstRow.x;
	m3[0][1] = firstRow.y;
	m3[0][2] = firstRow.z;

	m3[1][0] = secondRow.x;
	m3[1][1] = secondRow.y;
	m3[1][2] = secondRow.z;

	m3[2][0] = thirdRow.x;
	m3[2][1] = thirdRow.y;
	m3[2][2] = thirdRow.z;
	vec3 n = normalize(m3 * normal);
	float intensity = max(dot(n, l_dir), 0.0);
	
	diffuse = vec4(.8, .8, .8,0);
	colorOut = vec4(1,0,0,1) * intensity * diffuse;
	colorOut.w = 1;
	gl_Position = modelViewProjectionMatrix * position;
}

fragment:

#version 150

out vec4 outputColor;
in  vec4 colorOut;

void main()
{
	outputColor = colorOut;
	outputColor.w = 1.0; // set alpha last, or the assignment above overwrites it
}

But I don’t understand why I need to set colorOut.w = 1;

A little off topic: if I now move the vertices in the shader, the lighting will probably be wrong, because it is calculated from the original normals of the mesh; so do I need to calculate new normals inside the vertex shader?

What you need to do is pass the camera-space light position into the shader, then calculate the direction. Remember, the openFrameworks matrix stack is row major, meaning if you do this:

ofScale(2,3);
ofRotate(45);
ofTranslate(34,2);

the vertex is multiplied on the left, so the final position is calculated by doing v*T*R*S.
But in GLSL the matrices are column major, so the final position is calculated by doing S*R*T*v.
Here is an example where I pass the camera-space light position to the shader:

#include "testApp.h"

ofShader shader;
ofEasyCam cam;
ofSpherePrimitive sphere;
ofNode light;

//--------------------------------------------------------------
void testApp::setup(){
    ofEnableDepthTest();
    shader.load("shader");
    sphere.set(100, 100);
    sphere.setResolution(24);
    light.setPosition(250,0,0);
}

//--------------------------------------------------------------
void testApp::update(){
    
}

//--------------------------------------------------------------
void testApp::draw(){
    ofBackground(0);
    cam.begin();
    ofDrawAxis(1000);
    shader.begin();
    ofVec3f cameraSpaceLightPos = light.getPosition() * cam.getModelViewMatrix();
    shader.setUniform3f("cameraSpaceLightPos", cameraSpaceLightPos.x, cameraSpaceLightPos.y, cameraSpaceLightPos.z);
    sphere.draw();
    shader.end();
    cam.end();
    

}

Here is the vertex shader:

#version 150

uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;
uniform vec3 cameraSpaceLightPos;

in vec4 position;
in vec3 normal;

out vec4 colorOut;

const vec4 diffuseColor = vec4(1,0,1,1);

void main(){
	mat3 normalMat = transpose(inverse(mat3(modelViewMatrix)));
	vec3 vertexNormal = normalize(normalMat * normal);
	vec3 cameraSpaceVertexPos = vec3(modelViewMatrix * position);
	vec3 lightDir = normalize(cameraSpaceLightPos - cameraSpaceVertexPos);
	float intensity = max(dot(vertexNormal,lightDir), 0.0);
	colorOut = diffuseColor * intensity;
	colorOut.w = 1.0;

	gl_Position = modelViewProjectionMatrix * position;
}

Hey, I tried your solution and it finally worked! I continued working with lighting, but now I’m taking a step back because I noticed a problem in this basic example.
Here is my new code, which is very similar to yours:

#include "testApp.h"

void testApp::setup()
{
  ofSetVerticalSync(true);
  ofSetFrameRate(60);
  ofBackground(10, 10, 10);
  ofEnableDepthTest();
  
  shader.load( "shader.vert", "shader.frag");
  
  plane.set(100, 100, 10, 10);
  cube.set(10, 10, 10);
  sphere.set(10, 100);
  
  sphere.enableNormals();
  cube.enableNormals();
  plane.enableNormals();
  light.setPosition(0, -100, 0);
}

void testApp::update()
{
  updateNormalMatrix();
  shader.load( "shader.vert", "shader.frag");
}

void testApp::draw()
{
  glEnable(GL_DEPTH_TEST);
  cam.begin();
  shader.begin();
  ofVec3f cameraSpaceLightPos = light.getPosition() * cam.getModelViewMatrix();
  shader.setUniform3f("cameraSpaceLightPos", cameraSpaceLightPos.x, cameraSpaceLightPos.y, cameraSpaceLightPos.z);
  shader.setUniform3f("firstRow", normalMatrix.a, normalMatrix.b, normalMatrix.c);
  shader.setUniform3f("secondRow", normalMatrix.d, normalMatrix.e, normalMatrix.f);
  shader.setUniform3f("thirdRow", normalMatrix.g, normalMatrix.h, normalMatrix.i);
  plane.draw();
  ofPushMatrix();
  ofTranslate(0,-20,10);
  sphere.draw();
  ofPopMatrix();
  ofPushMatrix();
  ofTranslate(0,0,10);
  ofRotateZ(ofGetElapsedTimeMicros() * .00004);
  cube.draw();
  ofPopMatrix();
  shader.end();
  light.draw();
  cam.end();
  glDisable(GL_DEPTH_TEST);
  ofDrawBitmapString(ofToString(ofGetFrameRate()), ofPoint(10,10));
}

ofMatrix3x3 testApp::mat4ToMat3(ofMatrix4x4 mat4)
{
  return ofMatrix3x3(mat4._mat[0][0], mat4._mat[0][1], mat4._mat[0][2], mat4._mat[1][0], mat4._mat[1][1], mat4._mat[1][2], mat4._mat[2][0], mat4._mat[2][1], mat4._mat[2][2]);
}

void testApp::updateNormalMatrix()
{
  ofMatrixStack matrixStack(*ofGetWindowPtr());
  ofMatrix4x4 modelViewMatrix = matrixStack.getModelViewMatrix();
  normalMatrix = mat4ToMat3(modelViewMatrix);
  normalMatrix.transpose();
  normalMatrix.invert();
}

vertex shader:

#version 150

uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;
uniform vec3 cameraSpaceLightPos;
uniform vec3 firstRow;
uniform vec3 secondRow;
uniform vec3 thirdRow;

in vec4 position;
in vec2 texcoord;
in vec3 normal;

const vec4 diffuseColor = vec4(1,0,1,1);
out vec4 colorOut;

void main(){
  mat3 normalMatrix;
  
  normalMatrix[0][0] = firstRow.x;
  normalMatrix[0][1] = firstRow.y;
  normalMatrix[0][2] = firstRow.z;

  normalMatrix[1][0] = secondRow.x;
  normalMatrix[1][1] = secondRow.y;
  normalMatrix[1][2] = secondRow.z;

  normalMatrix[2][0] = thirdRow.x;
  normalMatrix[2][1] = thirdRow.y;
  normalMatrix[2][2] = thirdRow.z;

  vec3 vertexNormal = normalize(normalMatrix * normal);
  vec3 cameraSpaceVertexPos = vec3(modelViewMatrix * position);
  vec3 lightDir = normalize(cameraSpaceLightPos - cameraSpaceVertexPos);
  float intensity = max(dot(vertexNormal,lightDir), 0.0);
  colorOut = diffuseColor * intensity;
  colorOut.w = 1.0;

  gl_Position = modelViewProjectionMatrix * position;
}

Now, I have two doubts/problems.
First, when I launch the app I can see some black areas, which I think is normal. But I don’t think it’s normal that when I move the camera I can see lit areas everywhere; I mean, there are no areas the light doesn’t reach. Isn’t that strange?
The second problem is that if I rotate the cube, I always see the same lighting on its faces, even when a face is on the opposite side from the light.
Here is a video:

www.mauroferrario.com/file/simpleLight.mov

From looking at the code:
1. To calculate the normal matrix, it should be inverted first, then transposed.
2. Why are you loading the shader in update?
3. Don’t calculate the normal matrix in update; it should happen right before you draw.
4. You have to recalculate the normal matrix before you draw each object.
5. The cube is supposed to be lit that way based on this algorithm. Take a look at shadow mapping.

Replace your shader with mine and tell me if you still get the problem.

Thanks Ahbee, I’ll answer point by point:

  1. OK, I fixed the problem in my code;
  2. I’m loading it in update just because I want to reload the shader every time I change it. I know it’s not the best, but it’s just for now;
  3. OK, I moved it to the first line of my draw method;
  4. I use the function that I write below later;
  5. I read that part, but I have to read it again and again.

This is the function I use for the normals, because I didn’t find any function to update normals in of3dPrimitive:

void testApp::setNormals( ofMesh &mesh )
{
  
	//The number of the vertices
	int nV = mesh.getNumVertices();
	
	//The number of the triangles
	int nT = mesh.getNumIndices() / 3;
  
	vector<ofPoint> norm( nV ); //Array for the normals
  
	//Scan all the triangles. For each triangle add its
	//normal to norm's vectors of triangle's vertices
	for (int t=0; t<nT; t++) {
    
		//Get indices of the triangle t
		int i1 = mesh.getIndex( 3 * t );
		int i2 = mesh.getIndex( 3 * t + 1 );
		int i3 = mesh.getIndex( 3 * t + 2 );
		
		//Get vertices of the triangle
		const ofPoint &v1 = mesh.getVertex( i1 );
		const ofPoint &v2 = mesh.getVertex( i2 );
		const ofPoint &v3 = mesh.getVertex( i3 );
		
		//Compute the triangle's normal
		ofPoint dir = ( (v2 - v1).crossed( v3 - v1 ) ).normalized();
		
		//Accumulate it to norm array for i1, i2, i3
		norm[ i1 ] += dir;
		norm[ i2 ] += dir;
		norm[ i3 ] += dir;
	}
  
	//Normalize the normal's length
	for (int i=0; i<nV; i++) {
		norm[i].normalize();
	}
  
	//Set the normals to mesh
	mesh.clearNormals();
	mesh.addNormals( norm );
}

Now I tried with your code and it works, even if I don’t call setNormals before drawing. :-)
I remember that when I started this topic, I couldn’t use your code because my frame rate went really slow using inverse and transpose in my shader, but now it works normally. Still, when I tried again on my laptop, the frame rate went down to 20. On my laptop I have an “Intel HD Graphics 3000” and on my iMac an “ATI Radeon HD 4850”. Could that be the main problem?
Anyway, using your approach, I noticed that my big problem is in my normalMat. Even with the change you suggested in point one, it still works the wrong way, and I have no idea why.

You have to recalculate the normal matrix every time you draw an object, not the normals themselves:

void draw(){
// calculate normal matrix
// upload normal Matrix to shader
// draw plane
// do some transformations
// calculate normal matrix
// upload normal Matrix to shader
// draw sphere

}

I did, but I still have the same problems:

  updateNormalMatrix();
  glEnable(GL_DEPTH_TEST);
  cam.begin();
  shader.begin();
  
  ofVec3f cameraSpaceLightPos = light.getPosition() * cam.getModelViewMatrix();
  shader.setUniform3f("cameraSpaceLightPos", cameraSpaceLightPos.x, cameraSpaceLightPos.y, cameraSpaceLightPos.z);
  shader.setUniform3f("firstRow", normalMatrix.a, normalMatrix.b, normalMatrix.c);
  shader.setUniform3f("secondRow", normalMatrix.d, normalMatrix.e, normalMatrix.f);
  shader.setUniform3f("thirdRow", normalMatrix.g, normalMatrix.h, normalMatrix.i);
  
  plane.draw();
  
  ofPushMatrix();
  ofTranslate(0,-20,10);
  sphere.draw();
  ofPopMatrix();
  ofPushMatrix();
  ofTranslate(0,0,10);
  ofRotateZ(ofGetElapsedTimeMicros() * .00004);
  updateNormalMatrix();
  shader.setUniform3f("firstRow", normalMatrix.a, normalMatrix.b, normalMatrix.c);
  shader.setUniform3f("secondRow", normalMatrix.d, normalMatrix.e, normalMatrix.f);
  shader.setUniform3f("thirdRow", normalMatrix.g, normalMatrix.h, normalMatrix.i);
  cube.draw();
  ofPopMatrix();
  shader.end();
  light.draw();
  cam.end();
  glDisable(GL_DEPTH_TEST);

You are not updating before drawing the plane or the sphere. It should be something like this:

  glEnable(GL_DEPTH_TEST);
  cam.begin();
  shader.begin();

  ofVec3f cameraSpaceLightPos = light.getPosition() * cam.getModelViewMatrix();
  shader.setUniform3f("cameraSpaceLightPos", cameraSpaceLightPos.x, cameraSpaceLightPos.y, cameraSpaceLightPos.z);
  updateNormalMatrix();
  shader.setUniform3f("firstRow", normalMatrix.a, normalMatrix.b, normalMatrix.c);
  shader.setUniform3f("secondRow", normalMatrix.d, normalMatrix.e, normalMatrix.f);
  shader.setUniform3f("thirdRow", normalMatrix.g, normalMatrix.h, normalMatrix.i);
  plane.draw();
  ofPushMatrix();
  ofTranslate(0,-20,10);
  updateNormalMatrix();
  shader.setUniform3f("firstRow", normalMatrix.a, normalMatrix.b, normalMatrix.c);
  shader.setUniform3f("secondRow", normalMatrix.d, normalMatrix.e, normalMatrix.f);
  shader.setUniform3f("thirdRow", normalMatrix.g, normalMatrix.h, normalMatrix.i);
  sphere.draw();
  ofPopMatrix();
  ofPushMatrix();
  ofTranslate(0,0,10);
  ofRotateZ(ofGetElapsedTimeMicros() * .00004);
  updateNormalMatrix();
  shader.setUniform3f("firstRow", normalMatrix.a, normalMatrix.b, normalMatrix.c);
  shader.setUniform3f("secondRow", normalMatrix.d, normalMatrix.e, normalMatrix.f);
  shader.setUniform3f("thirdRow", normalMatrix.g, normalMatrix.h, normalMatrix.i);
  cube.draw();
  ofPopMatrix();
  shader.end();
  light.draw();
  cam.end();
  glDisable(GL_DEPTH_TEST);

Argh! Nothing; it’s the same even if I update at every step.

I ran your code; you are not getting the modelViewMatrix properly. Put an ofLog in this function:

void testApp::updateNormalMatrix()
{
  ofMatrixStack matrixStack(*ofGetWindowPtr());
  ofMatrix4x4 modelViewMatrix = matrixStack.getModelViewMatrix();
  ofLog() << modelViewMatrix;
}

You’ll see that it is always the identity matrix, no matter what transformation you do.

I think the solution for the right modelViewMatrix is this:

I’m trying, but I still have problems. Now my updateNormalMatrix is this:

void testApp::updateNormalMatrix()
{
  //ofMatrixStack matrixStack(*ofGetWindowPtr());
  //ofMatrix4x4 modelViewMatrix = matrixStack.getModelViewMatrix();
  
  
  GLfloat matrix[16];
  glGetFloatv (GL_MODELVIEW_MATRIX, matrix);
  ofMatrix4x4 m(matrix);
  ofLog() << m;
  normalMatrix = mat4ToMat3(m);
  normalMatrix.invert();
  normalMatrix.transpose();
}

I’ll continue looking for a solution.

That won’t work because you are using the programmable renderer.
If you update to the latest version on GitHub, there is a function that returns the modelview matrix:
ofGetCurrentMatrix(OF_MATRIX_MODELVIEW);
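
Assuming that function is available in your build, the updateNormalMatrix() from earlier in this thread would become something like this (untested sketch; mat4ToMat3 is the helper already defined above):

```cpp
void testApp::updateNormalMatrix()
{
  // Ask the renderer for the current modelview matrix (works with the programmable pipeline)
  ofMatrix4x4 modelViewMatrix = ofGetCurrentMatrix(OF_MATRIX_MODELVIEW);
  normalMatrix = mat4ToMat3(modelViewMatrix);
  normalMatrix.invert();
  normalMatrix.transpose();
}
```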

I looked inside ofGLProgrammableRenderer.cpp, and inside the uploadMatrices method I saw that the modelViewMatrix is passed in the same way you suggested:

currentShader->setUniformMatrix4f(MODELVIEW_MATRIX_UNIFORM, matrixStack.getModelViewMatrix());

So I don’t understand why, if I pass the matrix my way, it doesn’t work. :-(
I tried making matrixStack public in the ofGLProgrammableRenderer class and using it inside my updateNormalMatrix:

  normalMatrix = mat4ToMat3(ofGetGLProgrammableRenderer()->matrixStack.getModelViewMatrix());
  normalMatrix.invert();
  normalMatrix.transpose();

and it works. So… I think this could be a solution, even if probably not the most correct one.

I have only just seen your answer. Yes, it’s the same thing I did, but done the right way!!
Thanks a lot, Ahbee, for your help and your extreme patience!!