I’m quite confused about the behaviour of a point light. I’m drawing a sphere and applying a simple shader to it.
This is the setup of the scene:

```cpp
sphere.set(100, 100); // radius, resolution
ofVec3f lightPos = ofVec3f(130, 130, 130);
light.setPointLight();
// see light docs http://openframeworks.cc/documentation/gl/ofLight/
// https://forum.openframeworks.cc/t/understanding-oflight/26230/3
light.setup();
light.enable();
light.setPosition(lightPos);
```
This is my vertex shader:
```glsl
#version 150

uniform mat4 modelViewProjectionMatrix;

in vec4 position;
in vec4 normal;

out vec4 vPosition;
out vec3 vNormal;

void main() {
    vNormal = normal.xyz;
    vPosition = modelViewProjectionMatrix * position;
    gl_Position = vPosition;
}
```
and this is my fragment shader:
```glsl
#version 150

uniform vec3 lightPos;
uniform vec4 materialColor;

in vec4 vPosition;
in vec3 vNormal;

out vec4 vFragColor;

void main() {
    vec4 color = vec4(materialColor);
    vec3 lightDirection = normalize(lightPos - vPosition.xyz);
    float dProd = max(0.3, dot(vNormal, lightDirection));
    vec4 colorWithLight = vec4(vec3(dProd) * vec3(color), 1.0);
    vFragColor = colorWithLight;
}
```
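The lighting term in this fragment shader is a clamped Lambert factor. To make the discussion concrete, here is the same math as a plain C++ sketch (the `Vec3` helpers and the `lambert` function are mine, purely for illustration):

```cpp
#include <algorithm>
#include <cmath>

// Minimal 3D vector, just enough to mirror the shader math.
struct Vec3 {
    float x, y, z;
};

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Same computation as in the fragment shader:
// max(0.3, dot(normal, normalize(lightPos - position)))
float lambert(Vec3 lightPos, Vec3 position, Vec3 normal) {
    Vec3 lightDirection = normalize(sub(lightPos, position));
    return std::max(0.3f, dot(normal, lightDirection));
}
```

For a sphere point whose normal points straight at the light this yields 1.0, and a point facing away clamps to the 0.3 ambient floor — but only if `position` and `lightPos` are expressed in the same coordinate space, which is exactly where my shader seems to go wrong.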
The code runs, but the lighting looks wrong: the brightest part of the sphere is not the part facing the light.
If I change the value of vPosition being passed to the fragment shader like this:

```glsl
void main() {
    vNormal = normal.xyz;
    vPosition = position;
    gl_Position = modelViewProjectionMatrix * position;
}
```
the light looks correct. But now I’m passing the position of the vertex to the fragment shader without applying the model-view-projection matrix transformation, and that feels weird.
I assume that when calculating the light direction:

```glsl
vec3 lightDirection = normalize(lightPos - vPosition.xyz);
```

both the light position and the vertex position have to be in the same coordinate space, but I do not know which space is the right one to use.
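To illustrate what I mean, the mismatch can be reproduced with plain vector math: if a transform is applied to the vertex position but the light position is left behind, the diffuse term no longer matches the geometry; applying the same transform to position, normal, and light leaves it intact. A small self-contained C++ sketch (all names are mine, and a 90° rotation stands in for the model-view transform — note that for rotations the normal can be rotated directly, while a general model-view matrix with non-uniform scale would need a normal matrix):

```cpp
#include <cmath>

struct V3 { float x, y, z; };

float dot3(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
V3 norm3(V3 v) {
    float len = std::sqrt(dot3(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Rotate 90 degrees around the Y axis: (x, z) -> (z, -x).
V3 rotY90(V3 v) { return {v.z, v.y, -v.x}; }

// Diffuse term as in the fragment shader (without the ambient clamp).
float diffuse(V3 lightPos, V3 position, V3 normal) {
    V3 toLight = {lightPos.x - position.x,
                  lightPos.y - position.y,
                  lightPos.z - position.z};
    return dot3(normal, norm3(toLight));
}

// Vertex on the +Z side of a sphere, light directly ahead of it.
const V3 p{0.0f, 0.0f, 100.0f};
const V3 n{0.0f, 0.0f, 1.0f};
const V3 light{0.0f, 0.0f, 200.0f};

// Everything transformed into the same space: the surface still
// faces the light, so the diffuse term is exactly 1.
float consistent() {
    return diffuse(rotY90(light), rotY90(p), rotY90(n));
}

// Vertex and normal transformed but the light left in the old space:
// the result no longer reflects the actual geometry.
float mismatched() {
    return diffuse(light, rotY90(p), rotY90(n));
}
```

`consistent()` returns 1.0 while `mismatched()` goes negative, which looks very much like what happens between my two vertex shader versions.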