One of my programs isn’t working well on an AMD card. I believe it could be because of the GLSL version I’m using in a shader, which is set to
#version 120. Could it be that this shader version is obsolete on that card? I’m also using the
GL_ARB_texture_rectangle extension.
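For what it’s worth, one thing AMD’s compiler tends to be stricter about is requiring the extension to be declared inside the shader itself, where NVIDIA’s compiler may let an undeclared use slide. A minimal #version 120 fragment shader sketch (the uniform name is made up):

```glsl
#version 120
// AMD's compiler generally requires this directive for sampler2DRect
// to be usable; NVIDIA often compiles the shader even without it.
#extension GL_ARB_texture_rectangle : require

uniform sampler2DRect tex; // rectangle textures use pixel coordinates

void main() {
    // texture2DRect takes non-normalized pixel coords, not 0..1 UVs
    gl_FragColor = texture2DRect(tex, gl_FragCoord.xy);
}
```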
Also, I’ve found the following quote somewhere:
Beware of NVIDIA - their GLSL compiler will even accept some HLSL syntax/keywords, and if you only ever develop or test on NVIDIA you stand a real danger of producing something that will only ever work on NVIDIA.
Maybe this is the source of the problem: I develop on NVIDIA cards and never see these issues. Are there any guidelines for avoiding this? I’d like to upgrade my shaders to a newer version and make them fully compatible with any card.
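As a rough guideline, sticking strictly to what the GLSL 1.20 spec allows avoids most of the NVIDIA-only traps. A sketch of the kind of thing to watch for (the specific NVIDIA-isms are from memory, so treat them as examples rather than an exhaustive list):

```glsl
#version 120

uniform sampler2D tex;

void main() {
    // Portable: explicit float literals and spec types only.
    // ("float scale = 1;" is legal in 1.20 but rejected by strict
    // pre-1.20 compilers, so being explicit costs nothing.)
    float scale = 0.5;
    vec2 uv = gl_TexCoord[0].st;
    vec4 color = texture2D(tex, uv) * scale;

    // NVIDIA's compiler has historically accepted Cg/HLSL-style
    // keywords such as "half" or "saturate()"; AMD's compiler will
    // reject them, so avoid anything not in the GLSL spec for your
    // declared #version.
    gl_FragColor = clamp(color, 0.0, 1.0); // spec function, not saturate()
}
```

The most reliable guideline, though, is simply to compile-test the shaders on an AMD (and ideally Intel) driver at some point, since their compilers enforce the spec more strictly.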
Could you be a bit more specific in describing “not working”? Are you seeing a compile-time, runtime, visual, or some other type of error?
Without any real details to work with, it is difficult to say what you might be running into, though I doubt the AMD card doesn’t support GLSL 1.20. If you are using any exotic extensions, you should check that they are available on all the GPUs you are supporting. A really simple way is via OpenGL Extensions Viewer, or you can check at runtime in your OF app; see examples/gl/glInfoExample.
Hi, thanks for the reply. The problem is I can’t really test on the AMD machine.
Actually, I still need to dig into the code to see what exactly is happening. I thought I understood what the problem was, but I believe that’s not the case…
I will try to test the glInfoExample on that machine though, that might shed some light. Thanks!