GL_ARB_texture_float on Intel Iris and Nvidia

Hi folks!

Got a problem here with some compute shaders. If I understand correctly, the issue lies in the GL_ARB_texture_float extension.

On the CPU, I’m creating the textures with the internal format GL_RGBA16F.
On the compute shader, the layouts are set as rgba16.
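For reference, the compute-shader side looks roughly like this (a simplified sketch; the binding index, workgroup size, and image name are arbitrary):

```glsl
#version 440
layout(local_size_x = 16, local_size_y = 16) in;

// Image layout qualifier as currently set (rgba16), bound for imageStore writes
layout(rgba16, binding = 0) uniform image2D dstTex;

void main() {
    ivec2 p = ivec2(gl_GlobalInvocationID.xy);
    imageStore(dstTex, p, vec4(1.0));
}
```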

And yet, I get nothing.

This works perfectly on two other PCs I have with AMD GPUs.
Or, if I switch to RGB8, I can see some stuff but it doesn’t have the precision I need.

I’m running on the maximum supported OpenGL version, which, because of the Intel Iris, is 4.4.
The Intel Iris drivers are also up to date, and according to them the extension is supported.

When I run the glInfoExample on the Intel Iris, it lists GL_ARB_texture_float among the available extensions. But when I check in my app, or try to enable it in the shader, it’s not available.

(Edit: Also, when I do ofGLCheckExtension("GL_ARB_texture_float") it returns false.)

(Edit 2: Forgot to mention that the computeShaderTextureExample runs fine on the Intel Iris, but that just uses GL_RGBA8 and GL_R8.)

Edit 3: Same happens on Nvidia. It seems that when it’s running on OpenGL 2.1 the extension is enabled, but not on 3.2+.

So it seems that the change for the Programmable Renderer is disabling the extension on Intel and Nvidia. Any suggestions for a next step?


Hi @hubris,

In the compute shader, are the layouts rgba16, or rgba16f? I was looking into a similar topic today (related to an Intel GPU and framebuffer float types) and remembered your post. I’m thinking the layout format (or type) needs to match the texture format, and rgba16 would be a different type (unsigned normalized integer) than the floats in the GL_RGBA16F framebuffer (I think).