Best way to use integer lookup table in shader

#1

I am using a large-ish lookup table of integers (4k values, 16 bits each) in a shader. My current approach is to use an integer texture (GL_R16I) and read from it with texelFetch() in the shader. This works, but it required a nasty hack in openFrameworks (making ofGetGLFormatFromInternal return GL_RED_INTEGER for GL_R16I so that I don’t get normalized values on fetch). Is there a better approach?
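
For reference, here is a minimal sketch of what I mean, using raw OpenGL rather than the ofTexture path (the function name and sizes are just illustrative):

```cpp
#include <cstdint>
#include <vector>
// GL types/enums come from openFrameworks or whatever loader you already use.

// Upload a 4096-entry, 16-bit signed lookup table as a 1-row GL_R16I texture.
// (Name and dimensions are illustrative, not from actual project code.)
GLuint makeLutTexture(const std::vector<int16_t>& lut) // lut.size() == 4096
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // Integer textures cannot be filtered; use NEAREST.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // With an integer internal format (GL_R16I), the pixel-transfer format
    // must be the integer variant, GL_RED_INTEGER, not GL_RED.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_R16I,
                 (GLsizei)lut.size(), 1, 0,
                 GL_RED_INTEGER, GL_SHORT, lut.data());
    return tex;
}

// On the shader side the sampler is an isampler2D, and texelFetch returns
// the raw integer values (no normalization):
//
//   uniform isampler2D lut;
//   ...
//   int value = texelFetch(lut, ivec2(index, 0), 0).r;
```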

#2

It might just be that the return value for that constant should be what you are pointing out. Can you open an issue on GitHub explaining your modification and why it works in a little more detail? We might just need to include it in OF somehow.

#3

Hmm, some reading of the OpenGL docs left me a bit unsure. It seems that when internalFormat is an integer format (GL_*I or GL_*UI), the format argument on calls to glTexSubImage2D has to be the corresponding *_INTEGER format (e.g. GL_RED_INTEGER for GL_R16I).
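
To make that concrete, here is a sketch of the kind of mapping I think ofGetGLFormatFromInternal would need to return. The function name and the set of cases here are just me spelling out the rule, not the actual OF source:

```cpp
// internalFormat -> pixel-transfer format, per the rule above.
GLenum formatFromInternal(GLenum internalFormat)
{
    switch (internalFormat) {
        // Integer internal formats need the *_INTEGER transfer formats.
        case GL_R16I:    case GL_R16UI:
        case GL_R32I:    case GL_R32UI:
            return GL_RED_INTEGER;
        case GL_RG16I:   case GL_RG16UI:
            return GL_RG_INTEGER;
        case GL_RGBA16I: case GL_RGBA16UI:
            return GL_RGBA_INTEGER;
        // Normalized formats keep the plain transfer formats.
        case GL_R8: case GL_R16:
            return GL_RED;
        default:
            return GL_RGBA;
    }
}

// So uploading the GL_R16I lookup table would look like:
//   glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, 1,
//                   GL_RED_INTEGER, GL_SHORT, data);
```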

This is my understanding of [https://www.khronos.org/opengl/wiki/GLAPI/glTexImage2D] and [https://www.khronos.org/opengl/wiki/Image_Format]. It’s probably a bug in ofx; I can do a PR in the next few days.