Does anybody have experience dealing with integer lookups in shaders?
For example, I want a lookup table of 16-bit integer values. They should be stored on the GPU as 16-bit unsigned (as with GL_LUMINANCE16), and when I access them in the shader they should come back as integers (unlike GL_LUMINANCE16 / sampler2DRect / texture2DRect, which return floating-point values).
It seems I should use GL_LUMINANCE16UI_EXT / usampler2DRect / #version 140.
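Here's roughly the allocation I have in mind, going through raw GL instead of ofTexture (the 256x256 size, the rectangle target, and the function name are just placeholders for my actual setup):

```cpp
#include "ofMain.h"
#include <vector>

// Sketch: allocate a 16-bit unsigned integer lookup texture
// (GL_EXT_texture_integer), bypassing ofTexture since
// ofGetGlFormatAndType() doesn't recognize this internal format.
GLuint allocateIntegerLut(const std::vector<unsigned short>& table) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
    // integer textures can't be filtered, so NEAREST is mandatory
    glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0,
                 GL_LUMINANCE16UI_EXT,       // 16-bit unsigned integer storage
                 256, 256, 0,
                 GL_LUMINANCE_INTEGER_EXT,   // integer client format, no normalization
                 GL_UNSIGNED_SHORT, table.data());
    return tex;
}
```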
But I run into a couple of issues there:
- [error] ofGetGlFormatAndType(): glInternalFormat not recognized returning glFormat as glInternalFormat
- How do I get texture coordinates in #version 140? Do I need to set up a manual varying (sketched below), or is there a way to recover the default texture coordinates?
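In case a manual varying is the answer, this is what I imagine it would look like. Untested, and the attribute/uniform names (position, texcoord, modelViewProjectionMatrix, lut) are my own; they'd have to be bound from the app side:

```glsl
#version 140

// vertex shader: the built-in gl_MultiTexCoord0 / gl_TexCoord[] pipeline
// is gone in 140, so I pass my own attributes and varying
in vec4 position;
in vec2 texcoord;
uniform mat4 modelViewProjectionMatrix;
out vec2 vTexCoord;   // user-defined varying replacing gl_TexCoord[0]

void main() {
    vTexCoord = texcoord;
    gl_Position = modelViewProjectionMatrix * position;
}
```

```glsl
#version 140

// fragment shader
uniform usampler2DRect lut;  // unsigned-integer sampler: texture() returns uvec4
in vec2 vTexCoord;           // rect textures take unnormalized pixel coordinates
out vec4 fragColor;

void main() {
    uint value = texture(lut, vTexCoord).r;  // exact 16-bit value, no float conversion
    fragColor = vec4(vec3(float(value) / 65535.0), 1.0);  // just to visualize it
}
```

The fragment output here only visualizes the value; the point is that texture() on a usampler2DRect returns a uvec4, so value stays an exact integer.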
Thanks!
Elliot