Gamma color correction

I was recently reading this article and I was trying to apply gamma color correction to my sketch.
After reading this article:

I simply added this line, glEnable(GL_FRAMEBUFFER_SRGB);, at the beginning of my setup method, and I changed this line in the fragment shader from

vec3 texColor = texture(tex0, vTexCoord).xyz;

to

vec3 texColor = pow(texture(tex0, vTexCoord).rgb, vec3(2.2));

This is the image without gamma color correction:

And this is the image with gamma color correction:

I prefer the one with gamma correction because the image looks brighter and the contrast is more visible, but I am wondering whether I am doing it correctly, or if there is another way to do this with openFrameworks. I would like to avoid calling glEnable(GL_FRAMEBUFFER_SRGB);.

How are you doing gamma color correction in OF?


One more tutorial that links to your first one:


happy new year! :smiley:
@edapx thanks for sharing that article. super interesting.
After reading the article I made some tests, which you can find here. I think this is neither a trivial nor an unimportant issue, and it needs to be addressed properly.
From those tests, the following images illustrate the problem quite well: a simple Gaussian blur is applied to a test pattern.
The following is the test pattern without blurring.

We would expect the following if we applied a Gaussian blur, and it is what is achieved when gamma is handled properly.

But instead, what we currently get is the following (notice the dark banding between the color regions).

As I said, I think this is something important, considering the nature of openFrameworks and its use. I would really like to know what you guys think @arturo @zach @theo @elliotwoods @kylemcdonald @bakercp

all the best!


the most common way to do gamma correction is to convert texture colors to linear in a shader (the operation is the opposite of the one you posted, pow(tex.rgb, 1./2.2) if i remember well) and then do all operations, like lighting, in the shader using colors in linear space

after that you apply the gamma correction by doing pow(tex, 2.2)

this usually applies only to colors from textures, which are converted to sRGB when saved to files, which is the case for almost any image saved as 8 bits. images with 16 or 32 bits are usually already linear

also, instead of the last step to bring the colors back to sRGB, you can do tonemapping, which kind of simulates how film reacts to light, so you can get different effects. there are lots of articles about tonemapping with GLSL code; the most used algorithms are Uncharted 2 (from the game in which it was used for the first time) and nowadays ACES, the film industry standard


Thanks for replying.

The way the operations are done in what I posted is correct; otherwise you don't get the correct result and it all just gets even worse.
In this test I did it all in the shader.

It also happens when drawing into an FBO and then applying the blur over it, so it is not exclusive to rendering image files that were loaded with an sRGB profile.

Thanks for pointing out tonemapping. I was already aware of it, but I think it is a completely different thing. I am worried about the fact that handling gamma incorrectly can lead to a lot of problems and incorrect results, even in something as simple as a Gaussian blur.
So far, from the tests I made, calling glEnable(GL_FRAMEBUFFER_SRGB); seems to be the most straightforward way to deal with it, but as an OpenGL function its days are numbered on macOS, so it wouldn't be a future-proof solution.

I was answering edapx’s original post.

in any case I don't understand very well what process you've followed for each of the images you've posted.

I'm not sure how the images are related to this; it seems more like a problem with whatever blur implementation you are using?

Also, as i said, you usually apply sRGB -> linear -> calculations -> sRGB when working with 8-bit image files, which are in sRGB but you want linear to work with the color information. If you generate colors using RGB values directly, they're not in the sRGB or linear space, since you are just choosing arbitrary values

lol. sorry.

I took that pattern and used the Gaussian blur (any blur should have a similar effect) from this article.

Applying a blur to a test pattern with big color jumps will produce this banding effect as a consequence of unhandled gamma. It could happen with any other operation, but with this test it is a lot more noticeable.

Right. But the problem is that this information is not very widespread, and from the openFrameworks perspective we are not providing any tool for dealing with the sRGB-to-linear transformation, so a lot of the image processing is being done wrong. For instance, just take a look at examples/shader/09_gaussianBlurFilter and you will see these darker areas appear when some colors get blurred. A Gaussian blur should look as if you were looking at an image through some translucent, diffusing screen, where evidently this dark banding does not appear.

Yes, but it depends on how the rendering engine processes these values and where the linear-to-sRGB transformation happens. Also, I am not loading an image file for these tests; I am rendering the test pattern by drawing colored rectangles, so it has nothing to do with image file encoding/decoding.

BTW, even Adobe Photoshop gets this wrong: if you try to apply a blur to the test pattern you'll get this same dark banding issue. If you change the image mode to 32 bit, which seems to switch to linear, it applies the blur correctly, but it has nothing to do with bit depth; it is just about not handling the sRGB-to-linear transform correctly when using 8-bit images. In the newest releases of Photoshop, Adobe fixed this issue, but they kept the wrong algorithm and labeled it as “legacy”.


i don't see how the OF core could provide tools for this, being something that usually happens in a shader. we could have an additional example that shows how to gamma correct the blur to avoid these artifacts, and people who do postprocessing addons should take it into account, but i don't think there's a generic way to solve this from the core in every case

the only place where we should take this into account is in the material shaders we use in ofMaterial when using the programmable renderer, where right now we are calculating lights without transforming textures to linear. but that would require an extra step to then reconvert to sRGB, which usually is a tonemap rather than a plain linear -> sRGB, so i'm not sure it makes a lot of sense to do it without having a full postprocessing pipeline in place

I think that starting by providing appropriate examples is a good idea. I don't know how to provide tools for this either, except for a wrapper around glEnable(GL_FRAMEBUFFER_SRGB);, which I am not sure is worth it. Maybe it is, as the whole rendering pipeline has been abstracted from OpenGL.

Well, it might be useful to start with a correctly lit scene, and then leave it to the user to either just do the linear-to-sRGB conversion, or keep it linear, save it to some kind of file, and do the tonemapping in external software.