Doing color correction on video in real time (on a Raspi)

Hello, first post! I'm getting more interested in the Raspberry Pi and openFrameworks in general, so hopefully not my last.

I have a pretty good video looping program working on the Raspberry Pi 3B+ using the ofOMXPlayer addon. I would like to do some simple color correction on the looping video in real time (rather than doing it in a video editor before loading). I just want to be able to change the video gain for the R, G and B channels with some controls in the UI. I have done a lot of work trying different options for this, and all of them have run into some hardware limitation of the Pi's GPU.

First I tried running a fragment shader, but even the examples in the /examples/shaders/ folder give me a bunch of errors along the lines of "sorry, it looks like you can't run 'ARB_shader_objects'".

Next I tried to feed the output of omxPlayer into an ofPixels object using omxplayer.getPixels so that I could run the color correction on the CPU. However, apparently some features of ofPixels (namely reading from an ofTexture into an ofPixels object using ofTexture.readToPixels) do not work on GLES, which is what the Pi uses.

Finally I tried to read the framebuffer results from omxPlayer.getPixels directly as an "unsigned char*". That version compiled successfully, but when I ran it I only saw about a split second of video before the program crashed, oddly with no error messages.
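For what it's worth, the per-channel gain itself is simple on the CPU side. Here is a minimal, self-contained sketch of the kind of loop I mean, operating on a packed RGB8 buffer like the one getPixels would hand back (the function name and buffer layout are my own assumptions, not from the addon):

```cpp
#include <algorithm>
#include <cstddef>

// Apply per-channel gain in place to a packed RGB8 buffer
// (numPixels pixels, 3 bytes each, in R,G,B order).
void applyRGBGain(unsigned char* pixels, std::size_t numPixels,
                  float gainR, float gainG, float gainB) {
    const float gains[3] = {gainR, gainG, gainB};
    for (std::size_t i = 0; i < numPixels; ++i) {
        for (int c = 0; c < 3; ++c) {
            float v = pixels[i * 3 + c] * gains[c];
            // Clamp to 0..255 so boosted channels don't wrap around.
            pixels[i * 3 + c] = static_cast<unsigned char>(
                std::min(255.0f, std::max(0.0f, v)));
        }
    }
}
```

Doing this per frame on the Pi 3B+'s CPU may be too slow at full resolution, which is part of why I'd prefer a GPU method.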

I am just wondering if someone can point me in the right direction for a color correction method that will actually run on the Raspberry Pi's GPU. All of the methods I have come across on the forum clearly assume you are running openFrameworks on a computer with a more robust GPU and a more capable version of OpenGL, and I just don't know which methods to avoid given my circumstances. I can post the most recent version of my code, but it will need to be cleaned up a bit and copied over from my Raspberry Pi, since I am posting this from my laptop (web browsing on the Pi takes forever, even on the 3B+).

Anyway, thank you for your help.

it's an advanced path, but OpenMax on the RPi does have some native capabilities to change color elements

the RPi/OpenMax also has recording capabilities, so in theory you could play a video, make color changes, and record to a new file. some of that is here

Hey! Fragment shaders should be doable; you need to set it up as a GLES project, using

 ofGLESWindowSettings settings;
 settings.glesVersion = 2;

in your main.cpp file. Also, calling ofDisableARBTex() is required, and you need to use texture2D instead of texture in the vert and frag files.
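For the RGB gain itself, the fragment shader only needs a few lines. Here's a sketch of a GLES2 (GLSL ES 1.00) frag shader using texture2D as mentioned above; the uniform and varying names are my own, so match them to whatever your vertex shader and app code actually use:

```glsl
// GLES2 fragment shader: per-channel RGB gain
precision mediump float;

uniform sampler2D tex0;        // the video frame texture
uniform vec3 gain;             // R,G,B gain set from the app each frame
varying vec2 texCoordVarying;  // passed through from the vertex shader

void main() {
    vec4 color = texture2D(tex0, texCoordVarying);
    gl_FragColor = vec4(color.rgb * gain, color.a);
}
```

From the app you would update the gain uniform between shader.begin() and drawing, e.g. with ofShader's setUniform3f, driven by your UI sliders.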

You should take a look at this thread.

ofDisableARBTex() doesn't have any effect in GLES, since rectangular textures only exist in desktop OpenGL