HDR on screen - is it possible?

I’ve just posted my first HDR video to YouTube, and it looks great on my Mac laptop screen. It got me thinking: can I make HDR content in my oF app and display it on screen with the same brightness as the video?

Hey @seb_ly, I haven’t done HDR, but have you tried using an ofFbo with a format of GL_RGBA32F and the float-based types in OF (ofFloatColor, ofFloatImage, etc.)? I think this format allows values that aren’t clamped between 0.0 and 1.0. Also, this bit from LearnOpenGL might be helpful.
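A minimal sketch of that idea (the class layout is my own; it just shows that a GL_RGBA32F FBO keeps values above 1.0, it won’t make the window itself HDR):

```cpp
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofFbo fbo;

    void setup(){
        // 32 bits per channel: values outside 0.0 - 1.0 are preserved
        fbo.allocate(ofGetWidth(), ofGetHeight(), GL_RGBA32F);

        // fill the FBO texture with an "over-bright" color to show it survives
        ofFloatPixels pix;
        pix.allocate(fbo.getWidth(), fbo.getHeight(), OF_PIXELS_RGBA);
        pix.setColor(ofFloatColor(4.0f, 2.0f, 0.5f, 1.0f)); // channels > 1.0
        fbo.getTexture().loadData(pix);
    }

    void draw(){
        // blitting to the default 8-bit framebuffer clamps everything to 1.0,
        // which is where tone mapping comes in (see below)
        fbo.draw(0, 0);
    }
};
// run with a standard main(): ofCreateWindow(...) then ofRunApp(new ofApp());
```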

Hi @seb_ly
That video is so cool.
HDR content will always face the same limitation: the actual dynamic range of the display. Regardless of which display it is, the problem is almost always the same; there are not enough colors, just 8 bits per channel (some displays even go with 6 bits per channel). Now, what @TimChi suggests is very useful, as the idea there is to work with 32 bits per channel, giving you a much broader range and more detail, especially when you do calculations: if something falls outside the 0 to 1 range, it won’t get clamped. But in order to show the image, it needs to be downsampled to fit the 8 bits of the display. In HDR this process is called tone mapping, and it is where a lot of the HDR magic happens. So you might want to take a look at tone mapping and how to deal with it programmatically.
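For illustration, a minimal sketch of a Reinhard tone mapping pass over a float FBO like the one above; the shader source and names are my own, assuming OF’s programmable renderer (GLSL 150):

```cpp
// additional ofApp member: ofShader toneMap;

void ofApp::setupToneMap(){
    std::string vert = R"(
        #version 150
        uniform mat4 modelViewProjectionMatrix;
        in vec4 position;
        in vec2 texcoord;
        out vec2 texCoordVarying;
        void main(){
            texCoordVarying = texcoord;
            gl_Position = modelViewProjectionMatrix * position;
        }
    )";
    std::string frag = R"(
        #version 150
        uniform sampler2DRect tex0; // the float FBO texture, bound on unit 0
        uniform float exposure;
        in vec2 texCoordVarying;
        out vec4 outputColor;
        void main(){
            vec3 hdr = texture(tex0, texCoordVarying).rgb * exposure;
            vec3 ldr = hdr / (hdr + vec3(1.0)); // Reinhard: squeezes [0, inf) into [0, 1)
            outputColor = vec4(ldr, 1.0);
        }
    )";
    toneMap.setupShaderFromSource(GL_VERTEX_SHADER, vert);
    toneMap.setupShaderFromSource(GL_FRAGMENT_SHADER, frag);
    toneMap.bindDefaults();
    toneMap.linkProgram();
}

void ofApp::drawToneMapped(){
    toneMap.begin();
    toneMap.setUniform1f("exposure", 1.0f);
    fbo.draw(0, 0); // the fragment shader tone maps the HDR values on the way out
    toneMap.end();
}
```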


I’m not sure how “lossless” the process is in OF, from generating colors from scratch to the final output.
Maybe a wide gamut color profile should be applied for this to work, to enable the macOS window to use this range.
A shader could be used to convert from linear RGB to another color space,
and GLFW defaults to 8 bits per channel too; I’m not sure if that has to be changed.
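If you want to experiment with that last point, here is an untested sketch of requesting 10 bits per channel through ofGLFWWindowSettings in main.cpp (GLFW treats these as hints, so whether you actually get a 10-bit surface depends on the OS, driver, and display):

```cpp
#include "ofMain.h"
#include "ofApp.h"

int main(){
    ofGLFWWindowSettings settings;
    settings.setSize(1280, 720);
    settings.setGLVersion(3, 2);
    settings.redBits   = 10; // default is 8 for all four
    settings.greenBits = 10;
    settings.blueBits  = 10;
    settings.alphaBits = 2;  // 10-10-10-2 framebuffer layout
    ofCreateWindow(settings);
    ofRunApp(new ofApp());
}
```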

related: HDR10 in openFrameworks

and there is a known issue where ofFloatColor is being downsampled to 8 bits.

There are plenty of devices working at 10 bits today, like some Dell displays (I think the MacBook Pro works at 10 bits too) and LED panels: it makes a huge difference in virtual production, with banding issues etc.


How about saving an HDR image from an ofFbo? Maybe ofFloatColor is an issue for ofFloatPixels? I looked at FreeImage and it seems to support HDR formats.

Hi folks,

As far as I know, since OF uses GLFW, it does not support HDR windows. Someone posted a solution for Windows 10, but I never tested it.

To be able to save at 32 bits per channel, you need to use float buffers and save to TIFF. If I recall correctly, every other format converts to 8 bits.
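A short sketch of that path, assuming the fbo was allocated with GL_RGBA32F:

```cpp
// read float pixels back from the FBO and write them out as a 32-bit TIFF
ofFloatPixels pix;
fbo.readToPixels(pix);             // keeps the full float range
ofSaveImage(pix, "hdr_frame.tif"); // TIFF preserves float data; PNG/JPG quantize to 8 bits
```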


(chiming in, as a recent project sent me down that rabbit hole; the problem is not processing at 10 bits or more, but organizing the necessary information pipeline through the GPU output and OS window).

current MacBook Pros are 8-bit displays, but they use tricks like temporal dithering (aka FRC) and local brightness control, which benefit from a 10-bit stream, and also from dynamic HDR metadata (HDMI 2.1), which can adjust the tone map within the display as needed (e.g. the dark areas can be processed differently).

it’s hard to find specs, as this is secret sauce territory. in the case of Apple, only the Pro Display XDR - Technical Specifications page mentions 576 local dimming zones (in addition to it being the only unambiguously-advertised 10-bit Apple display), while About the Liquid Retina XDR display on iPad Pro - Apple Support boasts “over 2,500” local zones.

so each of these local zones is 8-bit, but can have its own grading, physical LED brightness, and dither settings. managing that (and making adjacent zones blend tastefully) is the display’s problem (or, in the case of an integrated hardware/software manufacturer like Apple, perhaps the OS is involved too).

letting the 2 extra bits roam out onto the HDMI cable is a mixture of software, OS, and GPU/driver. you need to do as referenced in hubris’ link. then, to apply metadata, you have to call things like NvAPI_Disp_HdrColorControl and NvAPI_Disp_SethdrColorData (Nvidia proprietary): see Displaying HDR Nuts and Bolts | NVIDIA Developer. it’s hard to find deeper docs; perhaps it’s protected IP reserved for gaming engines and the like. in all cases it’s platform-specific (OS × GPU × driver).
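for what it’s worth, a rough, untested sketch of what the NvAPI_Disp_HdrColorControl call looks like (Windows + Nvidia only; the field values here are illustrative, check them against nvapi.h):

```cpp
#include "nvapi.h"

// enable HDR10 signaling on a given display; NvAPI_Initialize() must have
// been called first, and displayId comes from the NvAPI display enumeration.
bool enableHdr(NvU32 displayId){
    NV_HDR_COLOR_DATA hdrData = {};
    hdrData.version = NV_HDR_COLOR_DATA_VER;
    hdrData.cmd     = NV_HDR_CMD_SET;
    hdrData.hdrMode = NV_HDR_MODE_UHDA; // HDR10 / ST 2084 output
    hdrData.static_metadata_descriptor_id = NV_STATIC_METADATA_TYPE_1;
    // mastering display primaries, MaxCLL / MaxFALL, etc. would go here
    return NvAPI_Disp_HdrColorControl(displayId, &hdrData) == NVAPI_OK;
}
```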

if you’re not sure whether your display is 10-bit, then it’s not.
(you’d remember paying for a true 10-bit display).
