fileOpenSaveDialog example

Hi there, I’m running this example, which performs pixel sorting per line on an image, using brightness as the comparison value.
Looking at the file built after the sorting, inspecting each pixel, it appears that it is not fully sorted. I also ran this example with a sorting function that uses getLightness() instead of getBrightness(), as that method presumably uses a different, probably more perceptual, way of calculating the value.
But looking closely at the pixels, in Photoshop for instance, it appears that the sorting is not really working.
Or am I missing something?
The same happens if I sort with saturation as the comparison value.
I work on a MacBook Pro with Xcode 11.2.1 & OF 0.11.
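For reference, here is a minimal standalone sketch of what I mean by per-line sorting (a plain RGB struct stands in for ofColor / ofPixels here, so this compiles outside of oF; the names are my own, not from the example):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Plain stand-in for ofColor (8-bit channels).
struct RGB { uint8_t r, g, b; };

// HSB-style brightness: the maximum of the three channels,
// mirroring what ofColor::getBrightness() returns (0..255 here).
static uint8_t brightness(const RGB& c) {
    return std::max(c.r, std::max(c.g, c.b));
}

// Sort each row of a width*height pixel buffer independently.
void sortRowsByBrightness(std::vector<RGB>& px, int width, int height) {
    for (int y = 0; y < height; ++y) {
        auto rowBegin = px.begin() + y * width;
        std::sort(rowBegin, rowBegin + width,
                  [](const RGB& a, const RGB& b) {
                      return brightness(a) < brightness(b);
                  });
    }
}
```

After this runs, each row should be in non-decreasing brightness order from left to right.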


I can only confirm that I’m having the same results on color images (oF 0.10.1, Xcode 9.4.1, macOS 10.13.3).
Grayscale works well, however.
Converting the color image to grayscale after the pixel sorting still looks messy.
Converting the color image to grayscale before the pixel sorting makes sense.
The results from this repo seem to make more sense?

Edit: ah, the sorting doesn’t go per row in the example there.
The algorithm there is:

bool ofApp::sortingFunction(ofColor x, ofColor y){
  return ((x.a << 24) + (x.r << 16) + (x.g << 8) + x.b) < ((y.a << 24) + (y.r << 16) + (y.g << 8) + y.b);
}

Have you checked out Color.h:286? It gives some context about the functions.
getSaturation() looks nice.
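If it helps, here is a standalone sketch of that packed-integer comparison (my own struct and function names, with the channels cast to uint32_t, since with 8-bit channels promoted to int, `a << 24` overflows for alpha values ≥ 128):

```cpp
#include <cstdint>

// Plain stand-in for ofColor (8-bit channels).
struct RGBA { uint8_t r, g, b, a; };

// Pack the channels into one 32-bit key: alpha in the top byte,
// then red, green, blue.
static uint32_t packedKey(const RGBA& c) {
    return (uint32_t(c.a) << 24) | (uint32_t(c.r) << 16)
         | (uint32_t(c.g) << 8)  |  uint32_t(c.b);
}

// Comparator equivalent to the repo's sortingFunction.
bool packedLess(const RGBA& x, const RGBA& y) {
    return packedKey(x) < packedKey(y);
}
```

Note that this orders primarily by alpha, then red, then green, then blue, so the result will not look sorted by brightness at all: a fully transparent white pixel sorts before an opaque black one.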

Thanks for replying.
Yes, I did some research on ways to calculate a luminosity value from RGB values.
getBrightness() takes the highest of the r, g & b values. This is the brightness from the H.S.B. color space.
getLightness() is the mean of the three values r, g & b: the lightness from the H.S.L. space.
In perceptual spaces like L.a.b., coefficients are applied to the r, g & b values to “fit” the physiological response of the human eye. I ended up using this formula (the Rec. 601 luma weights) for the luminance:
L = 0.299*R + 0.587*G + 0.114*B
Sorting by this luminance was the most satisfying to my eye.
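As a comparator, that formula looks something like this (a minimal sketch with a plain struct in place of ofColor; the names are mine):

```cpp
#include <cstdint>

// Plain stand-in for ofColor (8-bit channels).
struct RGB { uint8_t r, g, b; };

// Perceptual luma using the Rec. 601 weights quoted above:
// green contributes most, blue least.
static float luma601(const RGB& c) {
    return 0.299f * c.r + 0.587f * c.g + 0.114f * c.b;
}

// Drop-in comparator for std::sort over a row of pixels.
bool lumaLess(const RGB& x, const RGB& y) {
    return luma601(x) < luma601(y);
}
```

With these weights pure green reads as much brighter than pure blue, which matches how the eye sees them, whereas getBrightness() treats both as equally bright (max channel = 255).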