ofImage img;
ofPixels pix;
img.load("rainbow50x50.jpg");
img.getTexture().readToPixels(pix);
for (size_t i = 0; i < pix.size(); i++) {
    cout << i << " color = " << pix.getColor(i) << endl;
}
Why is the ofPixels object so much larger (pix.size() = 7500) than the w*h of the image (50 * 50 = 2500)?
When I call getColor(size_t index) and print all the colors, the pixels appear to be reordered compared to how they are rendered on-screen. For example, pix.getColor(0) returns red, but pix.getColor(1) returns blue (0, 0, 254), even though the first 10 rows of my image are clearly red, so I'd expect the first 500 indices to return red. How are pixels ordered in an ofPixels object?
Hi,
1.) The documentation states: "This gives you the number of values that the ofPixels object contains." So it treats an RGB pixel as three values. Total: 50 * 50 * 3 = 7500.
So the number of pixels would be pix.size() / pix.getNumChannels()
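A quick way to check that relationship yourself (a sketch, reusing the image from your post; the printed values assume a 3-channel RGB JPEG):

ofImage img;
ofPixels pix;
img.load("rainbow50x50.jpg");
img.getTexture().readToPixels(pix);

// size() counts individual channel values, not pixels
cout << "size        = " << pix.size()           << endl; // 7500
cout << "channels    = " << pix.getNumChannels() << endl; // 3 for RGB
cout << "pixel count = " << pix.size() / pix.getNumChannels() << endl; // 2500 = 50 * 50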
2.) I can’t answer your question, since I don’t know how images / ofPixels order their data.
But I usually go this way (which prints the first row as red, etc):
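Something along these lines, looping over rows and columns and reading each pixel with the x/y overload of getColor (a sketch; the loop assumes the same 50x50 image):

ofImage img;
ofPixels pix;
img.load("rainbow50x50.jpg");
img.getTexture().readToPixels(pix);

for (size_t y = 0; y < pix.getHeight(); y++) {
    for (size_t x = 0; x < pix.getWidth(); x++) {
        // getColor(x, y) looks up the pixel by coordinate, so you
        // don't need to know the internal channel layout
        cout << x << ", " << y << " color = " << pix.getColor(x, y) << endl;
    }
}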
Thanks @Jildert, not sure why I hadn't checked the docs for ofPixels::size() previously. I guess it's not super critical to know how ofPixels orders its data if one can just use an x/y position with getColor() and setColor().
Although you can use getColor and setColor, it is a lot faster to manipulate the values directly. ofImage data is ordered by rows, from top to bottom. What might change is the ordering of the color values, which depends on several different things.
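As a rough illustration of that direct access (a sketch, not a definitive recipe; it assumes pix already holds the image and that the format is 8-bit RGB, i.e. pix.getPixelFormat() == OF_PIXELS_RGB):

size_t w        = pix.getWidth();
size_t channels = pix.getNumChannels();

// with row-major ordering, the first value of pixel (x, y) sits at
// (y * w + x) * channels in the raw data
size_t x = 10, y = 20;                 // some pixel coordinate
size_t index = (y * w + x) * channels;

unsigned char * data = pix.getData();
unsigned char r = data[index];
unsigned char g = data[index + 1];
unsigned char b = data[index + 2];

// writing is just as direct, e.g. turn that pixel white
data[index] = data[index + 1] = data[index + 2] = 255;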