ofPixel [error] - image type not supported

Hi there,

I’m having an issue with the allocation of the ofPixels class.

texture.allocate(hapCam.getTexture()->getTextureData(), GL_RGBA, GL_UNSIGNED_BYTE);

	if (texture.bAllocated() && !pixels.isAllocated())
	{
		pixels.allocate(texture.getWidth(), texture.getHeight(), OF_IMAGE_COLOR);
	}	

If I start my program with the Windows debugger I don’t get any error messages, but if I start it normally without the debugger I get a runtime message: “ofPixels: image type not supported”.

I don’t know how to fix that, other than setting the log level to OF_LOG_FATAL_ERROR, which just hides the message.

I did find a topic with the same issue from 2016, but there was nothing useful in it either.

The third parameter of pixels.allocate in your code is 3. It is best if you use one of the defines, so it is more readable:

OF_IMAGE_GRAYSCALE
OF_IMAGE_COLOR
OF_IMAGE_COLOR_ALPHA

Thanks for your tip. It would be even better if it could resolve my problem :sweat_smile:

Is the error in the lines you posted, or further down in the code with some readToPixels or something else?

Unfortunately I can’t tell exactly where this error occurs because, as I said, it only happens when I execute the code without the debugger attached.

In fact, the following code comes right after:

				if (pixels.isAllocated())
				{
					texture.readToPixels(pixels);
					image.setFromPixels(pixels);
					image.setImageType(OF_IMAGE_GRAYSCALE);
					grayscaleImg = image;
				}

I thought it might have something to do with the code in my first question.
The variable grayscaleImg is of the ofxCvGrayscaleImage class. I use it to do some thresholding, set ROIs, etc.

So thanks to you @dimitre I was able to track down the origin of that error.

If I read my texture into pixels like so,

texture.readToPixels(pixels);

the function ofTexture::readToPixels() allocates the pixels object again, as follows:

void ofTexture::readToPixels(ofPixels & pixels) const {
#ifndef TARGET_OPENGLES

	pixels.allocate(texData.width, texData.height, ofGetImageTypeFromGLType(texData.glInternalFormat));
	ofSetPixelStoreiAlignment(GL_PACK_ALIGNMENT,pixels.getWidth(),pixels.getBytesPerChannel(),pixels.getNumChannels());
	glBindTexture(texData.textureTarget,texData.textureID);
	glGetTexImage(texData.textureTarget,0,ofGetGLFormat(pixels),GL_UNSIGNED_BYTE, pixels.getData());
	glBindTexture(texData.textureTarget,0);
#endif
}

The result of ofGetImageTypeFromGLType(texData.glInternalFormat) in my case is 3, which is the defined constant OF_IMAGE_UNDEFINED.

And when I looked into the ofPixelFormatFromImageType() method, which is called when allocating an ofPixels object, there is a switch with a default case of ofLog(OF_LOG_ERROR, "ofPixels: image type not supported"):

static ofPixelFormat ofPixelFormatFromImageType(ofImageType type){
	switch(type){
	case OF_IMAGE_GRAYSCALE:
		return OF_PIXELS_GRAY;
		break;
	case OF_IMAGE_COLOR:
		return OF_PIXELS_RGB;
		break;
	case OF_IMAGE_COLOR_ALPHA:
		return OF_PIXELS_RGBA;
		break;
	default:
		ofLog(OF_LOG_ERROR,"ofPixels: image type not supported");
		return OF_PIXELS_UNKNOWN;
	}
}

So it must have something to do with the texture I’m getting from the ofxHapPlayer class.

My solution would be to just change the source code of the ofTexture::readToPixels() method to:

void ofTexture::readToPixels(ofPixels & pixels) const {
#ifndef TARGET_OPENGLES
	if (!pixels.isAllocated()) {
		pixels.allocate(texData.width, texData.height, ofGetImageTypeFromGLType(texData.glInternalFormat));
	}
	ofSetPixelStoreiAlignment(GL_PACK_ALIGNMENT,pixels.getWidth(),pixels.getBytesPerChannel(),pixels.getNumChannels());
	glBindTexture(texData.textureTarget,texData.textureID);
	glGetTexImage(texData.textureTarget,0,ofGetGLFormat(pixels),GL_UNSIGNED_BYTE, pixels.getData());
	glBindTexture(texData.textureTarget,0);
#endif
}

Hey this solution will work but I’m wondering if it might be better to leave ofTexture::readToPixels() alone and either change the internal format, or maybe just draw hapCam into an ofFbo and then use that texture to .readToPixels(). Maybe the problem lies in how texture is allocated in this particular case with hapCam:

I don’t really see any issues with ofTexture::readToPixels() as it is, and it could be problematic to assume that any previously allocated pixels object will work properly with this function.

It helps when you post more complete code. I like the suggestion @TimChi gave you of drawing inside an FBO.
I’m not sure, but I suppose ofxHapPlayer uses a shader to correct the colors in its internal texture.

And again, we don’t know from your code what grayscaleImg is, or which format it is set to.

Hey @TimChi, sorry for the late reply, but I see what you are saying, and I was assuming the same thing after digging deeper into the allocation of my ofTexture object.

As you said, the internalFormat of the texture object seems off.

cout << texture.texData.glInternalFormat << endl;

returns: 0x83F0

And after looking in the glew.h header I still have no clue; I don’t think it is supposed to be that particular hex number.

How do I change the internalFormat?

I have never worked with FBOs, and I don’t know how much work it would be to switch to one…

As I said, grayscaleImg is just an ofxCvGrayscaleImage.

Unfortunately I am not able to post my whole code…

But that’s the part of my ofApp.h where grayscaleImg is defined:

ofApp.h

	ofImage image;
	ofxCvGrayscaleImage grayscaleImg;

	ofImage sheetImg;
	ofxCvGrayscaleImage grayscaleSheetImg;
	ofxCvGrayscaleImage transformedSheetImg;
	ofImage tempTransformed;

ofApp::update ()

	if (connected)
	{
		mosquitto_loop(mosq, 0, 1);
	}

	if (startProgram && !movieDone)
	{
		if (hapCam.isLoaded() && (PUBACK || !connected))
		{
			if (currentFrame == hapCam.getTotalNumFrames())
			{
				std::cout << "Movie is Done. Closing Player..." << endl;
				hapCam.close();
				std::cout << "Player closed." << endl;
				movieDone = true;
			}

			// Load the new frame
			if (hapCam.isFrameNew() && !movieDone)
			{
				texture.allocate(hapCam.getTexture()->getTextureData(), GL_RGB, GL_UNSIGNED_BYTE);

				if (texture.bAllocated() && !pixels.isAllocated())
				{
					pixels.allocate(texture.getWidth(), texture.getHeight(), OF_IMAGE_COLOR);
				}

				if (pixels.isAllocated())
				{
					texture.readToPixels(pixels);
					image.setFromPixels(pixels);
					image.setImageType(OF_IMAGE_GRAYSCALE);
					grayscaleImg = image;
				}

				if (grayscaleImg.bAllocated)
				{
					grayscaleImg.threshold(grayscaleTreshhold);

					sheetContour.findContours(grayscaleImg);
				}
			}

			hapCam.nextFrame();
			currentFrame++;
		}
   .
   .
   .
   .
   }

Up to the line where it says grayscaleImg = image I hadn’t even allocated it, so there was no change to that particular variable.

P.S.: If I allocate texture like so, which would be an option too:

texture.allocate(hapCam.getTexture()->getTextureData());

I get other errors which say:

[ error ] ofGLUtils: ofGetGLTypeFromInternal(): unknown internal format 33776, returning GL_UNSIGNED_BYTE
[ error ] ofGLUtils: ofGetGLFormatFromInternal(): unknown internal format 33776, returning GL_RGBA

It’s kind of the same issue with the ofxHapPlayer class.
The number 33776 is 0x83F0 in hex, which I mentioned in my previous reply.

I think I’d try the following to simplify and also maybe solve the issue:

if (hapCam.isFrameNew() && !movieDone)
{
    hapCam.getTexture().readToPixels(image.getPixels());
    image.setImageType(OF_IMAGE_GRAYSCALE);
    grayscaleImg = image;

    // then do the rest of the stuff in ofApp::update()
}

This should work, because (hopefully) the ofPixels in image will be (correctly) allocated again by .readToPixels().

If you don’t need the ofTexture in image, you can set it to just use its ofPixels with image.setUseTexture(false);.

Also, hapCam.getTexture()->getTextureData() seems different, but I haven’t looked at the addon and maybe it does return a pointer. Normally with ofTexture, the call to .getTexture() returns a reference to the texture, and then you can keep going with .getTextureData() to get the ofTextureData object. So the allocation would look like:

texture.allocate(hapCam.getTexture().getTextureData(), GL_RGB, GL_UNSIGNED_BYTE);

Or if you don’t need the texture data from hapCam, just allocate like this:

texture.allocate(hapCam.getWidth(), hapCam.getHeight(), GL_RGBA);

The ofFbo class is awesome as it lets you render at will. Adding it maybe wouldn’t be too difficult. It would also allow you to add an ofShader to make a grayscale at the same time.

// in ofApp.h
ofFbo fboHap;

// in ofApp::setup()
fboHap.allocate(hapCam.getWidth(), hapCam.getHeight(), GL_RGBA);

// in ofApp::update()
if (startProgram && !movieDone)
{
    fboHap.begin();
    ofClear(ofColor(0));  // optional clear the previous render if you like
    hapCam.draw(0.f, 0.f);
    fboHap.end();

    // now use fboHap in place of hapCam
}

Hi @TimChi,

so the first solution you presented did not work. I only got some texture errors and access violations.

if (hapCam.isFrameNew() && !movieDone)
{
    hapCam.getTexture().readToPixels(image.getPixels());
    image.setImageType(OF_IMAGE_GRAYSCALE);
    grayscaleImg = image;

    // then do the rest of the stuff in ofApp::update()
}

And with the FBO I think you might be right. I’ve just never worked with that class, so maybe it’ll help me with this particular problem.

At least now I know how to use it (I’d never seen before that something needs to be drawn in the ofApp::update() function).

Update:

Using FBOs was the solution!!! Plus, my program runs faster now.

Thank you @dimitre and @TimChi for your time and effort.

Grazie
