Saving pixels to an image works but loading them into a texture does not

Hi, I posted this on another very old thread (here) but deleted that post and moved it here, as it did not bump the old thread to the top of the forum.

I used the code from arturo’s example here as a base to store frames in a vector of ofPixels and then retrieve them for instant playback. The storing works fine, but when I retrieve the pixels I cannot load them into a texture, even though ofSaveImage() works on the same pixels from the same place in the code. I am guessing it is something to do with the pixel format, but when I call getPixelFormat() I get RGB back.

Here is the code that passes the pixels to the thread (it is the same as the code from the example linked above):

if(record){
    if(!firstFrame){
        // wait for the thread to finish saving the
        // previous frame and then unmap it
        saverThread.waitReady();
        pixelBufferBack.unmap();
    }
    firstFrame = false;

    // copy the fbo texture to a buffer
    fbo.getTexture().copyTo(pixelBufferBack);

    // bind and map the buffer as PIXEL_UNPACK so it can be
    // accessed from the cpu on a different thread,
    // and send the memory address to the saver thread
    pixelBufferFront.bind(GL_PIXEL_UNPACK_BUFFER);
    unsigned char * p = pixelBufferFront.map<unsigned char>(GL_READ_ONLY);
    saverThread.save(p);

    // swap the front and back buffers so we are always
    // copying the texture to one buffer and reading
    // back from another to avoid stalls
    swap(pixelBufferBack, pixelBufferFront);
}

Here is the threaded function from the ImageSaverThread class (.cpp):

    void ImageSaverThread::threadedFunction(){
        // wait to receive some pixels, store them in the
        // frames vector and then tell the main thread we
        // are done; if the channel closes, exit the thread
        unsigned char * p;

        while(channel.receive(p)){
            pixels.setFromExternalPixels(p, 960, 540, OF_PIXELS_RGB);
            lock();
            frames.push_back(pixels);
            ofLogVerbose() << "Frame number " + ofToString(frames.size()) + " added";
            unlock();
            channelReady.send(true);
        }
    }

Here is the code of the playback function:

if(play){

    // this works
    ofSaveImage(saverThread.frames[frameCounter], ofGetTimestampString() + ".jpg");

    // this does not throw any errors
    drawTexture.loadData(saverThread.frames[frameCounter], width, height, GL_RGB);
    ofLogVerbose() << "Frame number " + ofToString(frameCounter) + " played of " + ofToString(saverThread.frames.size());

    // this shows a white screen
    drawTexture.draw(0, 0);

    frameCounter++;
    if(frameCounter > 50 - 1){
        frameCounter = 0;
    }
}

Is there something I need to do to the ofPixels object when I get it from the vector and load the data to the texture to make it draw properly?

I am on an updated master branch of OF.

Cheers

Fred

Hi Fred,

ofTexture::loadData(...) takes either

(const unsigned char *const data, int w, int h, int glFormat)
or
(const ofPixels &pix)

according to http://openframeworks.cc/documentation/gl/ofTexture/#show_loadData

What if you pass only the ofPixels without the other arguments, i.e.,
drawTexture.loadData(saverThread.frames[frameCounter]);

Also make sure width/height values are valid.

Yes, I have tried with and without the arguments, and I also tried ofImage::setFromPixels(). It seems like there is some kind of error with the pixels object once it is in the vector, an error that something inside ofSaveImage(ofPixels pixels) is apparently able to rectify.

Very long shot: I had a similar issue with Intel HD Graphics and PBOs. Turns out I had to use:

glBindBufferARB

Instead of

glBindBuffer

Which might be used by your pixelBufferFront object.

I had this problem (blank textures) with ofxFastFboReader, ended up reimplementing PBO transfers.

Note that this is a hack, not recommended at all, and I probably shouldn’t have done it. The ARB calls are extension functions and this one is deprecated… but in my case it just seemed to work with Intel HD Graphics for some reason.
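In case it helps, the swap looked roughly like this (a hypothetical sketch using raw GL calls instead of ofBufferObject::bind; the ARB entry points come from GLEW, which openFrameworks already links on desktop):

    // deprecated ARB variants, used only as a driver-specific workaround
    GLuint bufferId = pixelBufferFront.getId(); // ofBufferObject exposes its GL id
    glBindBufferARB(GL_PIXEL_UNPACK_BUFFER_ARB, bufferId);
    unsigned char * p = (unsigned char *) glMapBufferARB(GL_PIXEL_UNPACK_BUFFER_ARB, GL_READ_ONLY_ARB);
    saverThread.save(p);
    // later, once the thread has finished with p:
    glUnmapBufferARB(GL_PIXEL_UNPACK_BUFFER_ARB);
    glBindBufferARB(GL_PIXEL_UNPACK_BUFFER_ARB, 0);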

Cheers, I gave it a try but it did not solve anything. I do notice that if I put the pixels into a texture it shows white OR random pieces of GPU memory, and if I use an ofImage and setFromPixels() I get a black image…

Well, for ofImage at least you need to call ofImage::update() after setFromPixels() in order to upload the new pixels to the GPU. If you can, upload your ofApp and I’ll give it a try.
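Something like this (an untested sketch reusing the names from your playback code):

    ofImage img;
    img.setFromPixels(saverThread.frames[frameCounter]); // copies the pixels in
    img.update(); // uploads the new pixel data to the GPU texture
    img.draw(0, 0);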

Instead of using GL_RGB directly, it’s safer to use ofGetGlInternalFormat as the last parameter of ofTexture::loadData.

++
You can also just call ofTexture::loadData(ofPixels &)… much simpler, and it does that internally.
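i.e. something like this, where the frame from the vector supplies the format:

    ofPixels & pix = saverThread.frames[frameCounter];
    drawTexture.loadData(pix); // the GL format is derived from the ofPixels internally
    drawTexture.draw(0, 0);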

Good luck!

Have you allocated drawTexture first? ofTexture has some kind of auto-allocate in loadData, but it might not work in some cases, so it's always better to call allocate() first.
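e.g. somewhere in setup(), using the 960x540 size from your thread code:

    drawTexture.allocate(960, 540, GL_RGB);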

also i’ve seen a problem once where the buffer object would remain bound and then loadData will try to read from there instead of the pixels i think i solved that but not sure if it’s in the last official release, perhaps try using the nightly builds

Apart from that, you are downloading a texture to pixels only to upload it again into another texture. If a copy of the texture is all you want, you can loadData directly from the buffer into the other texture, and only pull the pixels when you really want to save them, which should be much faster.
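That would be something along these lines (a sketch; copyTexture here is a hypothetical second texture allocated to the same size and format as the fbo):

    // GPU-side copy, no round trip through ofPixels
    fbo.getTexture().copyTo(pixelBufferBack);
    copyTexture.loadData(pixelBufferBack, GL_RGB, GL_UNSIGNED_BYTE);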

@chuckleplant thanks again for the tip. I had already tried a lot of variations, including calling ofImage::update() after setting from pixels,

and with a texture, using any of the following has the same result (where saverThread.frames is a vector of ofPixels):

drawTexture.loadData(saverThread.frames[frameCounter], width, height, ofGetGlInternalFormat(saverThread.frames[frameCounter]));

drawTexture.loadData(saverThread.frames[frameCounter], width, height, GL_RGB);

drawTexture.loadData(saverThread.frames[frameCounter]);

drawTexture.loadData(saverThread.frames[frameCounter].getData(), width, height, ofGetGlInternalFormat(saverThread.frames[frameCounter]));

drawTexture.loadData(saverThread.frames[frameCounter].getData(), width, height, GL_RGB);

No joy.

@arturo I do allocate the texture first, and I also tried the auto-allocation - I was trying to keep the post small so I did not post all the code.

I am not so sure about what you mean in the last paragraph. I want to keep a few hundred frames in memory, and the machine I have to use only has an onboard GPU, a slow (i5) CPU and 8 GB of RAM. I thought I should keep this data in RAM, not on the GPU, hence storing the array of ofPixels and then uploading the frames one at a time as I want to draw them. I need instant playback of “recorded video” and don’t need to save it, just play it back once without delay. If I do the same operation without the pixel buffers or the thread it all works, just not quite fast enough, hence the buffer and the threading.

If I understand what you are saying, instead of a vector of ofPixels I should use a vector of ofBuffers and load the data to and from that directly?

I'll try a nightly anyway and see if it makes a difference.

Cheers

Fred

Just keep the data in ofTextures then; the card driver will download/upload things between RAM and the GPU whenever it needs more memory.
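A rough sketch of what that could look like, using the 960x540 size from earlier in the thread (the frames vector and loading straight from the buffer are assumptions, not tested code):

    vector<ofTexture> frames; // recorded frames stay GPU-side

    // while recording, copy the fbo into a new texture via the buffer
    ofTexture t;
    t.allocate(960, 540, GL_RGB);
    fbo.getTexture().copyTo(pixelBufferBack);
    t.loadData(pixelBufferBack, GL_RGB, GL_UNSIGNED_BYTE);
    frames.push_back(t); // ofTexture copies share the underlying GL texture

    // on playback, just draw the stored texture
    frames[frameCounter].draw(0, 0);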