glDrawPixels instead of ofImage

I need to draw an ofImage texture (inside ofApp::draw())
using only raw OpenGL code (skipping the GL renderer).

glGenTextures(1, &tex.getTextureData().textureID);
glBindTexture(GL_TEXTURE_2D, tex.getTextureData().textureID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glBegin(GL_QUADS);
glTexCoord2i(0, 0); glVertex2i(0,   0);
glTexCoord2i(0, 1); glVertex2i(0,   300);
glTexCoord2i(1, 1); glVertex2i(500, 300);
glTexCoord2i(1, 0); glVertex2i(500, 0);
glEnd();

It would be greatly appreciated if anybody could help… :slight_smile:

Hi, it’ll be easier to help if you paste in a minimal example. Something like this: Create an image with ofFbo and ofImage bigger than the screen

To have something to draw, I assume you are loading an image file into an ofImage beforehand?

You are doing glGenTextures above, which generates a new texture ID that then has no image data uploaded to it; you want to reuse the ID that OF already allocated when it loaded the image.

One thing missing from the pasted code is:
glEnable( GL_TEXTURE_2D );
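
Something like this might do it (an untested sketch, assuming the texture was allocated as GL_TEXTURE_2D; with OF's default GL_TEXTURE_RECTANGLE_ARB you'd use pixel texture coordinates instead):

// untested sketch: reuse the texture OF already uploaded for the image;
// no glGenTextures here, that would swap in a fresh texture with no data
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex.getTextureData().textureID);

glBegin(GL_QUADS);
glTexCoord2f(0, 0); glVertex2i(0,   0);
glTexCoord2f(0, 1); glVertex2i(0,   300);
glTexCoord2f(1, 1); glVertex2i(500, 300);
glTexCoord2f(1, 0); glVertex2i(500, 0);
glEnd();

glBindTexture(GL_TEXTURE_2D, 0);
glDisable(GL_TEXTURE_2D);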

hi!

…yes, I am loading an image into an ofImage…

but then I want to draw the texture using OpenGL directly, skipping ofTexture and ofGLRenderer…

basically:

ofApp.h

ofImage tex;

ofApp.cpp:

setup:

ofLoadImage(tex, "img.png");

draw:

/// code needed :slight_smile:

Here is another example I tried (taken from the GLFW examples):
//censored for embarrassing code

this is an improvement…

GLuint text2D;

text2D = tex.getTextureData().textureTarget; // the texture target enum (GL_TEXTURE_2D or GL_TEXTURE_RECTANGLE_ARB), not the image data

glEnable(text2D);                                       // enable that texture target
glBindTexture(text2D, tex.getTextureData().textureID);  // bind the texture

shouldn’t this be working??

text2D = TEST_IMAGE.getTextureReference().getTextureData().textureTarget; // the texture target enum again

The reason I am trying to do this: in some scenarios on Windows the current OF implementation doesn't draw the texture, so I just want to create the simplest GL version as a fallback.

Hi, again: without a complete but minimal example it's hard to know if there is anything else going on elsewhere that is stopping it from working.

One thing that I spot is that unless your texture was initialised as a GL_TEXTURE_2D (coordinates 0…1) instead of OF's default GL_TEXTURE_RECTANGLE_ARB (coordinates 0…pixel width), the 0/1 texture coordinates in your quad won't be correct.

Put an ofDisableArbTex() call before you load the image and you will get a GL_TEXTURE_2D.
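
For example, a minimal sketch:

// in setup(), before the image is loaded:
ofDisableArbTex();            // textures allocated after this are GL_TEXTURE_2D
ofLoadImage(tex, "img.png");  // texture coordinates now run 0…1 instead of 0…width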

ah, I see…
Thanks, Andreas!

I think I got it right:

ofMesh quad;
quad.getVertices().resize(4);
quad.getTexCoords().resize(4);
quad.setMode(OF_PRIMITIVE_TRIANGLE_FAN);

// position and size of the quad on screen
float x = 0, y = 0, z = 0;
float w = 483, h = 374;

GLfloat px0 = x;
GLfloat py0 = y;
GLfloat px1 = w + x;
GLfloat py1 = h + y;
if (ofGetRectMode() == OF_RECTMODE_CENTER){
	px0 -= w/2;
	py0 -= h/2;
	px1 -= w/2;
	py1 -= h/2;
}

GLfloat anchorX = 0; // anchor offsets; must be initialised
GLfloat anchorY = 0; // before the subtractions below

px0 -= anchorX;
py0 -= anchorY;
px1 -= anchorX;
py1 -= anchorY;

// texture coordinates in pixels, as GL_TEXTURE_RECTANGLE_ARB expects
GLfloat tx0 = 0;
GLfloat ty0 = 0;
GLfloat tx1 = 483;
GLfloat ty1 = 374;

quad.getVertices()[0].set(px0, py0, z);
quad.getVertices()[1].set(px1, py0, z);
quad.getVertices()[2].set(px1, py1, z);
quad.getVertices()[3].set(px0, py1, z);

quad.getTexCoords()[0].set(tx0, ty0);
quad.getTexCoords()[1].set(tx1, ty0);
quad.getTexCoords()[2].set(tx1, ty1);
quad.getTexCoords()[3].set(tx0, ty1);

glEnable(TEST_IMAGE.getTextureReference().getTextureData().textureTarget);
glBindTexture(TEST_IMAGE.getTextureReference().getTextureData().textureTarget, (GLuint)TEST_IMAGE.getTextureReference().getTextureData().textureID);

quad.draw();

glBindTexture(TEST_IMAGE.getTextureReference().getTextureData().textureTarget, 0);
glDisable(TEST_IMAGE.getTextureReference().getTextureData().textureTarget);

now off I go to figure out why it doesn't draw on that machine…
Something tells me that it is not OF-related…

by the way…

Why on earth do we call

glActiveTexture(GL_TEXTURE0);

before drawing inside ofTexture::drawSubsection()??

this is the line that causes the crash…

and when removed it doesn’t seem to affect anything…

???
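
(One guess, untested: glActiveTexture only exists from GL 1.3, or via the ARB_multitexture extension, so on a plain 1.1 context the function pointer GLEW resolves is probably NULL and calling it crashes. Presumably the call is there to make sure the following glBindTexture lands on texture unit 0. A guarded call along these lines might avoid the crash:)

// untested guess: only touch the texture unit if the context exposes it
if (GLEW_VERSION_1_3){
    glActiveTexture(GL_TEXTURE0);
} else if (GLEW_ARB_multitexture){
    glActiveTextureARB(GL_TEXTURE0_ARB);
}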

Hi Andreas!

hey, can you take a look at the following images??

so…

the exact same code, the same program:

executed locally on the machine, it works…
executed over Remote Desktop, everything works OK… except rendering the image…

Apparently Remote Desktop defaults to OpenGL 2.2… right??? …or maybe GL 0.2 :smile:

But shouldn't OpenGL 2.2 be able to draw an image with the method we use in OF?

Is there another way to draw the pixels so that it renders under that version and can appear over Remote Desktop??

maybe glDrawPixels???

tried this:

glDrawPixels(374, 483, GL_RGB, GL_FLOAT, TEST_IMAGE.getPixels());
glDrawPixels(374, 483, GL_COLOR_INDEX, GL_UNSIGNED_INT, TEST_IMAGE.getPixels());

but no luck; what format/type is getPixels() encoded as???

PS: my image is RGB, with no alpha channel

lol, so this

glDrawPixels(TEST_IMAGE.getWidth()-1, TEST_IMAGE.getHeight()-1, GL_RGB, GL_UNSIGNED_BYTE, TEST_IMAGE.getPixels());

sort of draws the image…

I'll try to see if it renders over Remote Desktop and then figure out the minor details…

yeeeeyyyyy!!!

it does!!! :smile:

OK, so apparently Remote Desktop defaults to GL 1.1, not 2.2.

anyway…
What I plan to do so that I can use ofImage with Remote Desktop and GL 1.1:

add a function drawGlLegacy(ofImage &) inside ofGLRenderer, and

add an if statement inside ofGLRenderer: if the version is 1.1, call the legacy function (rough sketch below).
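
Rough sketch of the dispatch (untested; parsing glGetString(GL_VERSION) is my assumption for how to detect the 1.1 context):

// inside the renderer's image-draw path: fall back to glDrawPixels when
// the context only reports GL 1.1 (e.g. Remote Desktop's software renderer)
const char * version = (const char *)glGetString(GL_VERSION);
if (version != NULL && strncmp(version, "1.1", 3) == 0){
    drawGlLegacy(image);  // the legacy glDrawPixels path
} else {
    image.draw(0, 0);     // the normal textured path
}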

here are some minor queries I have:

What is the best way to call glDrawPixels

so that I can still use:

ofTranslate
ofScale
ofRotate
ofColor
blend
alpha channel etc.

just the basic stuff…
I can't even control where the image is placed on the screen:
any ideas???

glDrawPixels(
TEST_IMAGE.getWidth()-1,
TEST_IMAGE.getHeight()-1,
GL_RGB,
GL_UNSIGNED_BYTE,
TEST_IMAGE.getPixels());
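
From reading the GL 1.1 docs (so take this as an untested sketch): glRasterPos is transformed by the current modelview and projection matrices, so ofTranslate/ofPushMatrix should move the image; glPixelZoom covers axis-aligned scaling plus the vertical flip; rotation and colour modulation aren't really available with glDrawPixels (pixel colours can only be scaled/biased via glPixelTransferf). And with GL_UNPACK_ALIGNMENT set to 1, the width/height shouldn't need the -1 hack, since ofPixels rows are tightly packed:

// untested sketch: positioning glDrawPixels under GL 1.1
ofPushMatrix();
ofTranslate(100, 100);                  // moves the raster position below

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // ofPixels rows are tightly packed
glRasterPos2f(0, 0);                    // transformed by the current matrices;
                                        // note: if it lands outside the viewport
                                        // the whole image is clipped
glPixelZoom(1, -1);                     // OF pixel rows are top-down, glDrawPixels
                                        // draws bottom-up, so flip vertically
glDrawPixels(TEST_IMAGE.getWidth(), TEST_IMAGE.getHeight(),
             GL_RGB, GL_UNSIGNED_BYTE, TEST_IMAGE.getPixels());
glPixelZoom(1, 1);
ofPopMatrix();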

I am a bit iffy about glDrawPixels… I am wondering why the texture is not rendered under 1.1.
I suspect some "wrong" enum in the glTexImage2D call?

Where and how is the texture allocated initially?

How can I specify GL_UNSIGNED_BYTE and GL_RGB…???
// I hardcoded them inside ofTexture::allocate's glTexImage2D call
but it doesn't make any difference… :eyeglasses: hmm…
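
(Another suspicion, purely an assumption on my part: GL 1.1 only accepts power-of-two texture dimensions, NPOT support only arrived with GL 2.0 / GL_ARB_texture_non_power_of_two, and 483x374 is neither, so under the software 1.1 renderer the glTexImage2D upload may simply fail regardless of the enums. A proxy-texture check should tell:)

// untested: ask the driver whether it accepts a 483x374 RGB texture at all;
// with a proxy texture the driver records 0 as the width if it would fail
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGB, 483, 374, 0,
             GL_RGB, GL_UNSIGNED_BYTE, NULL);
GLint gotWidth = 0;
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &gotWidth);
ofLogNotice() << "GL_VERSION: " << (const char *)glGetString(GL_VERSION);
ofLogNotice() << (gotWidth ? "483x374 RGB accepted" : "483x374 RGB rejected");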

Does anybody have any info on either a correct glDrawPixels implementation or another way?

Hi, I don't know much about glDrawPixels, apart from it being a super old and slow API.

With the OpenGL version defaulting to 1.1 it's going to be hard to do anything too complicated. Is there any particular reason you need to use that particular remote desktop software?

I’ve been looking at setting up a remote dev machine with a decent GPU, after seeing this: http://lg.io/2015/07/05/revised-and-much-faster-run-your-own-highend-cloud-gaming-service-on-ec2.html

One thing is that using the Microsoft Remote Desktop software I can't seem to use the Nvidia GRID card properly. I haven't tried it yet, but it seems this is not an issue using TeamViewer or something else.

What he is doing is using Steam's in-home streaming feature, which is a bit different, but he still needs to log in to the machine and keep it from falling back to the default software driver.


I have a Mac Pro running Windows :dark_sunglasses:
with an "uptime" account that runs the projection and an "admin" account that allows me to make changes and adjustments that directly affect the "uptime" account… the idea is to be able to visually see which images are currently used by the main projection (instead of just text info).

TeamViewer is better, but if it crashes you are locked out. RD is pre-installed by default on every Windows machine and cannot crash, so it's a lot more stable than TeamViewer & VNC, and less intrusive & more secure due to its limitations… plus it's free :blush: …and it's just a lot faster if you are on the same network and can hit that IP directly.

Also, it is integrated with Active Directory by default, so there's no need to worry about a lot of things.

…I am halfway there:
as you can see, the image almost draws perfectly;
all that is left is to apply correct positioning relative to the screen…

…without destroying everything else… :cry:

Hmm, I just encountered this while trying to debug OF programs running on Nvidia-equipped Windows from a MacBook running Microsoft Remote Desktop.

A simple 2D OF program that had been started in Windows was showing a blank window, but the logic was continuing to run.

Running my various working OF programs leads to exceptions or other stuckness in various places related to graphics & font initialization.


what windowing system do you use?
I use GLUT instead of GLFW; GLFW had a bug at the time.

I remember solving this and being able to draw images under GL 1.1,
but I never got around to posting it here or making a pull request (the code was hacky). Fonts, rounded rects, and blending stuff don't work;
UDP, TCP, and pretty much everything else non-GL works,
including ofDrawBitmapString.


By the way, I think Windows 10 can enable higher GL versions over RDP, but I never tested this.

I'm using GLFW, which either blows up (when trying to set up a window, it then tries to close a window at a null pointer and throws an exception) or gets stuck trying to log that it failed to allocate textures for a TrueType font because 64 x 64 is too big for the platform.

If you have a fix for that bug, it seems like it's probably still current.