Textures in real time

Hi there!
I have an annoying problem… I need to fill some blobs with real-time generated textures.
I’ve tried it, but I’m getting an indecent framerate (about 2–3 frames per second).
My idea was to generate the textures in a separate thread while the application processes other stuff, and then, on the next frame, apply them to the blobs, and so on.
Could that work?
I read somewhere that I can’t draw textures outside the main thread, due to OpenGL limitations. Is that true?


Yes, you should use OpenGL only from the main thread. Can you post some code? You are probably doing something that could be done faster.

Hi Arturo. I’ll try to explain better the procedure I use.
My main goal is to mix two colored blobs (A & B) to obtain a new one (C) with the color derived from A and B.
And here’s the problem. We would like to use juxtaposition to make the new color. So I thought: OK, I know the areas of blobs A and B, and I know the area of the resultant blob (C), so I can make a new texture, fill every pixel according to the color percentages of the first two blobs, and apply it to the resultant blob.
What I actually do is:
Create a texture with the same dimensions as the C blob.
Cycle through every pixel and assign each one a color according to a filling algorithm I’m developing.
Then I draw the texture on the screen…
Another problem is that I have to manage something like 100 blobs per frame… :frowning:

I noticed that I use just one core to do that (working at max, but just one core).

If you want I can post some code, but in the end it does this (and I know it’s a pretty brutal approach… :expressionless: ).

ty, ed.

Depending on how you are mixing colors, you can probably just use some blending; take a look at ofEnableBlendMode and the blending example.

Or you could use a shader to do the color mixing on the GPU, which would be way faster than doing it on the CPU.

Again, if you post some code it’ll be easier to help.

Uhm, OK, I’ve taken a look at the blending example, really cool. I want to use the subtractive mixing mode.
What I would like to obtain, mixing blue and yellow for example, is the mixed_colour.png attached to this post.
And the problem with blending is that it mixes the colors, but the resulting derived color is uniform.

Some code below:

int width  = blob.boundingRect.width;
int height = blob.boundingRect.height;
unsigned char * coloredTexTest = new unsigned char[width * height * 3];
ofTexture texture;
texture.allocate(width, height, GL_RGB);
for (int j = 0; j < height; j++){
    for (int i = 0; i < width; i++){
        int idx = (j * width + i) * 3;  // note: "width", not the typo "weight"
        if (ofInsidePoly((float)i, (float)j, blob.pts)) {
            ofColor fill_col = /* call some function that returns an ofColor */;
            coloredTexTest[idx + 0] = fill_col.r;
            coloredTexTest[idx + 1] = fill_col.g;
            coloredTexTest[idx + 2] = fill_col.b;
        } else {
            // fill with black (key color)
            coloredTexTest[idx + 0] = key_b.r;
            coloredTexTest[idx + 1] = key_b.g;
            coloredTexTest[idx + 2] = key_b.b;
        }
    }
}
texture.loadData(coloredTexTest, width, height, GL_RGB);
delete [] coloredTexTest;  // loadData copies the pixels, so the buffer can be freed

In the end, calling this for each blob, I get an array of textures, and I have to put them together and draw the result.

If you need more code or explanation please ask; I hope I was clear enough.


The fastest way (both to write and to execute) is probably a fragment shader; if for some reason you don’t like GLSL, you could use blending + a static mask texture.

Uhm, sounds interesting… Do you have some links to documentation about the functions used in the example?
Because I didn’t really get the point reading its code, and I didn’t find anything about it in the docs.
Thank you so much,

OK, I’ve spent the last few hours studying a little bit… What I understand is that by writing my own fragment shader I can choose the colour of every single pixel… But what if I want to colour different blobs with different colours?
Blobs are tracked and each one has different attributes. Can I pass parameters to the shader, such as a blob’s centroid or area?


You can pass parameters to the shader as uniforms using ofShader::setUniform*(). If you want to pass something standard like color or texture coordinates, you should use the built-in facilities for that: ofMesh has addColor() and addTexCoord().

Here’s a fragment shader you might try playing around with to get you started:

uniform sampler2DRect tex;
uniform vec2 center;
uniform float length;
void main(){
    // draw a circle
    float d = distance(gl_TexCoord[0].xy, center);
    if(d > length)
        gl_FragColor = vec4(1, 0, 0, 1);
    else
        gl_FragColor = vec4(0, 1, 0, 1);
    /* // draw a sphere instead
    vec2 relativePosition = gl_TexCoord[0].xy - center;
    if(distance(gl_TexCoord[0].xy, center) > length)
        discard;
    float z = sqrt((length*length) - (relativePosition.x*relativePosition.x) - (relativePosition.y*relativePosition.y));
    vec3 normal = normalize(vec3(z, relativePosition.y, relativePosition.x));
    gl_FragColor = vec4(normal, 1.0); */
}

You can easily have multiple centers (and draw multiple circles) by just adding more vec2 uniforms. Set the center and length with:

setUniform2f("center", someOfVec2f);  
setUniform1f("length", theLength);  

Have fun!

You can also use ofPath to obtain the intersection between two blobs; for that you need to specify the winding mode:

To obtain the intersection you need to use OF_POLY_WINDING_ABS_GEQ_TWO. Then tessellate that path into a mesh, and use a frag shader to generate the mixed pattern only in the ofMesh that represents the intersection.

ofMesh intersectionMesh;

void update(){
    ofPath intersection;
    intersection.setPolyWindingMode(OF_POLY_WINDING_ABS_GEQ_TWO);
    for(int i = 0; i < blob1.size(); i++) intersection.lineTo(blob1[i]);
    intersection.close();
    for(int i = 0; i < blob2.size(); i++) intersection.lineTo(blob2[i]);
    intersection.close();
    intersectionMesh = intersection.getTessellation();
}
void draw(){
    intersectionMesh.draw();  // bind your frag shader around this call
}

Thank you very much, guys. Now I have a lot of starting points and I’ll choose one of them to continue the project. I guess I have to study a lot in the coming days! :slight_smile:
I’ll keep you guys informed about further developments… :wink:

cheers, ed.