geometry shader kills performance

hey everyone,

I came up with a little geometry shader (my first one, weee! thanks for the example, memo) to cull the stretched-out polygons from 3D Kinect meshes.

I was feeling pretty proud of myself until I tried it on my desktop, where the app, which had been running at 40fps, crawled to about 2fps.

I switched the shader out for a plain pass-through geometry shader, and it only rose to a measly 8fps. Shouldn't a pass-through shader perform almost the same as no shader at all!?

The desktop has an ATI Radeon 4870 in it…

here is the geometry shader:

#version 120
#extension GL_EXT_geometry_shader4 : enable

float normalz(float zed){
	return (zed - 400.)/5000.;
}

void main() {
	float nz0 = normalz(gl_PositionIn[0].z);
	float nz1 = normalz(gl_PositionIn[1].z);
	float nz2 = normalz(gl_PositionIn[2].z);
	// pass the triangle through only if its three depths are close together,
	// i.e. cull polygons stretched across depth discontinuities
	if( (abs(nz0-nz1) < .02) &&
		(abs(nz1-nz2) < .02) &&
		(abs(nz0-nz2) < .02) ) {
		for(int i = 0; i < gl_VerticesIn; ++i) {
			gl_Position = gl_PositionIn[i];
			gl_FrontColor = gl_FrontColorIn[i];
			gl_TexCoord[0] = gl_TexCoordIn[i][0];
			EmitVertex();
		}
		EndPrimitive();
	}
}

I’m drawing the VBO mesh with mesh.drawFaces(). There are 640*480 verts and normal indices, and the geometry shader is set up for triangles in -> triangle strips out.
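Since mesh.drawFaces() suggests openFrameworks, here's a minimal sketch of the shader setup I'd expect (file names are assumptions); note that ofShader needs the geometry in/out types and the max emitted vertex count set before load():

```cpp
// Hedged openFrameworks fragment (depends on the OF framework; file names
// are hypothetical). The geometry input/output types and maximum output
// vertex count must be configured BEFORE load() is called.
ofShader shader;
shader.setGeometryInputType(GL_TRIANGLES);        // triangles in
shader.setGeometryOutputType(GL_TRIANGLE_STRIP);  // triangle strips out
shader.setGeometryOutputCount(3);                 // at most 3 verts emitted per triangle
shader.load("shader.vert", "shader.frag", "shader.geom");

// draw pass:
shader.begin();
mesh.drawFaces();
shader.end();
```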

I hope it’s not just that the card can’t handle geometry shaders…

I’m pretty sure it’s like memo and arturo said on twitter – ATI drivers only support geometry shaders in software :frowning:

Yup, ATI drivers on Mac OS X do geometry shaders in software (even though the hardware supports them). NVIDIA should be fine (hardware), and any NVIDIA or ATI card on Windows should also be fine (hardware). And apparently ATI on 10.7 is finally hardware accelerated too – five years after the introduction of geometry shaders, OS X ATI drivers finally run them on the GPU!