Problem with occlusion - binding images with alpha to 3d objects

I’m loading a png with transparency and binding that to a 3d box.

When viewed from certain angles, the interior of the box is not seen through the transparent part of the texture - instead, the background is drawn.
This changes depending on the camera angle. I’m wondering if this has to do with depth testing and if there’s a workaround?

screenshots:

angle 1, interior occluded:

angle 2, bottom invisible:

my code:

ofEasyCam cam;
ofImage png;

void ofApp::setup(){
	ofBackground(50);

	png.load("img.png");

	// use normalized (0..1) texture coordinates so the png maps onto ofDrawBox
	ofEnableNormalizedTexCoords();
	ofEnableDepthTest();
}

void ofApp::draw(){
	cam.begin();

	png.bind();
	ofDrawBox(100);
	png.unbind();

	cam.end();
}

There’s no easy workaround for this. The problem is that with depth testing enabled, the graphics card discards any fragment that is behind something that has already been drawn. So if the front faces are drawn first, the back faces won’t be drawn because they are behind them; if the back faces are drawn first, the front faces still get drawn on top.

One possible solution is to change the order in which things are drawn, but in your case, with only one object, that gets really complicated because you would have to reorder the faces of the box itself.
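For reference, one way to control that ordering for a single convex shape like the box is to draw it twice and let face culling pick which side gets drawn: far faces first, then near faces. This is just a rough sketch (not from the original code), using raw OpenGL culling calls and assuming the box faces are wound the usual way:

void ofApp::draw(){
	cam.begin();

	png.bind();

	glEnable(GL_CULL_FACE);

	glCullFace(GL_FRONT); // cull front faces: only the far side of the box is drawn
	ofDrawBox(100);

	glCullFace(GL_BACK);  // cull back faces: the near side is drawn on top and can blend
	ofDrawBox(100);

	glDisable(GL_CULL_FACE);

	png.unbind();

	cam.end();
}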

The most typical solution is to use some kind of deferred rendering, but the implementation is pretty complex.

For your case, since everything is either completely transparent or completely opaque, you could use a fragment shader and write the depth manually to the depth buffer: if the alpha is < 1, write the maximum depth. To write the depth in the fragment shader you use gl_FragDepth.

The default would be:

gl_FragDepth = gl_FragCoord.z;

So you could do something like:

if(v_color.a < 1.0){
    gl_FragDepth = 1.0; // gl_FragDepth is clamped to [0,1], so 1.0 is the far plane
}else{
    gl_FragDepth = gl_FragCoord.z;
}

where v_color is the color passed from the vertex shader. In your case the transparency comes from the PNG, so you would test the alpha of the texture sample instead.
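A minimal sketch of how that could look in openFrameworks, assuming the GL2 (fixed-function) renderer: the fragment shader is embedded as a string, samples the bound texture, and pushes transparent texels to the far plane. The names depthShader and fragSrc are just illustrative, and the cam/png members from your code are reused:

ofShader depthShader; // alongside cam and png

void ofApp::setup(){
	ofBackground(50);

	png.load("img.png");

	ofEnableNormalizedTexCoords();
	ofEnableDepthTest();

	// fragment-only shader: transparent texels get maximum depth so they
	// never occlude the faces behind them
	string fragSrc = R"glsl(
		#version 120
		uniform sampler2D tex0;
		void main(){
			vec4 c = texture2D(tex0, gl_TexCoord[0].xy) * gl_Color;
			if(c.a < 1.0){
				gl_FragDepth = 1.0;            // far plane (depth is clamped to [0,1])
			}else{
				gl_FragDepth = gl_FragCoord.z; // the default depth
			}
			gl_FragColor = c;
		}
	)glsl";
	depthShader.setupShaderFromSource(GL_FRAGMENT_SHADER, fragSrc);
	depthShader.linkProgram();
}

void ofApp::draw(){
	cam.begin();

	depthShader.begin();
	png.bind();
	ofDrawBox(100);
	png.unbind();
	depthShader.end();

	cam.end();
}

Note that this relies on the fixed-function built-ins (gl_TexCoord, gl_Color, gl_FragColor); if you are on the programmable renderer (GL3+) you would need a matching vertex shader as well.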