Outputting 3D content compatible with 3D hardware

Hi, not quite sure how to put this. I'd like to create a 3D GUI that can work with a 3D projector or 3D television. My plan is then to use the Xbox Kinect so the user can control the 3D GUI with hand gestures, giving the impression they're actually touching things in the real world.

Can this be achieved in openFrameworks?

I've never paired a 3D TV with a computer, so I don't know about their drivers etc., but what you're trying to do is totally feasible; it should not be very different from what I did here: http://geekjutsu.wordpress.com/2010/12/02/dar-deep-augmented-reality/

If you use Nvidia 3D Vision, the driver will intercept the OpenGL pipeline and take care of the stereo 3D for you.

Has anyone actually got this working?

You might need 3D Vision Pro or a Quadro graphics card (or both). I think the basic 3D Vision package only supports DirectX, not OpenGL.

http://forums.nvidia.com/index.php?showtopic=198902

I tried to run a few OpenGL examples in Processing/oF using 3D Vision on Windows. The graphics card (GeForce FX 470) would always lock up when running fullscreen on a 3D TV. I'm going to look around a bit more on the NVIDIA forums.

I might wind up using Unity3D instead of oF. I think Unity can output stereo 3D via both OpenGL and DirectX.

Sweet, good to know it's happening. naus3a, your deep augmented reality is pretty much what I had in mind; I think the only difference would be that the user wears 3D glasses and interacts directly with what they see. Distance from the screen may be an issue, though: I'm not sure how far away you need to be from the Kinect for it to track all three dimensions, or how close you need to be to the screen for the 3D effect to work for your eyes. I'll probably keep it all in the design stage as a possible future implementation.