Hi, not quite sure how to put this. I'd like to create a 3D GUI that can work with a 3D projector or 3D television. My plan is then to use the Xbox Kinect so the user can control the 3D GUI with hand gestures, giving the impression they're actually touching things in the real world.
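To make the idea concrete, here's a minimal sketch of the core step: taking a hand sample from the Kinect (depth-image pixel coordinates plus depth in millimetres) and mapping it into a normalized 3D GUI volume, then hit-testing against a GUI element. The names, the depth range, and the coordinate convention are all assumptions for illustration, not a real Kinect API; in oF you'd get the hand position from something like ofxKinect or a hand-tracking addon.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical mapping from a Kinect hand sample into a normalized
// [0,1]^3 GUI volume. 640x480 is the Kinect v1 depth-image size;
// 800-4000 mm is roughly its usable depth range (assumed here).
Vec3 handToGui(int px, int py, int depthMm) {
    const float imgW = 640.0f, imgH = 480.0f;
    const float nearMm = 800.0f, farMm = 4000.0f;
    Vec3 v;
    v.x = px / imgW;
    v.y = 1.0f - py / imgH;                       // flip so up is +y
    v.z = (depthMm - nearMm) / (farMm - nearMm);  // 0 = closest, 1 = farthest
    if (v.z < 0.0f) v.z = 0.0f;
    if (v.z > 1.0f) v.z = 1.0f;
    return v;
}

// Simple hit test: is the hand within `radius` of a GUI element's centre?
// This is what would give the "touching things" feeling once the GUI
// element is also rendered in stereo at the matching depth.
bool touches(const Vec3& hand, const Vec3& target, float radius) {
    float dx = hand.x - target.x;
    float dy = hand.y - target.y;
    float dz = hand.z - target.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) < radius;
}
```

The hard part this glosses over is calibration: the GUI volume has to line up with where the stereo rendering makes objects *appear* to be, which depends on the viewer's distance from the screen.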
I tried to run a few OpenGL examples in Processing/oF using 3D Vision on Windows. The graphics card (GeForce GTX 470) would always lock up when running fullscreen on a 3D TV. I'm going to look around a bit more on the NVIDIA forums.
I might wind up using Unity3D instead of oF. I think Unity can output both OpenGL and DirectX for 3D.
Sweet, good to know it's happening. naus3a, your deep augmented reality is pretty much what I had in mind; I think the only difference would be that the user wears 3D glasses and interacts directly with what they see. Distance from the screen may be an issue, though. I'm not sure how far away you need to be from the Kinect for it to track all three dimensions, and how far you need to be from the screen for the 3D effect to work for your eyes. I'll probably keep it all in the design stage as a possible future implementation.
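The distance question above can at least be sanity-checked on paper. A rough sketch: Kinect v1 skeleton tracking works at roughly 1.2–3.5 m from the sensor, and comfortable stereo viewing is often quoted as a multiple of the screen diagonal. The viewing-distance multipliers below are loose assumptions for illustration, not specs from any standard, so treat this as a back-of-the-envelope check only.

```cpp
#include <cassert>

// Rough feasibility check for the setup: the user must stand inside the
// Kinect's skeleton-tracking range (about 1.2-3.5 m for Kinect v1) AND at
// a plausible stereo viewing distance from the screen. The 1.5x-4x
// diagonal window for viewing distance is an assumption, not a spec.
bool setupWorks(float userToSensorM, float userToScreenM, float screenDiagonalM) {
    const float kinectNear = 1.2f, kinectFar = 3.5f;
    const float viewNear = 1.5f * screenDiagonalM;
    const float viewFar  = 4.0f * screenDiagonalM;
    bool kinectOk = userToSensorM >= kinectNear && userToSensorM <= kinectFar;
    bool stereoOk = userToScreenM >= viewNear && userToScreenM <= viewFar;
    return kinectOk && stereoOk;
}
```

With the Kinect sitting on top of the TV (so both distances are roughly equal), a ~1.1 m diagonal (about 46") screen and a user about 2 m away would pass both checks; standing at arm's length from the screen would fail the Kinect range, which is exactly the tension for a "touching" interface.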