I’m developing an application using openFrameworks in which the user wears virtual reality glasses and sees an augmented reality scene (virtual objects mixed with reality).
These glasses have a gyroscope that reports the user’s head orientation as yaw, pitch and roll angles. The idea is to use those angles to rotate the camera accordingly, which is necessary to keep all the virtual objects correctly registered.
My problem is that the camera doesn’t rotate in its local frame of reference; instead, the world seems to rotate around a global frame, so the application doesn’t correctly emulate the user’s head movement. Only when the two frames happen to coincide (when the user’s head is looking straight ahead) does the camera rotate correctly.
I know this is quite hard to understand, but I’m hoping to get at least some guidance on how to tackle this issue.
How are you rotating the camera? The yaw, pitch and roll methods should work like you expect, but they are relative to the current rotation, so you need to reset the camera’s transformations before applying new values.
Also, if this is for the Oculus Rift, there’s already an addon that applies the correct transformations: https://github.com/obviousjim/ofxOculusRift
I have an ofEasyCam object and call _cam.setOrientation(ofVec3f(_pitch, _yaw, _roll)) with the rotation data I receive from the head tracker.
Unfortunately the glasses aren’t the Oculus Rift, but I guess they work the same way.
I would use an ofCamera instead, since ofEasyCam is meant to be controlled with the mouse/keyboard.
That should work. Also be aware that all angles in OF are expressed in degrees, so if you are getting radians from the gyroscope you’ll need to convert them.
Also try calling resetTransform() before setting the orientation.
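Putting both suggestions together, a minimal update() sketch (untested, and assuming an ofCamera member _cam plus tracker angles _pitch/_yaw/_roll already converted to degrees):

```cpp
// Sketch only: assumes ofCamera _cam and head-tracker angles in degrees.
void ofApp::update(){
    _cam.resetTransform(); // clear the previous frame's orientation
    // setOrientation() takes Euler angles in degrees (pitch, yaw, roll):
    _cam.setOrientation(ofVec3f(_pitch, _yaw, _roll));
}
```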