Measuring distance to the camera plane given position and orientation in ofxFaceTracker

Hi @kylemcdonald,
Since you are the creator of ofxFaceTracker I'm tagging you directly, but anyone else is of course free to respond!

I am playing around with ofxFaceTracker and wonder how (and whether) I can use the position and orientation variables to determine where a person is looking, i.e. where the normal vector of the face plane intersects the camera plane. For simplicity I assume the camera and the projection surface have identical normal vectors (i.e. lie in the same plane).

I believe the orientation vector (as sent over OSC in the FaceOSC example) gives me the direction the face is looking in, and adding it to the position vector accounts for the head shifting left/right (front/back should also be handled, by similar triangles). The intersection with the camera plane should then be straightforward to compute.
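For reference, here is a minimal sketch of that intersection step, assuming a camera-centered frame where +z points out of the camera (so the camera/projection plane is z = 0), and assuming the FaceOSC orientation values are Euler angles in radians applied in x, y, z order; all of those conventions are guesses that would need to be checked against real data:

```cpp
#include "ofMain.h"
#include <cmath>

// Returns true and fills `hit` with the point where the face's forward ray
// crosses the z = 0 plane; returns false if the ray is parallel to the plane
// or pointing away from it.
bool gazeOnCameraPlane(const ofVec3f& headPos, const ofVec3f& eulerRadians, ofVec2f& hit) {
    // Assumed convention: the face looks toward the camera along -z when the
    // orientation is zero, and the Euler angles rotate that forward vector.
    ofQuaternion q;
    q.makeRotate(ofRadToDeg(eulerRadians.x), ofVec3f(1, 0, 0),
                 ofRadToDeg(eulerRadians.y), ofVec3f(0, 1, 0),
                 ofRadToDeg(eulerRadians.z), ofVec3f(0, 0, 1));
    ofVec3f dir = q * ofVec3f(0, 0, -1);

    if (std::abs(dir.z) < 1e-6f) return false;  // ray parallel to the plane
    float t = -headPos.z / dir.z;               // solve headPos.z + t * dir.z = 0
    if (t < 0) return false;                    // plane is behind the face

    ofVec3f p = headPos + dir * t;              // intersection point on z = 0
    hit.set(p.x, p.y);
    return true;
}
```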

What puzzles me is how to find the camera plane. That is, if the origin is the camera at (x, y, z) = (0, 0, 0), how are the unit vectors set up? In other words, which way does the positive z axis point relative to the camera itself: does it point outward, in front of the camera? Can I just assume that and pick three points on the XY plane, say P1(-1, -1, 0), P2(1, 1, 0) and P0(0, 0, 0)?

As you can imagine, vectors are a bit clouded in my mind.
If you know of any good tutorial on vectors, camera planes and so on, please forward it.

thank you!

hey @synthnassizer, the direction you want to look in is the example-calibrated code that ships with ofxFaceTracker.

this shows you how to make a guess about the 3d position of the head based on a calibrated camera. it comes with a generic webcam profile that will get you pretty far. from there you can start making guesses about where in 3d someone is pointing.
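if you want to poke at the raw pose values before digging into that example, something like the sketch below is roughly what ofxFaceTracker gives you directly. note the scale-to-depth constant here is a made-up placeholder, not a calibrated value; turning scale into real depth is exactly what the calibrated example does more rigorously:

```cpp
#include "ofMain.h"
#include "ofxCv.h"
#include "ofxFaceTracker.h"

class ofApp : public ofBaseApp {
public:
    ofVideoGrabber cam;
    ofxFaceTracker tracker;

    void setup() {
        cam.setup(640, 480);
        tracker.setup();
    }

    void update() {
        cam.update();
        if (cam.isFrameNew()) {
            tracker.update(ofxCv::toCv(cam));
        }
    }

    void draw() {
        cam.draw(0, 0);
        if (tracker.getFound()) {
            ofVec2f pos = tracker.getPosition();       // face position in image pixels
            ofVec3f rot = tracker.getOrientation();    // head orientation as Euler angles
            float depth = 1000.0f / tracker.getScale(); // hypothetical scale-to-depth guess

            ofDrawCircle(pos.x, pos.y, 4);
            ofDrawBitmapString("orientation: " + ofToString(rot), 10, 20);
            ofDrawBitmapString("rough depth: " + ofToString(depth), 10, 40);
        }
    }
};
```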