3hreeSpace, a method for 3D tracking using two cameras/webcams

For a project called “Klangkubus” (“sound cube”), two cameras (Logitech C270) were used for 3D tracking. One camera captures the tracking area from the front, the other from above. After analyzing both images for changed RGB values, the intersection of the front view and the top view defines areas of movement in the tracking area. Since this tracking method might be useful in other cases as well, the functions were packed into an addon that takes two camera streams and returns 27 values for 27 areas of movement (= triggers) in space.
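To illustrate the intersection step, here is a minimal framework-free sketch, assuming a 3×3×3 trigger grid and per-cell boolean motion flags (the names `Grid2D` and `intersect` are illustrative, not the addon’s actual API): the front view gives motion flags over (x, y), the top view over (x, z), and a volume cell fires when both views agree on its column.

```cpp
#include <array>
#include <cstddef>

// Front view: 3x3 motion flags indexed [x][y].
// Top view:   3x3 motion flags indexed [x][z].
using Grid2D = std::array<std::array<bool, 3>, 3>;

// A cell (x, y, z) of the 3x3x3 tracking volume is triggered when
// both camera views report motion in the matching row/column.
std::array<bool, 27> intersect(const Grid2D& front, const Grid2D& top) {
    std::array<bool, 27> triggers{}; // all false by default
    for (std::size_t x = 0; x < 3; ++x)
        for (std::size_t y = 0; y < 3; ++y)
            for (std::size_t z = 0; z < 3; ++z)
                triggers[x * 9 + y * 3 + z] = front[x][y] && top[x][z];
    return triggers;
}
```

A limitation worth noting: a single blob of movement per view is unambiguous, but two simultaneous blobs can trigger “ghost” cells at the crossed intersections, which is inherent to any two-view intersection approach.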
The project mentioned at the beginning – the “Klangkubus”, an interactive empty space that acts like an invisible sample player – serves as a usage example, and two videos – a sound performance and a dance performance – show the tracking in action; the addon plus example can be found on GitHub.
The addon is a modified and commented excerpt of the code used for the project, so only 27 tracking values are currently possible; maybe there is a better way to make the number of returned values configurable (a note on this is on GitHub under “Known issues”)?
All the best