3D finger tracking experiment
Last week I set up a Three Gear tracking system to test out its abilities. The system does fairly precise tracking of individual fingers and also detects some gestures. It seems similar to what the Leap controller advertises, but it is available now (free to use until the end of November) and uses two Kinects instead of custom hardware. I used the pointing gesture data to calculate the screen position pointed at by each index finger. Calibrating the system to the surface of a monitor allows for some interesting 3D touch screen effects.
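The finger-to-screen mapping boils down to a ray/plane intersection: once you know where the screen sits in tracker space, you cast a ray from the fingertip along its pointing direction and see where it hits the screen plane. Here's a minimal sketch of that math; the function and parameter names are my own for illustration, not part of the 3Gear API, and it assumes you've already calibrated the screen's corner position and edge vectors in tracker coordinates.

```python
import numpy as np

def screen_point(finger_pos, finger_dir, screen_origin, screen_x, screen_y):
    """Intersect a pointing ray with a calibrated screen plane.

    finger_pos, finger_dir: fingertip position and pointing direction
    in tracker space (hypothetical inputs from the tracker's hand data).
    screen_origin: one corner of the screen in tracker space.
    screen_x, screen_y: non-normalized edge vectors of the screen, so
    the returned (u, v) lies in [0, 1] x [0, 1] when the ray hits it.
    """
    normal = np.cross(screen_x, screen_y)
    denom = np.dot(finger_dir, normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the screen plane
    t = np.dot(screen_origin - finger_pos, normal) / denom
    if t < 0:
        return None  # screen is behind the finger
    hit = finger_pos + t * finger_dir
    rel = hit - screen_origin
    # Project the hit point onto the screen's edge vectors
    u = np.dot(rel, screen_x) / np.dot(screen_x, screen_x)
    v = np.dot(rel, screen_y) / np.dot(screen_y, screen_y)
    return u, v
```

Multiplying (u, v) by the monitor's pixel resolution gives the pointed-at pixel, so each tracked index finger can drive its own cursor.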
I think this system could be useful for creating table-scale interactive experiences. The caveat is that tracking works best when each user goes through a hand scanning/calibration process. Tracking still seems decent when using another person's hand data, and the developers say they are working on a generic implementation that doesn't require individualized calibration. Here's a video of some 3D lines and particles driven by the system. The lines are created dynamically in a shader using a technique discussed in this post.