
In this video I explore beyond the typical uses of the Live Link Face feature and demonstrate how it can be used to control movement from a first-person perspective using eye tracking. Just look and go! Apart from pressing Z to start the forward movement, I don't touch any other input until I get close to the kitchen, near the end of the video. Watch to the end to see me mess up after turning on the tap in the kitchen, and to see how the tracking tries to keep up when I look away completely.
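
To give a rough idea of the control scheme described above, here is a minimal Unreal C++ sketch of a gaze-steered pawn: the gaze yaw turns the character, and a single key toggle (bound to Z) keeps it walking forward. This is not the setup from the video; the class name, GetEyeYaw(), and the "MoveToggle" action mapping are illustrative placeholders, and in practice the gaze value would be read from the Live Link Face subject's eye curves rather than stubbed out.

```cpp
// GazeSteeredCharacter.h -- illustrative sketch, not the implementation used in the video.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "GazeSteeredCharacter.generated.h"

UCLASS()
class AGazeSteeredCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    AGazeSteeredCharacter() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override;
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override;

protected:
    // Placeholder: in the video this signal would come from the Live Link Face
    // eye-tracking data (roughly left/right gaze in the range -1..1). Stubbed
    // here so the steering logic stays self-contained.
    float GetEyeYaw() const { return 0.f; }

    // Bound to Z: toggles continuous forward movement on and off.
    void ToggleForward() { bMoveForward = !bMoveForward; }

    // How fast a full sideways gaze turns the character.
    UPROPERTY(EditAnywhere, Category = "Gaze")
    float TurnRateDegPerSec = 45.f;

    bool bMoveForward = false;
};
```

```cpp
// GazeSteeredCharacter.cpp
#include "GazeSteeredCharacter.h"

void AGazeSteeredCharacter::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    // "MoveToggle" is an assumed action mapping bound to the Z key in project settings.
    PlayerInputComponent->BindAction("MoveToggle", IE_Pressed, this, &AGazeSteeredCharacter::ToggleForward);
}

void AGazeSteeredCharacter::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Steer with gaze: looking right yields a positive value and turns the character right.
    AddControllerYawInput(GetEyeYaw() * TurnRateDegPerSec * DeltaSeconds);

    // While the Z toggle is on, keep walking in whatever direction we are facing.
    if (bMoveForward)
    {
        AddMovementInput(GetActorForwardVector(), 1.f);
    }
}
```

The point of the sketch is simply that once the gaze signal is mapped to yaw input, "look and go" needs almost no other input handling, which is why only the single Z press appears in the video.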
