Face tracking for additional modalities in spatial interaction

Abstract

A user device receives an image stream from the user-facing side of the device and an image stream from the target-facing side. The device acquires a coordinate system for the user, acquires its own coordinate system, and relates both to a global coordinate system. The device then determines whether the user and/or the device itself has moved. Movement of the user and/or the device serves as an input modality controlling the user's interactions in the augmented reality environment.
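The abstract does not disclose an implementation, but the core idea can be sketched: represent the user pose and the device pose as homogeneous transforms, express both in a shared global frame, and flag movement when a pose changes beyond a threshold between frames. The function names, the yaw-only rotation, and the thresholds below are illustrative assumptions, not the patented method.

```python
import numpy as np

def make_pose(yaw_deg, translation):
    # Illustrative: a 4x4 homogeneous transform from a yaw angle
    # (degrees, about the z-axis) and a 3D translation.
    t = np.radians(yaw_deg)
    c, s = np.cos(t), np.sin(t)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]])
    T[:3, 3] = translation
    return T

def to_global(local_pose, frame_to_global):
    # Re-express a pose given in a local frame (e.g. the device's
    # user-facing camera frame) in the shared global frame.
    return frame_to_global @ local_pose

def has_moved(prev_pose, curr_pose, trans_eps=0.01, rot_eps_deg=1.0):
    # Flag movement when translation (metres) or rotation (degrees)
    # between two poses exceeds assumed thresholds.
    dt = np.linalg.norm(curr_pose[:3, 3] - prev_pose[:3, 3])
    R = prev_pose[:3, :3].T @ curr_pose[:3, :3]
    cos_a = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    da = np.degrees(np.arccos(cos_a))
    return dt > trans_eps or da > rot_eps_deg

# Usage: track the user's face pose in the device frame, map it to the
# global frame, and compare successive frames to detect user movement.
device_to_global = make_pose(0.0, [0.0, 1.0, 0.0])
face_prev = to_global(make_pose(0.0, [1.0, 0.0, 0.0]), device_to_global)
face_curr = to_global(make_pose(0.0, [1.5, 0.0, 0.0]), device_to_global)
print(has_moved(face_prev, face_curr))  # the 0.5 m shift exceeds trans_eps
```

Keeping user-pose and device-pose in a common global frame is what lets the two movement sources act as independent input modalities: either one can change while the other stays fixed, and each change is detectable on its own.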

Publication
United States Patent and Trademark Office
Hartmut Seichter
Professor of Computer Graphics