Head Tracking for the Control of Virtual Viewpoint Direction
Roger A. Browse, James C. Rodger, and Irfon-Kim Ahmad
Abstract
Identifying appropriate roles for the components of advanced interfaces is a significant research
challenge. We expect head movements to assume their natural role in controlling viewpoint, and
we are investigating the use of head tracking to provide perspective control. We apply this to a
task of adjusting the viewpoint to detect a target at the bottom of a virtual cylinder. The cylinder
varies in diameter, height, and orientation. We record viewpoint trajectories and elapsed times.
Observers participate in one of two conditions: in the Head-as-Head condition, viewpoint
changes correspond to observing a real scene; in the Head-as-Hand condition, rotational
directions are reversed, simulating manipulation of an object. To evaluate initial learning and
consolidation effects, there are two sessions of massed trials, two days apart. The results show a
rapid learning effect and solid retention over the two-day interval. Performance levels are similar
for the two opposite mappings, indicating flexibility in the use of head-controlled viewpoint. We
continue to apply the paradigm to other questions about head-controlled viewpoint manipulation,
such as establishing a boundary between small movements that produce natural parallax changes
and extended movements that involve large-scale viewpoint changes.
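The two mappings described in the abstract differ only in the sign applied to head rotation before it drives the virtual viewpoint. A minimal sketch of that sign flip, reduced to a single yaw axis (the function name, condition labels as strings, and the scalar simplification are illustrative assumptions, not the authors' implementation):

```python
def viewpoint_yaw_delta(head_yaw_delta: float, condition: str) -> float:
    """Map a head yaw rotation (radians) to a virtual-viewpoint yaw rotation.

    Head-as-Head: the viewpoint rotates with the head, as when observing
    a real scene. Head-as-Hand: the rotational direction is reversed, as
    if the head movement were rotating the viewed object instead.
    """
    if condition == "head-as-head":
        return head_yaw_delta
    if condition == "head-as-hand":
        return -head_yaw_delta
    raise ValueError(f"unknown condition: {condition}")
```

In a full implementation the same sign convention would apply to each rotational degree of freedom of the tracked head pose.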