In order to be effective in the field, the military trains warfighters to operate its many ground vehicles. The goals of
training are for the warfighter to learn vehicle and weapon operations and dynamics (e.g., how the vehicle and gun
turret work and “feel”) in live tactical situations. Additionally, because many vehicles require multiple operators
(e.g., a gunner and driver), team coordination is an important element of the tactical training.
The military employs both live and virtual reality training to achieve these goals. Live training, especially gunnery,
requires significant facilities and range infrastructure and is limited to specific sites due to safety restrictions.
Such training events generally require travel and transportation to combat training centers (CTCs) and ranges, which
makes live training expensive. In this paper, an augmented reality-based vehicle training system is presented. The trainees are able to
drive on physical terrain and engage virtual entities for tactical and gunnery training. By augmenting the real world
with virtual entities and effects, along with existing training aids and devices, the system enables training
anywhere, anytime.
The details of the vehicle-borne augmented reality system for augmenting both the driver’s periscope and the
gunner’s remote weapon sight are presented. The system relies on inertial measurements, cameras, and GPS to
provide jitter-free, robust, real-time 6-DOF (degrees of freedom) pose estimation. These poses are used to render
synthetic targets (e.g., dismounts, technicals, and targets) for the driver and gunner. An iPad-style instructor interface
controls the augmented engagement and provides student scores.
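The paper does not specify the fusion algorithm used to combine the inertial, camera, and GPS inputs; the following one-dimensional complementary-filter sketch only illustrates the general principle behind such systems: a high-rate, drift-prone dead-reckoned signal (IMU) is corrected by a low-rate, absolute one (GPS). The function name, blend factor, and data layout are hypothetical.

```python
def fuse_position(imu_positions, gps_fixes, alpha=0.9):
    """Blend dead-reckoned positions with absolute GPS fixes (1-D sketch).

    imu_positions: high-rate positions integrated from the IMU (accumulates drift)
    gps_fixes:     absolute positions of the same length (None where no fix)
    alpha:         trust placed in the IMU estimate between GPS corrections
    """
    fused = []
    est = imu_positions[0]
    for i in range(1, len(imu_positions)):
        # Propagate with the IMU delta (high rate, low latency, drifts).
        est += imu_positions[i] - imu_positions[i - 1]
        # Pull the estimate toward GPS when a fix arrives (low rate, absolute).
        if gps_fixes[i] is not None:
            est = alpha * est + (1 - alpha) * gps_fixes[i]
        fused.append(est)
    return fused
```

In the full system, the same structure generalizes to 6-DOF: the inertial sensors propagate pose at high rate for low-latency rendering, while camera and GPS measurements bound the accumulated drift.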
The system is evaluated on an Army Stryker vehicle operating on a real-world range. The consistency and quality of target
insertions between the driver’s three augmented periscopes and the gunner’s augmented weapon sights are
compared. The importance of each sensor is evaluated by removing its input and comparing the resulting pose estimates
against those of the full system.
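The sensor evaluation described above is a leave-one-out ablation. A minimal sketch of that protocol is shown below; the `estimate_pose_error` callable and the sensor names are placeholders, not the paper's actual interfaces.

```python
def ablation_errors(estimate_pose_error, sensors):
    """Return, per sensor, the error increase when that sensor is excluded.

    estimate_pose_error: callable mapping a set of active sensors to a
                         scalar pose-error metric (placeholder interface)
    sensors:             list of sensor names, e.g. ["imu", "camera", "gps"]
    """
    baseline = estimate_pose_error(set(sensors))
    return {
        s: estimate_pose_error(set(sensors) - {s}) - baseline
        for s in sensors
    }
```

A large error increase for a given sensor indicates that its input contributes substantially to the fused pose estimate.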