Observers of all types are major force multipliers on the modern battlefield. However, with today's constrained budgets, the range time and training resources needed to support observer training, including munitions, supporting artillery and mortar units, and aircraft sorties, are increasingly scarce. Augmented reality is an innovative technology that can supplement live training to address the challenge of affordably training Forward Observer, Joint Forward Observer, Joint Tactical Air Control, and similar skills. We present the system design, hardware, software, algorithms, and initial field results for a prototype augmented reality training system.
As part of this research, augmented reality devices for the unaided eye, binoculars, laser rangefinders, and designators were developed. The augmented reality devices interface with real tactical equipment, including the Defense Advanced GPS Receiver (DAGR) and the StrikeLink digital close air support (CAS) system. The augmented reality system enables long-range, high-precision live augmentation of both unaided-eye and magnified imagery with aerial and terrain-based synthetic objects, vehicles, people, and effects. The inserted objects remain stable in the augmented reality device as the user pans through the battlespace. We present the navigation algorithms that use cameras in combination with an IMU and GPS to provide jitter-free, robust, real-time 6-DOF pose estimation for precise augmentation.
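To illustrate the kind of multi-sensor fusion such a navigation module performs, the sketch below blends dead-reckoned IMU estimates with absolute GPS and camera-derived fixes using simple complementary filters. This is a minimal illustrative sketch, not the paper's algorithm: the function names, the blending weights `alpha` and `beta`, and the per-axis linear blend are all assumptions, and a fielded system would use a full 6-DOF estimator (e.g., an extended Kalman filter) rather than scalar blends.

```python
import math

def fuse_position(gps_pos, imu_pos, alpha=0.98):
    """Complementary blend of position estimates (illustrative only).

    The IMU dead-reckoned position is trusted short-term (weight
    alpha, an assumed tuning value); the GPS fix corrects long-term
    drift with weight 1 - alpha. Inputs are (x, y, z) tuples.
    """
    return tuple(alpha * i + (1.0 - alpha) * g
                 for i, g in zip(imu_pos, gps_pos))

def fuse_heading(imu_heading, cam_heading, beta=0.9):
    """Blend gyro-integrated heading with a camera-derived heading.

    The heading error is wrapped into (-pi, pi] via atan2 so the
    correction never jumps the wrong way across the +/-pi boundary,
    which is what keeps augmented objects from jittering as the
    user pans. Angles are in radians; beta is an assumed IMU weight.
    """
    err = math.atan2(math.sin(cam_heading - imu_heading),
                     math.cos(cam_heading - imu_heading))
    return imu_heading + (1.0 - beta) * err
```

For example, `fuse_heading(0.0, 1.0, beta=0.9)` nudges the IMU heading 10% of the way toward the camera fix, yielding 0.1 rad; in practice the camera correction runs at frame rate while GPS corrections arrive at a few hertz.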
We also present details of the rendering and simulation modules. We have developed a Unity-based augmented reality rendering system that is part of an HLA federation that includes the USMC Deployable Virtual Training Environment, JSAF, and an Instructor Tablet. We present the initial results of the system operating on a USMC range, in which augmented fixed- and rotary-wing aircraft, ground vehicles, and weapon effects are combined with real-world scenes.