The Battlefield Augmented Reality System (BARS) is a mobile augmented reality system that displays head-up battlefield intelligence information to a dismounted warrior. BARS consists of a wearable computer, a wireless network, and a tracked see-through Head Mounted Display (HMD). The computer generates graphics that, from the user's perspective, appear to exist in the surrounding environment. For example, a building could be augmented to show its name, a plan of its interior, icons to represent reported hazard locations, and the names of adjacent streets.
The full power of mobile augmented reality systems is realized when these systems are connected to one another, to immersive virtual environments, and to remote information servers. These connections are made through wireless devices that cannot guarantee connectivity and may have severely constrained bandwidth. To address these constraints, we present a robust event-based data distribution mechanism for mobile augmented reality and virtual environments. It is based on replicated databases, pluggable networking protocols, and communication channels.
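The core idea can be illustrated with a minimal sketch: each node holds a local replica of a shared object database, and local changes are propagated as small events over a named channel rather than as whole-database synchronizations. All class and method names below are illustrative assumptions, not the BARS implementation; in BARS, delivery would run over an unreliable wireless link via a pluggable protocol, whereas here it is a simple in-process loop.

```python
class Channel:
    """A named topic over which object-update events are distributed."""
    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def subscribe(self, replica):
        self.subscribers.append(replica)

    def publish(self, event):
        # Stand-in for the pluggable network protocol: deliver the
        # event to every subscribed replica.
        for replica in self.subscribers:
            replica.apply(event)


class Replica:
    """One node's local copy of the shared object database."""
    def __init__(self):
        self.objects = {}

    def apply(self, event):
        # Events carry only the changed attributes of one object.
        obj_id, attrs = event
        self.objects.setdefault(obj_id, {}).update(attrs)

    def update(self, channel, obj_id, **attrs):
        # Local changes are pushed out as events; the publisher's own
        # replica is updated by the same delivery path.
        channel.publish((obj_id, attrs))


channel = Channel("battlefield-objects")
a, b = Replica(), Replica()
channel.subscribe(a)
channel.subscribe(b)
a.update(channel, "building-7", name="HQ", hazard=True)
# Both replicas now hold identical state for "building-7".
```

Because each event is small and self-describing, a node that rejoins after a dropped connection can be brought up to date by replaying missed events, which suits links that cannot guarantee connectivity.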
For use in simulation and training exercises, we have been working with U.S. Army RDECOM to create an interface between this data distribution mechanism and a Semi-Automated Forces (SAF) system. With this interface, the BARS user appears as a dismounted warrior in the SAF system: the BARS user's position and orientation are fed to the SAF system, and entity state from the SAF system is sent back to the BARS user's display. Connected to a SAF system in this way, BARS creates a training system that operates at a real location (rather than inside a virtual reality simulation), making simulated forces appear to exist in, and interact with, the real world.
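The two directions of this bridge can be sketched as a pair of translation functions. The message shapes, field names, and function names below are assumptions for illustration only; they are not RDECOM's SAF interface.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Tracked pose of the BARS user (hypothetical units/fields)."""
    x: float
    y: float
    z: float
    heading: float
    pitch: float
    roll: float

def pose_to_saf_entity(user_id, pose):
    """Outbound direction: report the BARS user to the SAF system
    as a dismounted-warrior entity."""
    return {
        "entity": user_id,
        "type": "dismounted_warrior",
        "position": (pose.x, pose.y, pose.z),
        "orientation": (pose.heading, pose.pitch, pose.roll),
    }

def saf_state_to_overlay(saf_entities):
    """Inbound direction: turn SAF entity state into draw commands
    for the see-through HMD, registered to real-world positions."""
    return [("draw_icon", e["entity"], e["position"]) for e in saf_entities]

# One cycle of the loop: the tracked pose goes out, simulated
# forces come back as overlay graphics.
report = pose_to_saf_entity("bars-user-1", Pose(10.0, 2.0, 0.0, 90.0, 0.0, 0.0))
overlay = saf_state_to_overlay([{"entity": "opfor-3",
                                 "position": (12.0, 5.0, 0.0)}])
```

In a real deployment both directions would travel over the event-based distribution mechanism described above, so that lost or delayed updates degrade the display gracefully rather than stalling it.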