Live field training against a thinking human opposing force – force-on-force training – is highly valued by commanders. However, a limitation of current force-on-force training is the lack of battlefield effects, such as mortar or artillery detonations. This prevents fully employing indirect fires as part of combined arms operations in these exercises. In particular, forward observers have no means to adjust fire if they cannot observe impacts. We describe the development of a prototype system that provides mobile forward observers the visual feedback they need to conduct these operations.
The key emerging technology that enables this training is precision mobile augmented reality. Augmented reality inserts virtual elements into views of real environments. In this application, a forward observer's position and look direction must be precisely tracked in real time so that battlefield effects appear stably in the correct locations. This precision must be maintained as the observer moves between positions. In addition, the effects must be rendered realistically, so that they appear to be part of the environment and reflect local conditions, including wind and obscuration by terrain. Forward observers routinely use binoculars to locate targets and adjust fire; consequently, augmented-reality-capable binoculars are also required for this task. As an additional challenge, the tracking and rendering for both naked-eye and binocular views must be performed on a small, lightweight, body-worn computer compatible with field use. Finally, the system must integrate with an existing LT2 force-on-force training system.
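To make the tracking requirement concrete, the sketch below shows the core geometric step such a system must perform every frame: projecting a geo-registered effect location into the observer's view, given a tracked position and look direction. This is an illustrative pinhole-camera sketch under simplified assumptions (local ENU coordinates, no lens distortion), not the prototype's actual implementation; the function and parameter names are hypothetical.

```python
import math


def _dot(u, v):
    """Dot product of two 3-vectors given as tuples."""
    return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]


def project_effect(observer, azimuth_deg, elevation_deg, effect,
                   width=1280, height=720, hfov_deg=40.0):
    """Project a world-space effect location into the observer's view.

    Hypothetical sketch: coordinates are local ENU meters (east, north,
    up); azimuth is degrees clockwise from north, elevation is degrees
    above horizontal. Returns pixel coordinates (px, py), or None if
    the point is behind the observer. A narrower hfov_deg models the
    magnified binocular view with the same tracked pose.
    """
    a = math.radians(azimuth_deg)
    e = math.radians(elevation_deg)
    # Camera basis in ENU: forward along the look direction, right in
    # the horizontal plane, up completing the right-handed frame.
    fwd = (math.sin(a) * math.cos(e), math.cos(a) * math.cos(e), math.sin(e))
    right = (math.cos(a), -math.sin(a), 0.0)
    up = (-math.sin(a) * math.sin(e), -math.cos(a) * math.sin(e), math.cos(e))

    d = (effect[0] - observer[0], effect[1] - observer[1], effect[2] - observer[2])
    x, y, z = _dot(d, right), _dot(d, up), _dot(d, fwd)
    if z <= 0.0:
        return None  # effect is behind the observer's view plane
    # Pinhole projection: focal length in pixels from the horizontal FOV.
    f = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    return (width / 2.0 + f * x / z, height / 2.0 - f * y / z)
```

For example, a detonation 100 m due north of an observer looking north projects to the image center; rerunning with a much narrower `hfov_deg` for the binocular view magnifies offsets from center, which is why the naked-eye and binocular views share one tracked pose but need different camera models.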
This paper describes the key advances needed to produce the prototype system. We focus in particular on the challenges of extending an earlier prototype designed for use only from fixed positions and not connected to any live training system. The results of initial demonstrations at MCB Quantico are also presented.