Mixed reality methods are currently being deployed on the Army's Synthetic Training Environment (STE) Reconfigurable Virtual Collective Trainer program to combine spatially correct, high-resolution stereoscopic views of the physical controls, displays, and pilots' hands with out-the-window, NVG, and IHADSS sensor VR views in the STE reconfigurable aviation platforms. The implementation provides a seamless blend of functional controls and views, effective for training in all simulated conditions, including day and night missions, while isolating the pilots from room lighting and external activity in the physical surroundings.
The implementation was developed through an iterative Agile process in which Government input and Soldier touchpoints refined the user requirements. This drove technology insertions that improved the training capability as new products with essential improvements emerged. The paper presents the training requirements, design tradeoffs, the evolution of the mixed reality system, improvements achieved during development, the system design of the fielded systems, and performance characteristics. Photos of the training systems and mixed reality views are included, along with discussion of limitations, the capabilities of various MR systems, Soldier feedback, and next steps.
Keywords
AUGMENTED AND VIRTUAL REALITY (AR/VR); EMERGING TECHNOLOGIES; FLIGHT SIMULATION; HEAD-MOUNTED DISPLAYS; IMMERSIVE; MIXED REALITY