Immersive simulation technologies, such as part-task trainers and high-fidelity virtual environments (VEs), play a crucial role in U.S. military training. The military is actively exploring augmented reality (AR), mixed reality (MR), and virtual reality (VR) technologies—collectively referred to as extended reality (XR)—for applications in Distributed Mission Operations (DMO) and the Joint Simulation Environment (JSE).
While the current technology landscape offers diverse ways to interact with VEs, including eye-gaze, head, and hand tracking alongside controller inputs, there is a critical gap in understanding which input modality is optimal for training tasks in dynamic VEs. Existing studies comparing virtual input modalities rarely assess the full range of modalities side by side and frequently omit MR or control conditions.
In response to this gap, the Air Force Research Laboratory in Dayton, OH, is actively developing a VE to evaluate the suitability of different interaction methods for diverse training scenarios. This ongoing research evaluates the effectiveness of eight VR input modalities and an MR condition against a non-virtual (control) condition for procedural training tasks.
This paper details the VE’s development process, covering the modeling of Saitek flight instrument panels for the VE, the creation and integration of eight distinct input modalities, usability testing with different XR head-mounted displays, and a comparative analysis of efficiency, effectiveness, and user satisfaction in completing a procedural training task.
The overarching goal of the research is to offer practical guidance on the use of VR and MR devices in dynamic training scenarios, balancing physical interaction, scenario fidelity, and the enhancement of force readiness. The paper concludes by addressing challenges encountered during the project and presenting best practices and recommendations for creating similar VEs. This research contributes to the understanding of optimal interaction methods in complex military training environments.
Keywords
AUGMENTED AND VIRTUAL REALITY (AR/VR); BEST PRACTICES; DESIGN; DISTRIBUTED; EDUCATION; ENHANCING PERFORMANCE; EVALUATION; HEAD-MOUNTED DISPLAYS; HUMAN FACTORS; HUMAN PERFORMANCE; IMMERSIVE; LEARNING TECHNOLOGY STANDARDS; LVC; MIXED REALITY; SIMULATIONS; STEM; TRAINING; USER PREFERENCES; VIRTUAL
Additional Keywords
virtual learning environments, input modalities, Joint Simulation Environment