Training and simulation innovators in the DoD are beginning to explore Virtual Worlds (VWs) as substitutes for large-scale trainers and simulators. Moreover, some services (notably the U.S. Air Force) are beginning to look toward VWs for operational use. VWs are significantly cheaper to author and create content for, easier to proliferate to a wider audience, and cheaper to maintain and support in terms of both hardware and software. This has positive implications not only for team training, mission rehearsal, weapon system-specific training, and after-action review, but also for mission operations and control.
One area where VWs are still in their infancy is their ability to link to live data feeds, affect events in the real world, and support rich bi-directional interaction with external resources. The ability to incorporate dynamic simulation elements such as sensors (real and simulated), semantic tagging of objects, and convincing virtual role players will greatly enhance the richness and utility of military training research and engineering offerings to their customers in both kinetic and nonkinetic domains.
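To make the live-feed linkage concrete, consider a minimal sketch of an external relay that pushes sensor readings into a virtual world over HTTP. Second Life scripts can expose HTTP-in endpoints (e.g., via LSL's llRequestURL), so an external service need only POST structured readings to such an endpoint. The endpoint URL, sensor identifiers, and message schema below are our own illustrative assumptions, not part of any deployed system.

```python
import json
import time
import urllib.request

# Hypothetical HTTP endpoint exposed by an in-world scripted object
# (e.g., a URL obtained from an LSL llRequestURL call); illustrative only.
INWORLD_ENDPOINT = "http://sim.example.com/cap/sensor-relay"

def read_sensor():
    """Stand-in for a real or simulated sensor read; returns one reading."""
    return {"sensor_id": "perimeter-01", "type": "motion",
            "value": 1, "timestamp": time.time()}

def relay_reading(reading):
    """POST one JSON-encoded reading to the in-world object."""
    req = urllib.request.Request(
        INWORLD_ENDPOINT,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

if __name__ == "__main__":
    while True:
        relay_reading(read_sensor())
        time.sleep(1)  # push interval for the live feed
```

In this arrangement the in-world script parses each posted reading and updates the corresponding semantically tagged object, while the same channel (an outbound request from the script) supports the reverse direction, allowing in-world events to affect external resources.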
In this paper we describe our recent and ongoing work in developing a general approach to incorporating externally driven behaviors and events into virtual worlds, giving VW authors access to a much richer set of simulation elements, as well as our progress in incorporating sensing (simulated and real) into a generalized architecture and reference toolkit. This work, coupled with the existing Second Life affordances for voice communications, rich media inclusion, rapid content creation, server intermediation, content distribution and replication, content scripting, and creation of virtual role players, offers the promise of tactical training that can be extended in both scale and scope and delivered on increasingly distributed, lightweight platforms. It may also offer new capabilities for mission rehearsal, after-action review, and tactical operations.
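As one possible shape for the event-routing core of such a reference toolkit, the sketch below (our illustration, not the toolkit's actual API) defines a generic externally driven event and a router that dispatches events from any source, live sensor, simulated sensor, or virtual role player, to registered in-world handlers. All class, field, and handler names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class ExternalEvent:
    """Generic externally driven event; field names are illustrative only."""
    source: str        # e.g., "live-sensor", "simulated-sensor", "role-player"
    event_type: str    # e.g., "detection", "state-change"
    payload: dict = field(default_factory=dict)

class EventRouter:
    """Routes external events to handlers registered per event type."""
    def __init__(self):
        self._handlers: Dict[str, Callable[[ExternalEvent], None]] = {}

    def register(self, event_type: str,
                 handler: Callable[[ExternalEvent], None]) -> None:
        self._handlers[event_type] = handler

    def dispatch(self, event: ExternalEvent) -> None:
        handler = self._handlers.get(event.event_type)
        if handler:
            handler(event)

# Usage: route a detection to a handler that would update a tagged VW object.
router = EventRouter()
router.register("detection",
                lambda e: print(f"update object from {e.source}: {e.payload}"))
router.dispatch(ExternalEvent("live-sensor", "detection", {"zone": "A"}))
```

Decoupling event sources from in-world handlers in this way is what lets the same architecture serve both real and simulated sensing without changes to the VW-side content.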