Simulation-based training provides not only the benefits of immersion and interactivity during exercises but also the prospect of automated after-action review. As trainees interact with the system and with each other through various interfaces, the resulting body of data can be used to draw instructional conclusions automatically, conclusions that go well beyond traditional measures of effectiveness. However, complex team training architectures often incorporate or support an entire suite of tools and interfaces with diverse protocols and data conventions. This presents a technical challenge for the development of decision-oriented automated after-action review, one that can be solved with an abstracted data collection and representation scheme compatible with every supported interface.
This paper describes an agile approach to handling analysis data, developed for the Marine Corps' Combined Arms Command and Control Trainer Upgrade System (CACCTUS). The goals of scalability and modularity target a range of data sources for this application, including the OneSAF Objective System, integrated C4I tools and human-in-the-loop interfaces, and virtual radios whose spoken transmissions are processed with speech recognition tools. Fundamentally, the data analyses in a training system depend most on knowledge about the kinds of data available, and less on the collection mechanism itself, which can therefore be abstracted. As a consequence, the data analysis algorithms can be implemented in parallel with the various data collection methods for each integrated tool. Moreover, when a new requirement calls for integration with an additional, previously unsupported interface, the only implementation work is in the data collection code that writes to the repository, with little or no change on the analysis side. This paper provides design details and lessons learned from the CACCTUS effort, and summarizes the more general methodology for abstracting data collection from data analysis in training systems.
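The separation described above can be illustrated with a minimal sketch. All names here (`Event`, `EventRepository`, the collector classes, and the example analysis) are hypothetical and not drawn from the CACCTUS implementation; the sketch only shows the general pattern: each integrated tool gets its own collector that translates native messages into a source-agnostic event schema, while analyses query the shared repository by event kind and never touch the collection mechanism.

```python
import time
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Event:
    """Source-agnostic record: what happened, who did it, and when."""
    kind: str            # e.g. "fire_mission", "radio_call"
    actor: str
    timestamp: float
    attributes: dict = field(default_factory=dict)

class EventRepository:
    """Central store; analyses query by kind, not by data source."""
    def __init__(self) -> None:
        self._events: list[Event] = []

    def record(self, event: Event) -> None:
        self._events.append(event)

    def query(self, kind: str) -> list[Event]:
        return [e for e in self._events if e.kind == kind]

class SafCollector:
    """Adapter for a simulation feed; maps native messages to Events."""
    def __init__(self, repo: EventRepository) -> None:
        self._repo = repo

    def on_fire_mission(self, unit: str, target: str) -> None:
        self._repo.record(
            Event("fire_mission", unit, time.time(), {"target": target}))

class RadioCollector:
    """Adapter for transcribed voice traffic; same repository, same schema."""
    def __init__(self, repo: EventRepository) -> None:
        self._repo = repo

    def on_transcript(self, speaker: str, text: str) -> None:
        self._repo.record(
            Event("radio_call", speaker, time.time(), {"text": text}))

def count_by_actor(repo: EventRepository, kind: str) -> dict:
    """Example analysis: tallies events per actor, unaware of any collector."""
    counts: dict = {}
    for e in repo.query(kind):
        counts[e.actor] = counts.get(e.actor, 0) + 1
    return counts

# Usage: two unrelated interfaces feed one repository; the analysis
# works unchanged no matter which collectors exist.
repo = EventRepository()
SafCollector(repo).on_fire_mission("Btry A", "TGT001")
RadioCollector(repo).on_transcript("FSC", "fire mission, over")
```

Supporting a new interface under this pattern means writing one more collector class against `EventRepository`; `count_by_actor` and any other analysis remain untouched, which is the property claimed above.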