Abstract
The primary purpose of conducting a test and evaluation event is to collect the data required to evaluate and assess an acquisition system. However, the complexity of the distributed live, virtual, constructive (LVC) simulation environment required to replicate multi-domain operations for an operational test event makes it difficult, during both planning and execution, to ensure that those data requirements are met. Even after a test team identifies its analytic requirements down to the specific data elements needed to address them, it must ensure that the test plan and environment capture those requirements, and that they continue to do so as a wide variety of test personnel refine and modify the plan leading up to the event. During execution, the test team must be able to characterize the impact of unplanned changes, such as weather events or instrumentation failures, on the data collection requirements in order to decide how to react.
In this paper, we describe a comprehensive methodology for maintaining data traceability—the ability to associate multiple aspects of data context, lineage, and provenance throughout the test data lifecycle by means of relationships and interconnectedness. This methodology is nested within a larger Distributed LVC Event Process for reducing risk when building complex LVC simulation environments. We depict the methods and tools we have developed thus far to decompose the data context, which we define as the surrounding information and circumstances under which the data are collected, processed, and interpreted. We then show how we establish linkages among the many types of data context, e.g., analytic requirements, scenario elements, resources, and event scheduling, to ensure that the complex graph of relationships is captured. We also describe the challenges and trade-offs associated with applying our methodology, and our plans for future work.