To improve and accelerate pilot training, this paper explores the capture of cognitive/psychophysiological states using biometric sensors and flight telemetry to drive an intelligent human performance assessment system in an immersive 3D flight simulation device.
Assessing pilot performance during a training session is a capability that can be partially automated by an AI-based algorithm. Drawing on technical data gathered during a flight maneuver, such an assessment can bring objectivity to flight training, serve as a predictor of future pilot performance, and adapt simulation training by combining flight telemetry (technical skills) with biometric/behavioral data (non-technical skills).
Evaluation of non-technical skills remains difficult without the support of data analytics and proper visualization tools. Additionally, soft skills are inherently more difficult to grade than technical performance. An AI engine can provide cues on behaviors and cognitive/psychophysiological states that are not easily observed by the instructor.
With a cohort of 16 novices, we explored neuroscience capabilities that could enable real-time adaptive flight training using a variety of data collected from a flight training session. By combining electroencephalogram (EEG), an eye-tracking device, and flight telemetry data with N-Back, Balloon Analogue Risk Task (BART), and Iowa Gambling Task (IGT) cognitive baseline methodologies in a fast-jet flight simulator with an e-Series Medallion visual, we intend to provide training-scenario and maneuver analysis during initial training for both technical and non-technical flight performance.
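To illustrate the kind of cognitive baseline described above, the sketch below scores a single N-Back run from its stimulus sequence and the participant's "match" responses, yielding the hit rate and false-alarm rate commonly used as baseline measures. The function name, inputs, and scoring scheme are illustrative assumptions, not the study's actual analysis pipeline.

```python
def score_n_back(stimuli, responses, n=2):
    """Score one n-back run (hypothetical helper, not the paper's code).

    stimuli   -- list of presented items (e.g. letters)
    responses -- list of booleans: True if the participant pressed
                 "match" on that trial
    n         -- how many trials back defines a target (2-back here)

    Returns (hit_rate, false_alarm_rate).
    """
    hits = misses = false_alarms = correct_rejections = 0
    for i, pressed in enumerate(responses):
        if i < n:
            continue  # the first n trials have no valid n-back target
        is_target = stimuli[i] == stimuli[i - n]
        if is_target and pressed:
            hits += 1
        elif is_target:
            misses += 1
        elif pressed:
            false_alarms += 1
        else:
            correct_rejections += 1
    # Guard against division by zero on degenerate runs
    hit_rate = hits / max(hits + misses, 1)
    fa_rate = false_alarms / max(false_alarms + correct_rejections, 1)
    return hit_rate, fa_rate


# Example: a 2-back run over the sequence A B A B C A C A,
# where the participant responds "match" on trials 2, 3, and 6.
stimuli = list("ABABCACA")
responses = [False, False, True, True, False, False, True, False]
print(score_n_back(stimuli, responses, n=2))  # → (0.75, 0.0)
```

In practice such per-run scores would be collected before simulator sessions to establish each trainee's working-memory baseline, against which in-flight EEG and eye-tracking measures can then be interpreted.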
Keywords
FLIGHT SIMULATION
Additional Keywords
Neuroscience, EEG, Eye Tracker, Flight Performance