Intelligent Tutoring Systems (ITS) have yet to reach training effectiveness levels rivaling those of human tutors, due in part to their inability to recognize and adapt to trainee cognitive and affective states. While many studies have examined expensive sensor suites for capturing physiological indicators of cognitive and affective states, the authors' previous work presented an innovative conceptual framework for using low-cost sensors to capture specific states in real time. Such measures are expected to improve an ITS's ability to adapt automatically to a trainee's readiness to learn.
The current pair of experiments aimed to develop real-time classifiers for six distinct affective and cognitive states (anger, fear, boredom, workload, engagement, distraction) using low-cost, non-invasive (neuro)physiological and behavioral sensors. In the first experiment, participants completed a within-subjects, repeated-measures study in which the independent variable was task type; each task was designed to induce a subset of the targeted states. Dependent variables theorized to indicate the targeted states included heart rate, postural sway, pupil diameter, and electroencephalography (EEG) band activity, each captured via low-cost sensor technology. Validated, ground-truth measures of the targeted cognitive and affective states were captured via a 10-channel EEG headset with associated algorithms and a subjective emotional rating tool, respectively. Several challenges were encountered with the low-cost sensors, including limited sensitivity to physiological changes and unreliable data collection. Small design and procedural changes were made for the second experiment, which yielded good logistic regression classifiers for the affective states of boredom and fear. Additionally, logistic model trees showed good generalization capability when validated as classifiers for the cognitive states. This paper presents the study results, lessons learned, and implications for future research.
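To make the classification approach concrete, the sketch below shows a binary logistic regression classifier of the general kind the abstract describes, trained by gradient descent on synthetic data. The feature set (heart-rate, postural-sway, and pupil-diameter z-scores), the data, and all thresholds are illustrative assumptions, not the authors' actual features, data, or implementation.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Batch gradient descent on the logistic (cross-entropy) loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        grad_w = [0.0] * len(w)
        grad_b = 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # derivative of the loss w.r.t. the logit
            for j, xj in enumerate(xi):
                grad_w[j] += err * xj
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

def predict(w, b, xi):
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) >= 0.5 else 0

# Synthetic, well-separated samples: [heart-rate z, postural-sway z, pupil z].
# Label 1 = hypothetical "bored" state, 0 = not bored.
random.seed(0)
bored = [[random.gauss(-1.0, 0.3), random.gauss(-1.0, 0.3), random.gauss(-0.5, 0.3)]
         for _ in range(30)]
other = [[random.gauss(1.0, 0.3), random.gauss(1.0, 0.3), random.gauss(0.5, 0.3)]
         for _ in range(30)]
X = bored + other
y = [1] * 30 + [0] * 30

w, b = train_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

In practice, per-participant baseline normalization (as the z-score features suggest) and held-out validation would be essential before claiming the generalization the second experiment reports.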