The government, education, and business sectors share a common and critical need to dramatically accelerate the transition of employees from novice to expert and to improve human performance. Accelerated learning and learner engagement are two areas of interest with few conclusive reports but great potential to drive business results. Recent investigations within military-relevant training environments suggest that electroencephalography (EEG) signatures of attention, memory, and workload can be validly assessed during learning and linked to learner engagement. The concept behind these measures is to track, in real time, the perceptual, attentional, and cognitive workload states that would indicate increased learner engagement and performance. As part of a research project with the Air Force Research Laboratory, TiER1 Performance Solutions investigated principles of learning acceleration in the context of a training course that teaches supervisors how to detect and respond to cyber insider threats. This paper reports on two recent experiments conducted to investigate the design of TiER1's accelerated learning approach and to improve our product. The first study compares the TiER1 accelerated training system (e.g., custom learning pathways, quizzes, game-based interaction, and feedback) with conventional multimedia training; the second evaluates the use of objective psychophysiological measures to detect cognitive state changes in learners who receive the accelerated training system versus those who train with equivalent modules of conventional multimedia training. Results from the first study showed that very experienced learners could test out of content they already knew; however, post-test assessment scores showed no significant differences between the two groups. In the second study, psychophysiological measures related to cognitive workload were better able to demonstrate differences between the traditional and the game-based training material than performance measures alone. These objective measures of cognitive workload were also corroborated by subjective report data (e.g., NASA-TLX), further demonstrating the utility of such measures when evaluating serious game design.