Successful training of individuals, whether in a live or virtual environment, depends heavily on producing an engaging experience that captures the attention of the trainee. The training system must not only replicate a real-life environment, but must also provide trainees with a mentally stimulating experience. However, maximizing user engagement involves walking a fine line between boredom and cognitive overload, and this line shifts from user to user: an experience that is effective for one trainee may be completely overwhelming, or simply boring, to another. To address this variation in user skill levels, the training environment should be able to recognize and adapt to the skill level of each trainee, while still adhering to the training goals of the simulation.
This paper presents work conducted under the Office of Naval Research, combining a user-friendly behavior modeling framework with biometric sensor feedback to create simulated entities that adjust their behaviors in response to a trainee's engagement level. Because entities within the simulation adapt their behaviors to the trainee's physical and mental state, the training scenario can constantly adjust itself to maximize its effectiveness. User skill level is recognized through an evaluation of the trainee's brain state, actions, and decision making in a scenario. This evaluation reports normalized values, such as workload and engagement, which are used to affect the simulation.
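As a concrete illustration of how such normalized values might drive the simulation, the sketch below maps engagement and workload readings onto a simple difficulty-adjustment signal. The function name, thresholds, and decision rule are hypothetical, chosen only to show the shape of the feedback loop, not the actual logic used in the system described here.

```python
# Hypothetical sketch: turning normalized trainee-state readings into a
# difficulty adjustment. Threshold values are illustrative assumptions.

def difficulty_adjustment(engagement: float, workload: float,
                          low: float = 0.3, high: float = 0.7) -> int:
    """Return +1 to raise difficulty, -1 to lower it, 0 to hold steady.

    `engagement` and `workload` are assumed to be normalized to [0, 1],
    as reported by the trainee-state evaluation.
    """
    if workload > high:      # cognitive overload: ease off
        return -1
    if engagement < low:     # boredom: increase the challenge
        return +1
    return 0                 # trainee is in the productive zone

print(difficulty_adjustment(engagement=0.2, workload=0.4))  # -> 1
```

In practice the adjustment signal would be consumed by the scenario logic, for example by changing the number or aggressiveness of simulated entities.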
To provide an appropriate adjustment of the user experience, non-player characters in the simulation modify their behaviors to accommodate the current trainee. These behaviors are represented via a graphical task decomposition hierarchy, allowing non-programmers to define the appropriate actions for the entities to perform at any time, taking into account both the situation in the virtual environment and the engagement level of the player.
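A task decomposition hierarchy of this kind can be sketched in miniature as follows. The structure below is a simplified assumption, not the framework's actual representation: selector nodes descend the hierarchy and pick the first child task whose guard condition matches the trainee's engagement level, with the task and behavior names invented for illustration.

```python
# Illustrative sketch (hypothetical structure): a task decomposition hierarchy
# whose nodes select a child task based on the trainee's engagement level.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Task:
    name: str
    # Guard deciding whether this task applies at the given engagement level.
    applies: Callable[[float], bool] = lambda engagement: True
    children: List["Task"] = field(default_factory=list)

    def select(self, engagement: float) -> str:
        """Descend the hierarchy, taking the first applicable child at each level."""
        for child in self.children:
            if child.applies(engagement):
                return child.select(engagement)
        return self.name  # leaf: the concrete action to perform

# Invented example behaviors: a disengaged trainee faces a gentler opponent.
patrol = Task("patrol_slowly", applies=lambda e: e < 0.4)
flank = Task("flank_player", applies=lambda e: e >= 0.4)
root = Task("opfor_behavior", children=[patrol, flank])

print(root.select(0.2))  # -> patrol_slowly
print(root.select(0.8))  # -> flank_player
```

A graphical editor over such a hierarchy is what lets non-programmers author entity behaviors: each node and guard becomes a visual element rather than code.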