Human-Like Auditory Detection Capability for Intelligent Virtual Agents

Intelligent virtual agents (IVAs) are important components of simulated real-world environments. Significant progress on IVAs has been made in diverse applications such as entertainment, gaming, telemarketing, and, more recently, military training. IVAs are used for training mainly in task collaboration, where virtual agents interact with each other or with human users; a typical military application is virtual warfare scenarios. IVAs with perceptual capabilities, such as vision or hearing, tend to produce more realistic results and can consequently improve training task performance. Research and development on perceptual models for IVAs has focused largely on visual perception. However, auditory perception is one of the most fundamental perceptual aspects of humanlike behavior in a virtual environment, because it improves situational awareness by extending the information and feedback envelope beyond the field of view. Therefore, in the event of an auditory detection, IVAs should be able to react to other virtual entities or to humans participating in the same virtual scenario. In this paper, we present a perceptual model that predicts the auditory detection capability of IVAs. We describe the foundation of this perceptual model, which is based on the auditory filters of the human hearing system, and present the simulation framework used to implement it. When the predicted and observed auditory detection capabilities were compared, the simulation results showed a slight overestimation of the predicted detection thresholds, indicating that, under the same test conditions, the IVA's detection capability is generally less sensitive than human hearing. Nevertheless, the model proposed in this paper represents a promising method for predicting the auditory signal detection capabilities of IVAs.
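The abstract does not give the paper's actual model, but the general idea of an auditory-filter-based detection test can be sketched as follows. This is a minimal illustrative example, assuming the widely used Glasberg-Moore equivalent rectangular bandwidth (ERB) approximation of the human auditory filter; the function names, the single-filter signal-to-noise decision rule, and the detection criterion are illustrative assumptions, not the authors' implementation.

```python
import math

def erb_bandwidth(fc_hz):
    # Glasberg-Moore ERB approximation (Hz) for an auditory filter
    # centered at fc_hz: ERB = 24.7 * (4.37 * fc_kHz + 1)
    return 24.7 * (4.37 * fc_hz / 1000.0 + 1.0)

def is_detectable(signal_level_db, noise_spectrum_level_db, fc_hz,
                  criterion_db=0.0):
    # Noise power admitted by one ERB-wide filter around the signal
    # frequency: spectrum level (dB/Hz) plus 10*log10(bandwidth).
    noise_in_band_db = (noise_spectrum_level_db
                        + 10.0 * math.log10(erb_bandwidth(fc_hz)))
    # Hypothetical decision rule: the tone is detected when the in-band
    # signal-to-noise ratio reaches the detection criterion.
    snr_db = signal_level_db - noise_in_band_db
    return snr_db >= criterion_db

# Example: a 1 kHz tone at 60 dB SPL in noise with a 30 dB/Hz spectrum level
print(is_detectable(60.0, 30.0, 1000.0))
```

A more faithful agent model would sweep the signal level to find the predicted detection threshold and compare it against measured human thresholds, which is the comparison the abstract reports as slightly overestimated.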