Fidelity requirements defined by users provide valuable insight into the fidelity needed to ensure that trainees 'buy in' to the simulator as a training device. However, there are no empirical data to support a relationship between trainees' perceptions of a simulator's training effectiveness and its actual training effectiveness. Our preliminary research revealed a discrepancy between pilots' perceptions of the effectiveness of the simulator as a training device and objective in-simulator performance results (Estock, Alexander, Stelzer, & Baughman, 2007). For this paper, we conducted additional analyses to determine whether a similar discrepancy exists between pilots' perceptions of training effectiveness and objective training effectiveness results. Specifically, we conducted an experiment in which 43 U.S. Air Force F-16 pilots flew air-to-air training research missions. During the experimental trials, two pilots flew in high-fidelity F-16 simulators with a 360° field of view (FOV), and two pilots flew in lower-fidelity F-16 simulators with a 108° FOV. Both before and after these experimental trials, all pilots flew benchmark missions using only the high-fidelity simulator. To obtain objective assessments of the training effectiveness of each simulator, we compared the two groups on their change in performance on air-to-air skills from pre- to post-training benchmark missions. To obtain subjective assessments of the training effectiveness of each simulator, we administered a questionnaire to all pilots immediately following the experimental trials. We focused on the effectiveness of each simulator in training a set of air-to-air skills most likely to be influenced by the FOV differences between the two simulators. We then compared trainees' perceptions of training effectiveness with the objective training effectiveness results.
The findings of this study replicated those of our previous study: we again found a discrepancy between pilots' perceptions and objective results. We discuss the implications of these findings for the verification, validation, and accreditation (VV&A) of training simulators.