Instructors often assess training effectiveness using subjective evaluation tools. Evaluation by Subject Matter Experts (SMEs) assumes that the experts can distinguish small but meaningful differences in the measured domain. Subjective expert evaluations provide an efficient and effective means of identifying the strengths and weaknesses of the assessed entity. In simulation development, SME assessments evaluate the training capabilities of systems, identify deficiencies, and compare the relative impact of those deficiencies. This paper presents methods that use subjective assessments from SMEs, comparing SME ratings of Mission Essential Competency (MEC) experiences with objective performance measures. The methodology entails mapping the correspondence between MECs and objective performance measures; we also mapped performance measures to training scenarios to determine the appropriate skills for evaluation. The performance measures used in this study are based on the capabilities of the simulators in our laboratory. The congruence between the experts' subjective evaluations and the objective simulator performance variables supports the validity of SME assessments. The results provide a strong framework for understanding the relationship between subjective and objective performance data when measuring training effectiveness.
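
As an illustrative sketch only, and not a method described in the paper, the snippet below shows one common way such congruence could be quantified: a Spearman rank correlation between SME ratings and objective simulator scores for a single MEC-linked skill. All names and data values are hypothetical placeholders.

```python
# Illustrative sketch: quantifying agreement between subjective SME ratings
# and objective simulator performance measures with Spearman rank correlation.
# All data below are hypothetical placeholders, not values from the study.
from scipy.stats import spearmanr

# Hypothetical SME ratings (1-5 scale) and objective measures
# (normalized simulator scores) for trainees on one MEC-linked skill.
sme_ratings = [3, 4, 2, 5, 4, 3, 1, 5]
objective_scores = [0.62, 0.81, 0.45, 0.93, 0.77, 0.58, 0.30, 0.88]

rho, p_value = spearmanr(sme_ratings, objective_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```

A strong positive correlation under a setup like this would indicate that expert judgments track the objective measures, which is the sense of "congruence" used above.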