Advances in technology have enabled learning organizations to significantly improve knowledge acquisition and assessment. Many tools can be employed to test knowledge, and the resulting data can pinpoint gaps in knowledge as well as strengths and weaknesses in our training programs. For skills acquisition, advances in simulation, serious games, and AR/VR help engage learners and optimize the learning experience. However, one significant gap remains: how to assess the performance of observable skills in a way that eliminates bias and delivers actionable metrics.
Assessing skill proficiency for individuals and teams is critically important, yet tools that support objective and consistent skill assessment are difficult to find. Skills assessment today is typically paper-based and relies on the viewpoint of expert assessors, limiting our ability to extract metrics that drive improvement. A wealth of information is lost, leaving an incomplete view of learner abilities and skill proficiency.
This paper provides an overview of new approaches to simulation assessment being used by the Royal Canadian Navy and CSMART, the world's largest civilian simulation training centre for mariners. These simulation centres have employed new technology that helps resolve the inherent challenges of traditional forms of skill assessment. They are also developing insights-based models that ensure transparency and continuous improvement. The developments discussed here are applicable not only to navy training, but also to operational assessment in the navy and the other branches of the armed forces, and more broadly to any context where skill proficiency and protocol adherence are important.
Case Studies: Ensuring Objective, Consistent, and Actionable Evaluation of Simulation
Conference
I/ITSEC 2021
Track
Education