Many opportunities exist for training through individual or team participation in competitive events, and cyber security has proven particularly conducive to this form of training. In competitive cyber security exercises, participants are typically provided with standardized hardware and software, including various tools for cyber forensic analysis. Performance is generally assessed on the basis of points awarded for completing challenges presented to the participants. Ideally, thorough instrumentation of the software environment would supply instructors and exercise coordinators with detailed data on the performance of individual students, as well as their unique training needs. The research described here illustrates such instrumentation implemented within the context of a competition-based cyber security exercise (Tracer FIRE). The study considered factors that contributed to successful performance within the competition, with emphasis on participants' use of software tools, including tools provided by the exercise coordinators and tools acquired online by participants during the event. The resulting findings provide the basis for recommendations to competition coordinators regarding key facets of the software environment and cues indicating that individual participants are struggling and in need of a training intervention.
Factors Impacting Performance in Competitive Cyber Exercises