Since 2006, the Serious Games Showcase & Challenge (SGS&C) has encouraged student, government, and commercial game developers to submit their serious games for review by a panel of military, academic, and industry gaming experts. For the SGS&C, an entry is considered a serious game if it incorporates gameplay dynamics in a product intended to educate or train a learner at any stage of the learning continuum, from middle school through adulthood. A wide range of information is collected when a product is submitted to the SGS&C and as each entry moves through the evaluation process. In 2012, we analyzed and presented the data provided in entrants' submission forms, videos, and playable games. These data gave us insight into the technologies, subject matter, and techniques developers had applied to their projects (McNamara, Smith, Gritton, and Smith, 2012). However, that paper stopped short of reporting what evaluators found important in the design of a best-in-class serious game. This paper analyzes the data collected annually through the SGS&C standard evaluation rubric, together with the unstructured comments provided by the SGS&C evaluator pool, which comprises more than 70 professionals representing users, developers, and researchers of best-in-class serious games. Previously, this rich evaluative data was shared only with individual entrants, to give them feedback from the process and possibly inspire improvements in their games. By analyzing the evaluator comments across a series of dimensions, we have identified practices relevant to anyone interested in the application or development of serious learning games. This paper maps evaluator comments to successful instructional interventions, design principles, and technology factors that translate into best practices for serious game designers and developers.