Interaction within a virtual environment is a key factor in a positive user experience. The current industry standard for interaction and character control is the mouse and keyboard or a game controller. Unfortunately, neither of these input mechanisms represents a natural input modality for a trainee. In the training domain, this disconnect can result in negative training transfer and user frustration. Fortunately, recent advances in gesture- and motion-based control have opened new avenues for intuitive and natural control schemes. Through a current research effort, the Tactical Combat Casualty Care Simulation (TC3Sim) serious game has been updated to use the Kinect motion sensor as an input device. Integrating the Kinect interface with a medical training game has revealed significant findings regarding its usability and performance. The research has also produced general guidelines for the use of a natural user interface with respect to poses, body positions, dynamic gestures, and menu interactions. This paper details these guidelines as well as the benefits and drawbacks of a natural user interface for a serious medical game. The changes made to the user interface to overcome many of those drawbacks are presented, including simplification of the control scheme and certain gameplay alterations. Results of an initial usability study are also presented, focusing on user interaction with the game and satisfaction metrics to assess the feasibility of the Kinect sensor for other medical simulation efforts.