Realistic interaction in 3D Virtual Environments (VEs) gives Soldiers greater task focus (e.g., "train as you fight") rather than forcing them to attend to the interface or to learn training behaviors unrelated to the task. Full-body interfaces encompass all training interactions, such as navigation, team communication, firing and manipulating weapons, and other relevant domain tasks. For VEs used in training and simulation, appropriate interaction mechanisms are critical to providing realistic, immersive experiences that closely mimic the real-world activities being taught, so that the computer interface becomes invisible to the user. In particular, dismounted Soldier training requires a VE experience that closely resembles the real world, not only in visual fidelity but also in the actions Soldiers need to perform. In this paper, we present a prototype system for dismounted Soldier training in which users navigate a virtual environment using natural full-body motions. Soldiers can run, jump, crouch, and turn in the VE while still holding their weapon, simply by using their bodies to mimic the motions they would perform in the real world. Our system combines video game console motion controllers, including the Microsoft Kinect, PlayStation Move, and Nintendo Wiimote, with the Unity 3D game engine to support untethered interaction. A set of heuristic recognition rules identifies user actions from the 3D skeleton derived from the Kinect's depth image. These rules support seamless transitions between realistic physical interactions (e.g., actually walking and running) and proxied physical interactions (e.g., walking and running in place) that enable locomotion through the larger VE. We discuss the details of our prototype and provide an informal evaluation with subject matter experts.
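The abstract does not spell out the heuristic recognition rules themselves. As an illustration only, a minimal sketch of what such skeleton-based heuristics might look like is given below; the `Skeleton` structure, joint names, thresholds, and both rule functions are assumptions for this sketch, not the paper's actual implementation (a crouch rule comparing head height against a calibrated standing height, and a run-in-place rule counting knee lifts over a short window of frames).

```python
from dataclasses import dataclass

# Hypothetical minimal skeleton; the real Kinect skeleton exposes ~20 joints,
# but these rules only need a few vertical joint positions (in meters).
@dataclass
class Skeleton:
    head_y: float
    hip_y: float
    left_knee_y: float
    right_knee_y: float

def is_crouching(s: Skeleton, standing_head_y: float,
                 drop_thresh: float = 0.25) -> bool:
    """Crouch heuristic (assumed): the head has dropped well below
    its calibrated standing height."""
    return (standing_head_y - s.head_y) > drop_thresh

def is_running_in_place(history: list[Skeleton],
                        lift_thresh: float = 0.10) -> bool:
    """Run-in-place heuristic (assumed): knees rise repeatedly between
    consecutive frames in a short window, i.e. more than one knee lift."""
    lifts = 0
    for prev, cur in zip(history, history[1:]):
        if (cur.left_knee_y - prev.left_knee_y > lift_thresh or
                cur.right_knee_y - prev.right_knee_y > lift_thresh):
            lifts += 1
    return lifts >= 2  # require repeated lifts, not a single step
```

A rule set like this would be evaluated every frame; which rule fires (or none) determines whether the avatar crouches, locomotes through the VE, or stands still, which is how proxied actions such as running in place can drive locomotion beyond the physical tracking space.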