Advances in Augmented Reality are turning “train like you fight” into “fight like you train,” where simulation and synthetic imagery can be inserted into battlefield operations to remotely operate systems with enhanced situation awareness and decision making. VR Rehab, with Old Dominion University and Lockheed Martin and in collaboration with the Army’s Natick Soldier Research, Development and Engineering Center (NSRDEC), has developed Fused Augmented Realities with Synthetic Vision (FAR/SV), which merges simulated terrain and graphics into operational interfaces. This paper describes successful FAR/SV R&D investigations, conducted under OSD/NSRDEC sponsorship, that enhanced the situation awareness and decision making of warfighters controlling small unmanned aerial systems and small unmanned ground vehicles. It covers the foundational science, the agile development efforts undertaken to overcome technical challenges, and data from empirical studies of usability and mission performance. Two successful FAR/SV innovations are presented. First, to overcome the long-standing ‘looking through a straw’ problem of high-magnification viewing, we surround the live/sensor view with a correlated, wider-field-of-view 3D synthetic vision rendering for enhanced situation awareness. Second, to overcome degraded visual conditions, we semi-transparently blend the live/sensor view with the underlying correlated 3D synthetic vision terrain. Additional operational benefits derive from FAR/SV interface innovations in which users perceive that they are adding MIL-STD-2525 icons and other annotations directly to the video/sensor imagery, while ‘under the hood’ the icons and annotations are anchored persistently to the underlying correlated 3D terrain. FAR/SV can be used as a standalone application running under Windows or Android, or as an add-on module for existing video and sensor imagery viewing applications.
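The second innovation, semi-transparent blending, amounts in essence to an alpha composite of the live sensor frame over a synthetic terrain render of the same viewpoint. The paper's abstract gives no implementation details, so the following is only a minimal illustrative sketch in Python with hypothetical names, assuming both frames have already been registered to the same camera pose and resolution; it is not FAR/SV's actual implementation.

```python
import numpy as np

def blend_live_with_synthetic(live_frame: np.ndarray,
                              synthetic_frame: np.ndarray,
                              alpha: float = 0.5) -> np.ndarray:
    """Semi-transparently blend a live sensor frame over a correlated
    synthetic terrain render of the same viewpoint.

    alpha = 1.0 shows only the live view; alpha = 0.0 shows only the
    synthetic terrain, which is the useful end of the range when
    visual conditions are degraded.
    """
    # Assumption: frames are pre-registered, so pixel (i, j) in both
    # images corresponds to the same terrain point.
    assert live_frame.shape == synthetic_frame.shape
    blended = (alpha * live_frame.astype(np.float32)
               + (1.0 - alpha) * synthetic_frame.astype(np.float32))
    return blended.clip(0, 255).astype(np.uint8)
```

In practice the operator (or an image-quality heuristic) would vary `alpha` as visibility changes, letting the synthetic terrain show through exactly as much as the live imagery fails.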
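Likewise, anchoring annotations to terrain rather than to video pixels depends on re-projecting a terrain-fixed world point into every new camera frame. A minimal pinhole-camera sketch of that idea follows, again with hypothetical names and under assumed conventions (world-to-camera extrinsics R, t and intrinsics K); the authors' actual registration pipeline is not described in this abstract.

```python
import numpy as np

def project_anchor(world_point, R, t, K):
    """Project a terrain-anchored annotation (world coordinates) into
    the current camera frame, so the icon appears fixed to the ground
    rather than drifting with the video.

    R, t : 3x3 rotation and 3-vector translation (world -> camera)
    K    : 3x3 camera intrinsics matrix
    Returns (u, v) pixel coordinates, or None if the anchor is
    behind the camera in this frame.
    """
    p_cam = R @ np.asarray(world_point, dtype=np.float64) + t
    if p_cam[2] <= 0.0:
        return None                      # anchor not visible this frame
    uvw = K @ p_cam                      # homogeneous image coordinates
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

Because the anchor lives in world coordinates, the same call per frame keeps the icon persistently registered to the underlying 3D terrain as the camera moves, which is the behavior the abstract attributes to the FAR/SV annotation interface.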