Modern advances in Graphics Processing Unit (GPU) technology and the science of 3D computer graphics have stirred interest in using these platforms and algorithms to perform real-time simulation of radar antennas, radar signal processing techniques, and radar data output. Applications include aircrew and sensor operator training systems, knowledge-aided clutter suppression, hardware-in-the-loop testing, and gaming. Often, the data to be simulated is the receiver signal amplitude as a function of range and azimuth angle (or sometimes range and Doppler shift). Such data is required for generating radar images, such as Synthetic Aperture Radar (SAR) or Real Beam, and for performing detection routines used in tracking algorithms. However, the GPU pipeline technique (whether rasterization or ray tracing) presents a technical challenge: the focal-plane camera approximation of a 3D graphics "receiver" senses the scene in a perspective projection space, while radar senses the scene in a polar or spherical space. The nonlinear transformation from perspective projection space to polar space degrades range resolution at longer ranges, whereas a real radar maintains nearly constant range resolution throughout its range swath. This undesired effect is amplified when the camera's view frustum must cover large azimuth and/or elevation extents, as in large-area scenes such as Real Beam images from monostatic radar antennas. This paper mathematically explains the problem and develops a novel solution, the oblique camera frustum, which is then demonstrated numerically through simulation on the Unity platform. The work was developed under a Small Business Innovation Research (SBIR) Phase II project awarded at the inaugural Air Force Sim Pitch Day event at I/ITSEC 2019.
Advanced Real-Time Radar Simulation: Maintaining Range Resolution for Large-Area Ground Maps
Conference: I/ITSEC 2021
Track: Simulation
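The range-resolution degradation described in the abstract can be illustrated with a small numeric sketch. This is not the paper's method; it is a minimal, assumed example showing why uniformly sampling a standard perspective-projection depth buffer and converting those samples back to view-space range yields bins that grow rapidly with range, unlike a radar's constant-width range bins. The near/far plane values and bin count below are arbitrary assumptions chosen for illustration.

```python
import numpy as np

# Assumed camera parameters (illustrative only, not from the paper).
near, far = 100.0, 50_000.0   # near/far clip planes, meters
n_bins = 4096                  # uniform samples of post-projection depth

# Uniform samples d in [0, 1) of the normalized (post-projection) depth.
d = np.linspace(0.0, 1.0, n_bins, endpoint=False)

# Standard perspective depth mapping:
#   d = (far / (far - near)) * (1 - near / z)
# Inverting it recovers the view-space range z for each depth sample:
z = far * near / (far - d * (far - near))

# Effective range resolution is the spacing between consecutive ranges.
dz = np.diff(z)

# Samples are densely packed near the camera and sparse at long range:
# the 1/z nonlinearity concentrates depth precision near the near plane,
# while a real radar would keep dz roughly constant across the swath.
print(f"resolution near the camera: {dz[0]:.4f} m")
print(f"resolution at long range:   {dz[-1]:.1f} m")
print(f"degradation factor:         {dz[-1] / dz[0]:.0f}x")
```

The quadratic growth of `dz` with `z` is the mathematical root of the problem the abstract identifies; the oblique camera frustum the paper proposes is aimed at counteracting exactly this nonuniformity.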