Abstract

Microwave imaging makes it possible to obtain images of objects hidden in structures and media using microwaves. This technique has various applications, such as nondestructive testing, medical imaging, concealed weapon detection, and through-the-wall imaging. Obtaining radar images in these applications is based on processing the phase and amplitude of the reflected signal recorded over an aperture (a microwave hologram). Recently, systems have begun to appear in which the radar part is supplemented by an RGB-D sensor, which provides new capabilities. For example, there is a microwave screening system architecture in which an inverse synthetic aperture is formed by the natural motion of the subject in the vicinity of a stationary linear antenna array. The microwave system is complemented with a synchronous RGB-D video sensor that captures the trajectory of the moving subject in 3D and allows coherent processing of the radar signal. Another system detects objects buried under an irregular surface and uses an RGB-D sensor to capture the surface relief in order to suppress reflections of the sounding signal from the surface. Calibration between a radar and an RGB-D sensor is an essential step for fusing microwave and optical data. This article presents a novel calibration approach that uses a planar calibration target made of a radio-transparent material (such as a foam plastic sheet) with a square marker and six small metal balls embedded in the target surface, which act as point objects. The proposed method exploits 3D-3D correspondences between the coordinates of the point objects in the two coordinate systems associated with the sensor and with the radar. One point set is extracted from the optical data, using the marked corners of the target as base points. The second point set is obtained from the microwave data as local maxima of the 3D volume reconstructed from a single-frequency microwave hologram.
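The abstract does not spell out how the point objects are extracted from the reconstructed volume; a common way to pick a fixed number of point targets from a 3D reflectivity image is local-maximum detection with a minimum-separation window. The sketch below is illustrative only (the function name, window size, and threshold are assumptions, not details from the article):

```python
import numpy as np
from scipy.ndimage import maximum_filter

def find_point_targets(volume, n_targets=6, min_distance=3, threshold=0.0):
    """Return voxel coordinates of the n_targets strongest local maxima
    of a reconstructed |image| volume, separated by at least min_distance
    voxels per axis. Hypothetical helper; details differ per system."""
    size = 2 * min_distance + 1
    # A voxel is a local maximum if it equals the max over its window
    # and exceeds the noise threshold.
    is_peak = (volume == maximum_filter(volume, size=size)) & (volume > threshold)
    coords = np.argwhere(is_peak)
    values = volume[is_peak]
    # Keep the n_targets strongest peaks, strongest first.
    order = np.argsort(values)[::-1][:n_targets]
    return coords[order]
```

With six metal balls in the target, `n_targets=6` would recover one candidate coordinate per ball, to be matched against the optically derived point set.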
Computer modeling was performed using Autodesk 3ds Max, in which models of all components of the system were built and the optical image from the sensor was simulated. Test experiments were carried out using a measurement system composed of the following components: a compact vector network analyzer (VNA), two mechanical scanners with stepper motors, one transmitting and one receiving horn antenna mounted on the VNA, an RGB-D sensor, a microcontroller board, and a computer. The high accuracy of the method is confirmed both by computer modeling and by physical experiment: the relative position between the radar and the sensor is determined to within about one fifth of the signal wavelength used.
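Given the two corresponding 3D point sets, the relative pose between the radar and the sensor amounts to a rigid Procrustes (Kabsch) fit. The abstract does not name the estimator it uses, so the following is a standard SVD-based sketch under that assumption (names and conventions are illustrative):

```python
import numpy as np

def rigid_transform_3d(P, Q):
    """Estimate rotation R and translation t such that Q ~= R @ P + t,
    for corresponding 3xN point sets P (e.g. radar frame) and
    Q (e.g. RGB-D sensor frame). Least-squares Kabsch solution."""
    # Center both point sets on their centroids.
    cP = P.mean(axis=1, keepdims=True)
    cQ = Q.mean(axis=1, keepdims=True)
    # Cross-covariance between the centered sets.
    H = (P - cP) @ (Q - cQ).T
    U, _, Vt = np.linalg.svd(H)
    # Force a proper rotation (det R = +1), guarding against reflections.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

Six well-spread, non-coplanar points (as the six embedded balls provide) over-determine the six pose parameters, so the least-squares fit also averages out per-point localization noise.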
