Abstract
When tracking small UAV and drone targets against cloud-cluttered backgrounds, an MWIR sensor alone often cannot track the target continuously. To overcome this problem, a SWIR sensor is mounted on the same gimbal, and target tracking either fuses the information from both sensors or selectively applies the information from each one. In such configurations, parallax correction based on the target distance is commonly used. However, this existing approach is difficult to apply to small UAV and drone targets because the laser rangefinder's small beam divergence angle makes it difficult to measure the distance to such targets. We propose a tracking method that does not require parallax correction between the sensors. In the proposed method, images from the MWIR and SWIR sensors are captured simultaneously, and the tracking error used to drive the gimbal is selected by an effectiveness measure. To validate the method, tracking performance was demonstrated for UAV and drone targets against a real sky background using MWIR and SWIR image sensors.
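The following is a minimal sketch of the selection step described above, under stated assumptions: each sensor's tracker is assumed to report a per-frame tracking error together with a hypothetical effectiveness score (the abstract does not define the actual measure), and the sensor with the higher score drives the gimbal for that frame, so no inter-sensor parallax correction is applied.

```python
# Sketch only: the effectiveness measure and error units are assumptions,
# not the paper's published definition.
from dataclasses import dataclass


@dataclass
class TrackResult:
    error_az: float        # azimuth tracking error reported by the sensor's tracker
    error_el: float        # elevation tracking error
    effectiveness: float   # hypothetical track-quality score in [0, 1]


def select_gimbal_error(mwir: TrackResult, swir: TrackResult) -> tuple[float, float]:
    """Return the tracking error used to drive the gimbal, taken from the
    sensor whose effectiveness measure is higher for this frame."""
    best = mwir if mwir.effectiveness >= swir.effectiveness else swir
    return best.error_az, best.error_el


# Example frame: cloud clutter degrades the MWIR track, so the SWIR error is used.
mwir = TrackResult(error_az=1.8, error_el=-0.6, effectiveness=0.35)
swir = TrackResult(error_az=0.4, error_el=-0.2, effectiveness=0.82)
print(select_gimbal_error(mwir, swir))  # -> (0.4, -0.2)
```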