Abstract
Modern command and control systems depend on surveillance subsystems to form an overall tactical picture. The use of sensors with different capabilities can improve the quality of the aggregate picture. However, the quality of the fused data is highly dependent on the quality of the data that is supplied to the fusion processor. Before the fusion process takes place, sensor data has to be transformed to a common reference frame. Since each individual sensor's data may be biased, a prerequisite for successful data fusion is the removal of the bias errors contained in the data from all contributing sensors. In this paper, a technique is developed to perform absolute sensor alignment (the removal of bias errors) using information from moving objects, such as low earth orbit satellites, that obey Kepler's laws of motion.
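The abstract does not spell out the estimator, but the underlying idea can be illustrated with a minimal sketch: a sensor observes an object whose true trajectory is known (for example, a Keplerian orbit propagated from published ephemerides), and constant additive range and azimuth biases are estimated from the measurement residuals. The function and variable names below are illustrative assumptions, not the paper's notation, and a 2-D geometry is used purely for brevity.

```python
import numpy as np

def estimate_biases(true_positions, measurements, sensor_pos):
    """Estimate constant range and azimuth biases of a single sensor.

    true_positions : (N, 2) reference object positions (e.g. from a propagated
                     Keplerian orbit), expressed in the common Cartesian frame.
    measurements   : (N, 2) biased, noisy sensor measurements [range, azimuth]
                     of the same object at the same times.
    sensor_pos     : (2,) sensor location in the common frame.

    Returns (range_bias, azimuth_bias) as mean residuals, which is the
    least-squares solution under an additive constant-bias model.
    """
    deltas = true_positions - sensor_pos
    pred_range = np.hypot(deltas[:, 0], deltas[:, 1])
    pred_azimuth = np.arctan2(deltas[:, 1], deltas[:, 0])

    range_bias = np.mean(measurements[:, 0] - pred_range)
    # Wrap angular residuals to (-pi, pi] before averaging.
    az_residual = np.angle(np.exp(1j * (measurements[:, 1] - pred_azimuth)))
    azimuth_bias = np.mean(az_residual)
    return range_bias, azimuth_bias

def debias(measurements, range_bias, azimuth_bias):
    """Remove the estimated biases before passing data to the fusion processor."""
    corrected = measurements.copy()
    corrected[:, 0] -= range_bias
    corrected[:, 1] -= azimuth_bias
    return corrected
```

In practice each contributing sensor would be aligned this way against the same reference object, so that all debiased tracks share a common frame before fusion.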