Abstract

The IRS‐P6 satellite has multi‐resolution and multispectral capabilities on a single platform. Continuous and autonomous co‐registration and geolocation of image data from different sensors with widely varying view angles and resolutions is one of the unique challenges of IRS‐P6 data processing, and it requires in‐flight geometric calibration of the cameras. In‐flight calibration includes alignment calibration of individual sensors and calibration between the sensors. A method for in‐flight geometric calibration and quality assessment of IRS‐P6 images is presented in this paper. The objectives of this study are to ensure the best absolute and relative location accuracy of the different cameras, consistent location performance under payload steering, and co‐registration of multiple bands. This is achieved with a viewing geometry model that uses ephemeris and attitude data, precise camera geometry and datum transformations. In the model, the forward and reverse transformations between the coordinate systems associated with the focal plane, payload, body, orbit and ground are rigorously and explicitly defined. System‐level tests using comparisons to ground check points have validated the operational geolocation accuracy and the stability of the calibration parameters.
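
The forward transformation chain described above can be illustrated with a minimal sketch: a detector look vector is rotated through the focal-plane, payload, body and orbit frames into an Earth-fixed frame and then intersected with the Earth ellipsoid. All matrix names, function names and the WGS-84 constants below are illustrative assumptions, not the actual IRS-P6 camera geometry or calibration values.

```python
import numpy as np

def look_vector_ecef(u_focal, R_focal_to_payload, R_payload_to_body,
                     R_body_to_orbit, R_orbit_to_ecef):
    """Rotate a detector look vector from the focal-plane frame to ECEF.

    Each R_* is a 3x3 rotation matrix between adjacent coordinate systems;
    the in-flight calibration refines the alignment angles behind them.
    """
    return (R_orbit_to_ecef @ R_body_to_orbit @ R_payload_to_body
            @ R_focal_to_payload @ u_focal)

def intersect_ellipsoid(pos_ecef, u_ecef, a=6378137.0, b=6356752.3142):
    """Intersect the ECEF look ray with a reference ellipsoid (nearer root)."""
    # Scale the axes so the ellipsoid becomes a unit sphere.
    S = np.diag([1.0 / a, 1.0 / a, 1.0 / b])
    p, d = S @ pos_ecef, S @ u_ecef
    A, B, C = d @ d, 2.0 * (p @ d), p @ p - 1.0
    disc = B * B - 4.0 * A * C
    if disc < 0:
        raise ValueError("Look ray does not intersect the ellipsoid")
    t = (-B - np.sqrt(disc)) / (2.0 * A)  # nearer of the two intersections
    return pos_ecef + t * u_ecef
```

The reverse (ground-to-image) transformation would invert this chain, which is why the model defines both directions explicitly; geolocation accuracy is then assessed by comparing the predicted ground intersections against ground check points.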
