The accuracy assessment of terrestrial reference frames (TRFs) at the coordinate-system level is a key task for ensuring their successful use in Earth studies, satellite navigation and other geodetic positioning applications. Currently, the TRF quality specifications for the most demanding users dictate that the origin, orientation and scale should be determined at an accuracy level of 1 mm, and that they should remain stable over time at a rate of 0.1 mm/yr. To evaluate the conformity of the internal accuracy of modern TRFs with such requirements, an appropriate mapping is needed to convert frame coordinate errors (and their covariance (CV) matrix) in a terrestrial network into the corresponding errors (and their CV matrix) in the realized coordinate system. Several projection schemes may be considered for this mapping problem, all of which aim to extract the correlated part of the estimation error in the TRF coordinates that can be described by small random perturbations of their coordinate system. The goal of the present paper is to investigate the inference problem of frame accuracy at the coordinate-system level, and to discuss not only the theoretical aspects of the required covariance projectors but also the practical impact of their implementation on the results obtained from space geodetic solutions. For this purpose, a relevant case study is performed to evaluate the accuracy of the realized origin, orientation and scale in the ITRF frame series, based on the formal CV matrices of the estimated positions and velocities in the four technique subnetworks (DORIS, SLR, VLBI, GNSS).
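To make the mapping concrete, the following Python sketch illustrates one simple projection scheme of this kind: the station-coordinate CV matrix is propagated onto the seven parameters of a small-perturbation (Helmert) similarity transformation through an unweighted least-squares projector. This is a minimal sketch under assumed conventions (the rotation sign convention, the unweighted projector, and the function names helmert_design_matrix and project_cv_to_frame are introduced here for illustration), not the specific projector adopted in the paper.

```python
import numpy as np

def helmert_design_matrix(coords):
    """Build the (3n x 7) design matrix A of a small-perturbation Helmert
    transformation, dx = A * theta, with theta = [tx, ty, tz, rx, ry, rz, s]
    (translations in the coordinate units, rotations in radians, scale
    dimensionless). One common sign convention is assumed here."""
    n = coords.shape[0]
    A = np.zeros((3 * n, 7))
    for i, (x, y, z) in enumerate(coords):
        r = slice(3 * i, 3 * i + 3)
        A[r, 0:3] = np.eye(3)            # origin shifts
        A[r, 3:6] = [[0.0,   z,  -y],    # small rotations (skew-symmetric part)
                     [-z,  0.0,   x],
                     [y,   -x, 0.0]]
        A[r, 6] = [x, y, z]              # differential scale
    return A

def project_cv_to_frame(A, Cx):
    """Propagate the station-coordinate CV matrix Cx (3n x 3n) to the CV
    matrix of the frame parameters via the unweighted least-squares
    projector: theta_hat = (A^T A)^{-1} A^T dx, hence
    C_theta = N^{-1} A^T Cx A N^{-1} with N = A^T A."""
    N_inv = np.linalg.inv(A.T @ A)
    return N_inv @ A.T @ Cx @ A @ N_inv

# Toy usage: 4 stations at roughly Earth-radius scale, identity CV (m^2)
rng = np.random.default_rng(0)
xyz = rng.normal(scale=6.4e6, size=(4, 3))
C_theta = project_cv_to_frame(helmert_design_matrix(xyz), np.eye(12))
print(np.sqrt(np.diag(C_theta)))  # formal sigmas of the 7 frame parameters
```

A weighted variant (for instance, using the inverse of the coordinate CV matrix as the weight) yields a different projector, and the choice among such schemes is precisely the kind of question the paper examines.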