Abstract

The concept of reliability was introduced into geodesy by Baarda (A testing procedure for use in geodetic networks. Publications on Geodesy, vol. 2, Netherlands Geodetic Commission, Delft, 1968). It measures the ability of a parameter estimation to detect outliers and, in the case of a single outlier, leads to the MDB, the minimal detectable bias (or outlier). The MDB depends on the noncentrality parameter of the \(\chi ^2\)-distribution, since the variance factor of the linear model is assumed to be known, on the size of the outlier test of an individual observation, which is set to 0.001, and on the power of the test, which is generally chosen to be 0.80. Here, the variance factor is instead estimated, so the \(F\)-distribution is applied. Furthermore, the size of the test of an individual observation is made a function of the number of outliers, so that the size of the test over all observations stays constant, say at 0.05. The power of the test is set to 0.80. Under these assumptions, the MDBs for multiple outliers are derived. The method is applied to the reconstruction of a bell-shaped surface measured by a laser scanner, where the MDBs are introduced as outliers for the alternative hypotheses of the outlier tests. A Monte Carlo simulation reveals that, owing to the way the outliers are introduced, the false null hypotheses cannot on average be rejected with a power of 0.80 unless the MDBs are enlarged by a factor.
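For readers who want to reproduce the basic quantities, the following sketch (not taken from the paper) shows how the noncentrality parameter, an individual test size derived from an overall size, and the classical single-outlier MDB can be computed with SciPy. The function names, the redundancy number r, the Šidák-type size adjustment, and the degrees of freedom are illustrative assumptions; the multiple-outlier MDBs derived in the paper additionally require the design and cofactor matrices of the network.

```python
import numpy as np
from scipy import stats, optimize

def noncentrality(alpha, power, q, f=None):
    """Noncentrality parameter lambda0 of a test with size `alpha`,
    power `power` and q (numerator) degrees of freedom.

    f is None  -> variance factor known: central / noncentral chi-square.
    f given    -> variance factor estimated with f degrees of freedom:
                  central / noncentral F-distribution.
    """
    if f is None:
        crit = stats.chi2.ppf(1.0 - alpha, q)
        g = lambda lam: stats.ncx2.sf(crit, q, lam) - power
    else:
        crit = stats.f.ppf(1.0 - alpha, q, f)
        g = lambda lam: stats.ncf.sf(crit, q, f, lam) - power
    # The rejection probability grows monotonically with lambda,
    # so a simple bracketing root finder suffices.
    return optimize.brentq(g, 1e-8, 1e4)

def individual_size(alpha_all, m):
    # Sidak-type adjustment keeping the size of the test of all m
    # observations at alpha_all (one common convention; the paper's
    # exact relation may differ).
    return 1.0 - (1.0 - alpha_all) ** (1.0 / m)

def mdb_single(sigma, r, lam0):
    # Classical single-outlier MDB: nabla = sigma * sqrt(lam0 / r),
    # with r the redundancy number of the observation (assumed known here).
    return sigma * np.sqrt(lam0 / r)

# Baarda's classical setting: alpha0 = 0.001, power 0.80, one outlier.
lam0 = noncentrality(0.001, 0.80, q=1)           # approx. 17.07
print(mdb_single(sigma=0.01, r=0.5, lam0=lam0))  # approx. 0.058

# Estimated variance factor (F-distribution), overall size 0.05
# spread over m = 100 individual observation tests (values illustrative).
a_ind = individual_size(0.05, 100)
lam_f = noncentrality(a_ind, 0.80, q=1, f=200)
```

The paper's Monte Carlo finding can then be checked along the same lines: simulate observations contaminated with these MDBs, apply the outlier tests, and count how often the false null hypotheses are rejected.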
