Abstract

The pattern recognition and computer vision communities often employ robust methods for model fitting. In particular, high-breakdown-point methods such as least median of squares (LMedS) and least trimmed squares (LTS) are frequently used when the data are contaminated with outliers. However, although the breakdown point of these methods can be as high as 50% (i.e., they can tolerate up to 50% contamination), they can break down at much lower contamination levels when the outliers are clustered. In this paper, we demonstrate the fragility of LMedS and LTS and analyze why these methods fail when the data contain a large percentage of clustered outliers. We adapt the concept of symmetry distance to formulate an improved regression method, called the least trimmed symmetry distance (LTSD). Experimental results show that the LTSD outperforms LMedS and LTS when a large percentage of the outliers are clustered and the inliers have a large standard deviation.
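For readers unfamiliar with these estimators, the following is a minimal sketch of the LMedS and LTS objectives together with a generic random-sampling search of the kind commonly used to optimize them. The helper names (lmeds_objective, lts_objective, robust_line_fit), the two-point line hypotheses, and the trial count are illustrative assumptions, not the paper's implementation, and the LTSD itself is not reproduced here.

```python
import numpy as np

def lmeds_objective(residuals):
    """LMedS: the cost of a candidate fit is the median of its squared residuals."""
    return np.median(residuals ** 2)

def lts_objective(residuals, h):
    """LTS: the cost is the sum of the h smallest squared residuals
    (the remaining n - h points are trimmed as suspected outliers)."""
    r2 = np.sort(residuals ** 2)
    return r2[:h].sum()

def robust_line_fit(x, y, objective, n_trials=500, seed=0):
    """Hypothetical random-sampling search for a robust line y = a*x + b:
    draw two points, fit an exact line, score it, keep the best candidate."""
    rng = np.random.default_rng(seed)
    best_params, best_cost = None, np.inf
    for _ in range(n_trials):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue  # degenerate sample, cannot define a line
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        cost = objective(y - (a * x + b))
        if cost < best_cost:
            best_params, best_cost = (a, b), cost
    return best_params

# Usage: LTS with h = n/2, i.e. the nominal 50% breakdown-point setting.
# a_b = robust_line_fit(x, y, lambda r: lts_objective(r, h=len(x) // 2))
```

Note that both objectives depend only on the smallest half of the residuals, which is precisely why a sufficiently compact cluster of outliers can masquerade as the "good" half and pull the fit away from the true structure.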
