Abstract

Abnormality detection in medical images is a one-class classification problem for which existing methods typically involve variants of kernel principal component analysis or one-class support vector machines. However, existing methods rely on highly curated training sets with full supervision, often using heuristics for model fitting or ignoring the variances of the data within principal subspaces. In contrast, we propose novel methods that can work with imperfectly curated datasets using robust statistical learning, by extending the multivariate generalized-Gaussian distribution to a reproducing kernel Hilbert space (RKHS) and employing it within a mixture model. We propose a novel semi-supervised extension of our learning scheme, showing that a small amount of expert feedback, in the form of high-quality labeled data from the outlier class, can boost performance. We propose expectation maximization for our semi-supervised robust mixture-model learning in RKHS, using solely the Gram matrix and without the explicit lifting map. Our methods incorporate optimal component means, principal directions, and variances for abnormality detection. Results on four large public datasets on retinopathy and cancer, compared against a variety of contemporary methods, show that our methods improve on the state of the art in one-class classification for abnormality detection.
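To make the kernel-space setting concrete, the sketch below illustrates the baseline family the abstract contrasts against: one-class abnormality scoring via kernel PCA computed solely from the Gram matrix, with no explicit lifting map. This is a minimal NumPy illustration of that generic baseline, not the paper's proposed robust mixture model; all data, the RBF bandwidth `gamma`, and the number of components are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_gram(X, Y, gamma=0.5):
    """RBF Gram matrix; every later step works only with kernel values."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# "Normal" training data: a tight synthetic cluster (stand-in for curated images).
X_train = rng.normal(0.0, 0.3, size=(100, 2))
K = rbf_gram(X_train, X_train)
n = K.shape[0]

# Double-center the Gram matrix: equivalent to centering in the RKHS.
one = np.ones((n, n)) / n
Kc = K - one @ K - K @ one + one @ K @ one

# Eigendecomposition of the centered Gram matrix gives kernel principal directions.
vals, vecs = np.linalg.eigh(Kc)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order][:5], vecs[:, order][:, :5]
alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))  # unit norm in the RKHS

def score(X_test):
    """Anomaly score: squared RKHS distance to the principal subspace."""
    Kt = rbf_gram(X_test, X_train)                        # (m, n)
    # Center test kernel rows consistently with the training centering.
    Ktc = Kt - Kt.mean(1, keepdims=True) - K.mean(0, keepdims=True) + K.mean()
    proj = Ktc @ alphas                                   # subspace coordinates
    # ||centered feature||^2 minus ||projection||^2; k(x, x) = 1 for the RBF kernel.
    k_self = 1.0 - 2.0 * Kt.mean(1) + K.mean()
    return k_self - (proj ** 2).sum(1)

normal = score(rng.normal(0.0, 0.3, size=(10, 2)))
outlier = score(np.array([[3.0, 3.0]]))
print(normal.mean(), outlier[0])  # the outlier scores markedly higher
```

Note that this baseline uses only the principal subspace, not the per-direction variances; modeling those variances, and doing so robustly under imperfect curation, is precisely the gap the abstract's mixture model in the RKHS is meant to address.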
