Abstract

Feature allocation models generalize classical species sampling models by allowing every observation to belong to more than one species, now called features. Under the popular Bernoulli product model for feature allocation, we assume $n$ observable samples and we consider the problem of estimating the expected number $M_{n}$ of hitherto unseen features that would be observed if one additional individual were sampled. The interest in estimating $M_{n}$ is motivated by numerous applied problems where the sampling procedure is expensive, in terms of time and/or financial resources allocated, and further samples can be justified only by the possibility of recording new unobserved features. We consider a nonparametric estimator $\hat{M}_{n}$ of $M_{n}$ which has the same analytic form as the popular Good-Turing estimator of the missing mass in the context of species sampling models. We show that $\hat{M}_{n}$ admits a natural interpretation both as a jackknife estimator and as a nonparametric empirical Bayes estimator. Furthermore, we give provable guarantees for the performance of $\hat{M}_{n}$ in terms of minimax rate optimality, and we provide an interesting connection between $\hat{M}_{n}$ and the Good-Turing estimator for species sampling. Finally, we derive non-asymptotic confidence intervals for $\hat{M}_{n}$, which are easily computable and do not rely on any asymptotic approximation. Our approach is illustrated with synthetic data and SNP data from the ENCODE sequencing genome project.

Highlights

  • Feature allocation models generalize classical species sampling models by allowing every observation to belong to more than one species, called features. In particular, every observation is endowed with a finite set of features selected from a collection of features $(F_j)_{j\geq 1}$

  • We show that $\hat{M}_n$ admits a natural interpretation both as a jackknife estimator (Quenouille [21]; Tukey [25]) and as a nonparametric empirical Bayes estimator in the sense of Efron and Morris [9]

  • The Good-Turing estimator first appeared in Good [10] as a nonparametric empirical Bayes estimator under the classical multinomial model for species sampling, i.e., $(Y_1, \ldots, Y_n)$ are $n$ random samples from a population of individuals belonging to a collection of species $(S_j)_{j\geq 1}$ with unknown proportions $(p_j)_{j\geq 1}$ such that $\sum_{j\geq 1} p_j = 1$
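The multinomial species sampling setting above can be illustrated with a small simulation. The sketch below, which uses a toy long-tailed distribution of species proportions chosen purely for illustration, computes the classical Good-Turing estimate of the missing mass, $N_1/n$ with $N_1$ the number of species observed exactly once, and compares it with the true unseen proportion:

```python
import random
from collections import Counter

random.seed(0)

# Toy species proportions: a long-tailed distribution over 1000 species
# (illustrative choice, not taken from the paper).
J = 1000
weights = [1.0 / j ** 2 for j in range(1, J + 1)]
total = sum(weights)
props = [w / total for w in weights]

# Draw n i.i.d. samples from the multinomial species model.
n = 500
sample = random.choices(range(J), weights=props, k=n)
counts = Counter(sample)

# Good-Turing estimator of the missing mass: N1 / n, where N1 is the
# number of species observed exactly once in the sample.
n1 = sum(1 for c in counts.values() if c == 1)
gt_missing_mass = n1 / n

# True missing mass: total proportion of species not seen in the sample.
true_missing = sum(p for j, p in enumerate(props) if j not in counts)

print(f"Good-Turing estimate: {gt_missing_mass:.4f}, "
      f"true missing mass: {true_missing:.4f}")
```

Both quantities are probabilities, so the estimate can be sanity-checked against the unit interval; with a long-tailed distribution the unseen proportion is nonzero even for moderate $n$.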


Summary

Introduction

Feature allocation models generalize classical species sampling models by allowing every observation to belong to more than one species, called features. The Bernoulli product model, or binary independence model, is arguably the most popular feature allocation model. It models the i-th observation as a sequence $Y_i = (Y_{i,j})_{j\geq 1}$ of independent Bernoulli random variables with unknown success probabilities $(p_j)_{j\geq 1}$, with the assumption that $Y_r$ is independent of $Y_s$ for any $r \neq s$. A common parametric approach places a Beta prior distribution on the unknown probabilities; this is a reasonable assumption for neutrally evolving variants but may not be appropriate for deleterious mutations. To overcome this drawback, a nonparametric approach to estimating $M_n$ has been proposed in the recent work of Zou et al. [28]. Our work delves into the Good-Turing estimator for feature allocation models, providing theoretical guarantees for its use.
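The Bernoulli product model described above is straightforward to simulate. The abstract states that $\hat{M}_n$ shares the analytic form of the Good-Turing estimator; the specific form $K_1/n$ used below (with $K_1$ the number of features displayed by exactly one sample) is an assumption for illustration, as are the geometrically decaying probabilities:

```python
import random

random.seed(42)

# Feature probabilities p_j: a finite truncation of (p_j)_{j>=1},
# geometrically decaying so that the expected number of features per
# sample is finite (illustrative choice, not from the paper).
p = [0.8 ** j for j in range(1, 51)]

n = 100
# Bernoulli product model: Y[i][j] = 1 iff sample i displays feature j,
# independently across both samples i and features j.
Y = [[1 if random.random() < pj else 0 for pj in p] for _ in range(n)]

# Frequency of each feature across the n samples.
freq = [sum(Y[i][j] for i in range(n)) for j in range(len(p))]

# Good-Turing-style estimate of M_n (assumed form K_1 / n, with K_1 the
# number of features displayed by exactly one of the n samples).
k1 = sum(1 for f in freq if f == 1)
M_hat = k1 / n

# Exact M_n = sum_j p_j (1 - p_j)^n: the expected number of hitherto
# unseen features that an (n+1)-th sample would reveal.
M_true = sum(pj * (1 - pj) ** n for pj in p)

print(f"estimate: {M_hat:.4f}, exact M_n: {M_true:.4f}")
```

Because the true probabilities are known in this simulation, the exact $M_n$ can be computed in closed form and used as a benchmark for the estimate.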

A Good-Turing estimator for $M_n$
Interpretations of $\hat{M}_n$
Optimality of $\hat{M}_n$
Connection to the Good-Turing estimator for species sampling models
A confidence interval for $M_n$
A stopping rule for the discovery process
Numerical illustration
Concluding remarks
Nonparametric empirical Bayes

