Abstract

We propose the attraction Indian buffet distribution (AIBD), a distribution for binary feature matrices influenced by pairwise similarity information. Binary feature matrices are used in Bayesian models to uncover latent variables (i.e., features) that explain observed data. The Indian buffet process (IBP) is a popular exchangeable prior distribution for latent feature matrices. In the presence of additional information, however, the exchangeability assumption is not reasonable or desirable. The AIBD can incorporate pairwise similarity information, yet it preserves many properties of the IBP, including the distribution of the total number of features. Thus, much of the interpretation and intuition that one has for the IBP directly carries over to the AIBD. A temperature parameter controls the degree to which the similarity information affects feature-sharing between observations. Unlike other nonexchangeable distributions for feature allocations, the probability mass function of the AIBD has a tractable normalizing constant, making posterior inference on hyperparameters straightforward using standard MCMC methods. A novel posterior sampling algorithm is proposed for the IBP and the AIBD. We demonstrate the feasibility of the AIBD as a prior distribution in feature allocation models and compare the performance of competing methods in simulations and an application.

Highlights

  • Two primary functions for data modeling are to relate observed data to each other and to future observations

  • In this paper we propose a generalization of the Indian buffet process (IBP) which incorporates pairwise distances into the feature allocation prior, namely the attraction Indian buffet distribution (AIBD)


Introduction

Two primary functions for data modeling are to relate observed data to each other and to future observations. When modeling, we assume that the observed data are somehow interconnected and carry some information about future observations. To account for distance information between observations, Gershman et al. (2015) developed the distance dependent Indian buffet process (dd-IBP). This method allows a modeler to specify, a priori, the distances between each pair of observations. In this paper we propose a generalization of the IBP that incorporates pairwise distances into the feature allocation prior: the attraction Indian buffet distribution (AIBD). The AIBD has a tractable probability mass function (pmf), which readily allows standard MCMC techniques to be applied to its hyperparameters. Another property of the AIBD is that the expected number of shared features between two customers can increase or decrease relative to the IBP.
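For readers unfamiliar with the IBP that the AIBD generalizes, its standard "restaurant" construction can be simulated directly: customer i samples each previously taken dish k with probability m_k / i (where m_k counts the previous customers who took dish k) and then takes a Poisson(α / i) number of new dishes. The sketch below is a minimal illustration of that exchangeable baseline, not of the AIBD itself; the function name `sample_ibp` and the use of NumPy are our own choices.

```python
import numpy as np

def sample_ibp(n_customers, alpha, rng=None):
    """Simulate a binary feature (customer-by-dish) matrix from the IBP.

    Customer i (1-indexed) takes each existing dish k with probability
    m_k / i, where m_k is the number of previous customers who took dish
    k, then takes Poisson(alpha / i) brand-new dishes.
    """
    rng = np.random.default_rng(rng)
    rows = []          # one 0/1 list per customer
    dish_counts = []   # m_k for each dish introduced so far
    for i in range(1, n_customers + 1):
        row = []
        for k, m_k in enumerate(dish_counts):
            take = rng.random() < m_k / i
            row.append(int(take))
            if take:
                dish_counts[k] += 1
        n_new = rng.poisson(alpha / i)   # new dishes for this customer
        row.extend([1] * n_new)
        dish_counts.extend([1] * n_new)
        rows.append(row)
    # Pad rows to a common width: later dishes are 0 for earlier customers.
    Z = np.zeros((n_customers, len(dish_counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z
```

The AIBD replaces the uniform m_k / i sharing probabilities with probabilities tilted by pairwise similarity, while (as noted above) leaving the distribution of the total number of features unchanged.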

Literature Review
The Chinese Restaurant Process
The Indian Buffet Process
The Attraction Indian Buffet Distribution
Properties of the AIBD
Distribution of the Number of Features
Expected Number of Shared Features
Comparison of the AIBD’s and the dd-IBP’s Properties
The Similarity Function
Lack of Marginal Invariance
AIBD Posterior Sampling
Posterior Sampling of the Feature Allocation Z
Sampling the Other Parameters
Data Analysis
The Data
The Analysis
Comparison Between the AIBD and the dd-IBP
Comparison Between the AIBD and the IBP
Items for Consideration
Findings
Conclusion