Abstract

Bayesian inference techniques have been applied to the analysis of fluctuations of post-synaptic potentials in the hippocampus. The underlying statistical model assumes that the varying synaptic signals are characterized by mixtures of (unknown) numbers of individual Gaussian, or normal, component distributions. Each solution consists of a group of individual components with unique mean values and relative probabilities of occurrence, together with a predictive probability density. The advantages of Bayesian inference techniques over the alternative method of maximum likelihood estimation (MLE) of the parameters of an unknown mixture distribution include the following: (1) prior information may be incorporated in the estimation of model parameters; (2) conditional probability estimates of the number of individual components in the mixture are calculated; (3) flexibility exists in the extent to which the estimated noise standard deviation indicates the width of each component; (4) posterior distributions for component means are calculated, including measures of uncertainty about the means; and (5) probability density functions of the component distributions and the overall mixture distribution are estimated in relation to the raw grouped data, together with measures of uncertainty about these estimates. This expository report describes this novel approach to the unconstrained identification of components within a mixture and demonstrates the usefulness of the technique in the context of both simulations and the analysis of distributions of synaptic potential signals.
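As a rough illustration of the kind of analysis the abstract describes (and not the authors' own algorithm), the following minimal Python sketch fits a Gaussian mixture with an effectively unknown number of components to simulated synaptic-amplitude data, using scikit-learn's variational BayesianGaussianMixture as a stand-in for Bayesian mixture inference; the data, component means, and prior settings are hypothetical choices made only for demonstration.

import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Hypothetical example data: simulated synaptic amplitudes drawn from three
# Gaussian components plus recording noise (for illustration only).
rng = np.random.default_rng(0)
amplitudes = np.concatenate([
    rng.normal(loc=0.2, scale=0.05, size=120),
    rng.normal(loc=0.4, scale=0.05, size=80),
    rng.normal(loc=0.6, scale=0.05, size=40),
]).reshape(-1, 1)

# Allow up to 10 components; a Dirichlet-process weight prior lets the
# fitted model place appreciable weight on only as many components as the
# data support, loosely mirroring inference on the number of components.
model = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=0.1,
    max_iter=500,
    random_state=0,
)
model.fit(amplitudes)

# Component means, mixing weights, and the number of components with
# non-negligible weight: rough analogues of the posterior summaries
# (means, relative probabilities of occurrence, component count)
# discussed in the abstract.
active = model.weights_ > 0.01
print("estimated component means:", model.means_[active].ravel())
print("estimated mixing weights:", model.weights_[active])
print("effective number of components:", int(active.sum()))

Unlike an MLE fit with a fixed number of components, this kind of Bayesian formulation lets prior information enter through the concentration and component priors and yields uncertainty about which components are genuinely present.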
