Abstract

We investigate a class of DNA mixture deconvolution algorithms based on variational inference and show that they can significantly reduce computational runtimes with little or no effect on the accuracy and precision of the result. In particular, we consider Stein Variational Gradient Descent (SVGD) and Variational Inference (VI) with an evidence lower-bound objective. Both provide alternatives to the commonly used Markov chain Monte Carlo methods for estimating the model posterior in Bayesian probabilistic genotyping. We demonstrate that both SVGD and VI significantly reduce computational costs over the current state of the art. Importantly, VI does so without sacrificing precision or accuracy, presenting an overall improvement over previously published methods.
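To illustrate the SVGD approach mentioned above, the following is a minimal sketch of the particle update on a toy one-dimensional problem, not the paper's actual deconvolution model. The standard-normal target, RBF bandwidth `h`, step size `lr`, and particle count are illustrative assumptions; SVGD moves a set of particles along the kernelized Stein direction, combining a kernel-weighted score (attraction) with a kernel-gradient term (repulsion) so the particles spread out to approximate the posterior rather than collapsing to its mode.

```python
import numpy as np

rng = np.random.default_rng(0)

def svgd_step(x, score, h=0.5, lr=0.05):
    """One SVGD update for 1-D particles x, given the target's score function.

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    """
    diff = x[:, None] - x[None, :]           # diff[i, j] = x_i - x_j
    k = np.exp(-diff**2 / (2 * h**2))        # RBF kernel matrix k(x_i, x_j)
    grad_k = diff / h**2 * k                 # grad wrt x_j of k(x_j, x_i)
    phi = (k @ score(x) + grad_k.sum(axis=1)) / len(x)
    return x + lr * phi

# Toy target: standard normal posterior, so score(x) = d/dx log p(x) = -x.
x = rng.normal(loc=5.0, scale=0.5, size=200)  # particles start far from the target
for _ in range(1000):
    x = svgd_step(x, lambda z: -z)

# After the updates, the particle cloud should roughly match N(0, 1).
print(x.mean(), x.std())
```

The repulsive `grad_k` term is what distinguishes SVGD from plain gradient ascent on the log posterior: without it, all particles would converge to the same point.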
