From the celebrated Gaussian mixture and model-averaging estimators to the cutting-edge multi-Bernoulli mixtures of various forms, finite mixture models offer a fundamental and flexible means of dealing with the uncertainties arising in estimation and learning. In the context of recursive estimation, the uncertainties due to model maneuvering and to multi-target data association both give rise to the need to represent the single/multiple-target probability distribution by a finite mixture, described by a set of parameters that are iteratively updated over time within the framework of Bayesian inference, which we refer to collectively as Bayesian mixture (BM) filtering. In addition, density fusion among networked agents/sensors may be addressed by linearly combining their local estimates, which are often correlated with one another in complicated ways, again leading to a fused mixture. In either case, the final estimate is given by the arithmetic average (AA) of all the weighted components in the mixture, whether a single target or multiple targets are concerned. Accompanying this are versatile mixture adaptation and optimization strategies, which aim to best fit the target distribution with a small number of components. This survey provides a comprehensive overview of representative single/multiple-target BM filters and fusion approaches, using either a single sensor or multiple sensors, in a unified and coherent fashion. State-of-the-art mixture adaptation and optimization techniques, as well as remaining challenges, are discussed, all with the aim of providing deeper insight into the approach.
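As a minimal sketch of the arithmetic-average form referred to above (the symbols $f_i$, $w_i$, and $N$ here are illustrative notation, not taken from the survey), a mixture or fused density can be written as
\[
  f_{\mathrm{AA}}(x) \;=\; \sum_{i=1}^{N} w_i\, f_i(x),
  \qquad w_i \ge 0, \quad \sum_{i=1}^{N} w_i = 1,
\]
where each $f_i$ is a component density (e.g., a Gaussian component of a single-sensor mixture filter, or a local posterior supplied by a networked sensor) and $w_i$ is its weight; the final estimate then follows from this weighted arithmetic average of the components.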