The growing variability and volume of data make it increasingly important to learn complex multivariate probability distributions. The state-of-the-art Tree-Augmented Naive Bayes (TAN) classifier uses a maximum weighted spanning tree (MWST) to model data graphically with excellent time and space complexity. In this paper, we theoretically prove the feasibility of scaling up the one-dependence MWST to model high-dependence relationships, and then apply a heuristic search strategy to improve how well the extended topology fits the data. The edges newly added to each attribute node may yield only a locally optimal solution, so ensemble learning is introduced to improve generalization performance and reduce sensitivity to variation in the training data. Experimental results on 32 benchmark datasets show that the proposed highly scalable algorithm inherits the expressive power of TAN, achieves an excellent bias–variance tradeoff, and delivers classification performance competitive with a range of high-dependence and ensemble learning algorithms.
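As a reference point for the MWST step mentioned above, the sketch below illustrates how a standard one-dependence TAN structure is typically learned: pairwise edges between attributes are weighted by the class-conditional mutual information I(X_i; X_j | C), and a maximum weighted spanning tree is then extracted over the attribute nodes. The function and variable names are illustrative rather than taken from the paper, and the sketch covers only the one-dependence baseline, not the proposed high-dependence extension or its ensemble.

```python
# Illustrative sketch (not the paper's implementation): learn a TAN-style
# attribute tree by weighting edges with conditional mutual information
# I(Xi; Xj | C) and extracting a maximum weighted spanning tree.
import numpy as np
from itertools import combinations


def conditional_mutual_information(xi, xj, c):
    """Estimate I(Xi; Xj | C) for discrete arrays xi, xj and class labels c."""
    cmi = 0.0
    for cv in np.unique(c):
        mask = (c == cv)
        p_c = mask.mean()
        xi_c, xj_c = xi[mask], xj[mask]
        for vi in np.unique(xi_c):
            for vj in np.unique(xj_c):
                p_ij = np.mean((xi_c == vi) & (xj_c == vj))
                if p_ij == 0.0:
                    continue
                p_i = np.mean(xi_c == vi)
                p_j = np.mean(xj_c == vj)
                cmi += p_c * p_ij * np.log(p_ij / (p_i * p_j))
    return cmi


def tan_spanning_tree(X, y, root=0):
    """Return MWST edges (parent, child) over attribute indices, oriented away from the root."""
    d = X.shape[1]
    w = np.zeros((d, d))
    for i, j in combinations(range(d), 2):
        w[i, j] = w[j, i] = conditional_mutual_information(X[:, i], X[:, j], y)

    # Prim's algorithm: repeatedly add the heaviest edge crossing the cut.
    in_tree = {root}
    edges = []
    while len(in_tree) < d:
        i, j = max(((i, j) for i in in_tree for j in range(d) if j not in in_tree),
                   key=lambda e: w[e])
        edges.append((i, j))
        in_tree.add(j)
    return edges
```

In a full TAN classifier, each attribute would then take the class plus its single tree parent as conditioning variables; the high-dependence extension described in the abstract would add further parents on top of this baseline structure.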