The recently developed partially confirmatory factor analysis (PCFA) framework offers a promising solution that accommodates varying levels of available knowledge, blending the strengths of data- and theory-driven approaches. Nevertheless, its reliance on Markov chain Monte Carlo (MCMC) techniques for parameter estimation is computationally intensive and can suffer from convergence problems. To address these issues, this study introduces a novel alternative: the regularized variational approximation approach within the PCFA framework (PCFA-VA). The proposed method integrates regularization techniques and converts the classical Bayesian inference problem into an optimization problem by approximating the posterior distribution with a predefined family of distributions and using the Kullback–Leibler (KL) divergence to quantify the discrepancy between the approximation and the true posterior. In this way, PCFA-VA avoids the intractable integration required to obtain the exact posterior distribution and yields substantial computational savings. In simulated and real data analyses, PCFA-VA demonstrated the potential to achieve considerable accuracy while maintaining computational efficiency, making it scalable to large-scale problems.
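For context, the conversion from integration to optimization described above rests on the standard variational inference identity; the sketch below assumes a generic variational family q(θ) over the model parameters and does not reproduce the paper's specific PCFA-VA family or regularization terms.

```latex
% Generic variational inference objective (a sketch; not the paper's exact
% PCFA-VA formulation). For data Y, parameters \theta, and a chosen
% variational family q(\theta):
\[
  \mathrm{KL}\!\left(q(\theta)\,\middle\|\,p(\theta \mid Y)\right)
  = \log p(Y)
    - \underbrace{\mathbb{E}_{q}\!\left[\log p(Y,\theta) - \log q(\theta)\right]}_{\text{ELBO}}
\]
% Since \log p(Y) does not depend on q, minimizing the KL divergence is
% equivalent to maximizing the evidence lower bound (ELBO), which turns
% posterior inference into an optimization problem over the family q.
```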