Multigroup neutron transport criticality calculations on modern supercomputers are widely employed in nuclear reactor analysis to determine whether a system is self-sustaining. However, designing efficient parallel algorithms for transport criticality calculations is challenging, especially when the number of processor cores is large and an unstructured mesh is adopted. In particular, both compute time and memory usage must be considered carefully because of the high dimensionality of the neutron transport equations. In this paper, we study a monolithic multilevel Schwarz preconditioner for transport criticality calculations based on a nonlinear diffusion acceleration (NDA) method. In NDA, the multigroup nonlinear diffusion equations are solved using an inexact Jacobian-free Newton method with an initial guess generated from a few inverse power iterations. The computed scalar fluxes and eigenvalue are used to evaluate the fission and scattering terms of the transport equations, so that the nonlinear system of transport equations reduces to a linear system. The linear systems arising from the discretizations of the nonlinear diffusion equations and the transport equations must be solved efficiently. We propose a monolithic multilevel Schwarz method that efficiently handles the linear systems for both the transport equations and the diffusion equations. In the multilevel method, however, constructing coarse spaces algebraically is expensive and often does not scale. We study a subspace-based coarsening algorithm that addresses this challenge by exploiting the matrix structures of the transport equations and the nonlinear diffusion equations. We demonstrate numerically that the monolithic multilevel preconditioner with the subspace-based coarsening algorithm is twice as fast as the same preconditioner equipped with an unmodified coarsening approach on thousands of processor cores, for an unstructured-mesh neutron transport problem with billions of unknowns.
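To make the role of the inverse power iterations concrete, the following is a minimal sketch, assuming a small two-group diffusion model with made-up cross sections. It only illustrates how an initial eigenvalue and flux guess for a k-eigenvalue problem of the form L*phi = (1/k)*F*phi can be generated before a Newton solve; it is not the paper's discretization, NDA formulation, or Schwarz preconditioner.

```python
# Minimal sketch (not the paper's implementation) of inverse power
# iteration used to produce an initial guess (phi0, k0) for a Newton
# solve.  The two-group loss matrix L and fission matrix F below are
# illustrative placeholders, not data from the paper.
import numpy as np

def inverse_power_iteration(L, F, num_iters=5):
    """Return (phi, k) after a few inverse power iterations."""
    phi = np.ones(L.shape[0])          # flat initial flux guess
    k = 1.0
    for _ in range(num_iters):
        src = F @ phi / k              # fission source from previous iterate
        phi = np.linalg.solve(L, src)  # apply the inverse of the loss operator
        k = np.sum(F @ phi) / np.sum(src)  # update the multiplication factor
        phi /= np.linalg.norm(phi)     # normalize the flux
    return phi, k

# Tiny two-group example with made-up cross sections (fast group 1,
# thermal group 2; fission neutrons are born in the fast group).
L = np.array([[0.03,  0.00],    # removal from group 1
              [-0.02, 0.10]])   # down-scatter into, and absorption in, group 2
F = np.array([[0.006, 0.11],    # nu*Sigma_f contributions to the fast source
              [0.000, 0.00]])
phi0, k0 = inverse_power_iteration(L, F)
print("initial guess for the Newton solve: k0 =", round(k0, 4))
```

In the full method described above, such an approximate flux and eigenvalue would seed the inexact Jacobian-free Newton solve of the nonlinear diffusion system rather than be used as the final answer.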