Turbulent combustion modeling often faces a trade-off between so-called flamelet-like models and PDF-like models. Flamelet-like models are characterized by a limited set of prescribed moments, which are transported to represent the manifold of the composition space and its statistics. PDF-like approaches are designed to directly evaluate the closure terms associated with the nonlinear chemical source terms in the energy and species equations. They generate data on the fly, which can be used to accelerate the simulation of PDF-based models. Establishing the key ingredients for implementing acceleration schemes for PDF-like methods, by constructing flamelet-like models on the fly, can potentially yield computational savings while maintaining the ability to resolve closure terms. These ingredients are investigated in this study. The first is a data-based dimensional reduction of the composition space to a low-dimensional manifold using principal component analysis (PCA): the principal components (PCs) serve as the moments that characterize the manifold, and conditional means of the thermo-chemical scalars are evaluated in terms of these PCs. A second ingredient involves adapting a novel deep learning framework, DeepONet, to construct joint PCs’ PDFs as an alternative to the presumed shapes common in flamelet-like approaches. We also investigate whether rotating the PCs into independent components (ICs) can improve their statistical independence. The combination of these ingredients is assessed using experimental data from the Sydney turbulent nonpremixed flames with inhomogeneous inlets. The combination of constructed PDFs and conditional mean models is able to adequately reproduce unconditional statistics of the thermo-chemical scalars and to establish acceptable statistical independence between the PCs, which further simplifies the modeling of the joint PCs’ PDFs.
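To illustrate the dimensional-reduction and IC-rotation ingredients mentioned above, the following minimal Python sketch applies scikit-learn's PCA and FastICA to a placeholder composition-space matrix. The data, variable names, and retained manifold dimension are illustrative assumptions only; this is not the study's actual pipeline, nor does it include the DeepONet-based construction of the joint PCs' PDFs.

```python
# Minimal sketch (assumed setup, not the authors' implementation):
# PCA reduction of a composition-space data matrix, followed by an
# optional ICA rotation of the leading principal components.
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.preprocessing import StandardScaler

# X: (n_samples, n_scalars) matrix of thermo-chemical scalars,
# e.g. temperature and species mass fractions; placeholder data here.
rng = np.random.default_rng(0)
X = rng.random((5000, 12))

# Scale each scalar so that no single variable dominates the variance.
X_std = StandardScaler().fit_transform(X)

# Retain a small number of principal components (PCs) as the moments
# that parameterize the low-dimensional manifold (dimension assumed).
n_pcs = 2
pca = PCA(n_components=n_pcs)
pcs = pca.fit_transform(X_std)        # (n_samples, n_pcs) PC scores

# Optionally rotate the PCs into independent components (ICs) to
# improve statistical independence, simplifying joint-PDF modeling.
ica = FastICA(n_components=n_pcs, random_state=0)
ics = ica.fit_transform(pcs)

print("explained variance ratio:", pca.explained_variance_ratio_)
```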