In this paper, we propose a new approach to solve the radiative transfer equation (RTE) and determine the path loss for line-of-sight (LOS) propagation with laser diode sources in underwater wireless optical channels, which suffer severely from attenuation due to inevitable absorption and scattering. The scheme is based on an effective combination of Monte-Carlo (MC) simulation, employed for dataset generation, and a partially pruned deep neural network (PPDNN), utilized to predict the received optical power. First, a parallel MC algorithm is introduced to speed up the dataset-generation process; compared with the conventional single-step MC, it reduces the dataset-generation time by at least 95%. Second, a deep neural network (DNN) is partially pruned to obtain a compact structure and adopted to predict the path loss in three typical water types. The simulation results show that the mean square errors (MSEs) between the predicted path losses and the reference values remain below 0.2, while the sparsity of the original DNN's weights can be increased to 0.9, 0.7, and 0.5 for clear water, coastal water, and harbor water, respectively. Finally, the storage footprint of the original DNN can be compressed by at least 40% with only a small performance penalty. Consequently, the received optical power for a given set of channel parameters can be obtained almost instantly with the proposed PPDNN, which can effectively support the design of underwater wireless optical communication systems in future work.
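As a rough illustration of the parallelized dataset-generation step, the sketch below traces independent photon batches across worker processes and accumulates the weight arriving inside a receiver aperture. The absorption/scattering coefficients, link geometry, isotropic phase function, and survival-weight cutoff are all simplifying assumptions for illustration, not the paper's actual model.

```python
# Minimal sketch of parallel MC photon tracing for an LOS underwater link.
# Coefficients and geometry are illustrative assumptions, not the paper's values.
import numpy as np
from multiprocessing import Pool

A, B = 0.114, 0.037          # absorption / scattering coefficients (1/m), assumed
C = A + B                    # total attenuation coefficient (Beer-law step sampling)
LINK, APERTURE = 20.0, 0.1   # link distance and receiver radius (m), assumed

def trace_batch(args):
    """Trace one batch of photons; return total weight collected at the receiver."""
    n_photons, seed = args
    rng = np.random.default_rng(seed)
    received = 0.0
    for _ in range(n_photons):
        pos = np.zeros(3)
        direction = np.array([0.0, 0.0, 1.0])   # LOS propagation along z
        weight = 1.0
        while weight > 1e-4:                     # terminate low-weight photons
            step = -np.log(rng.random()) / C     # free-path length from Beer's law
            pos = pos + step * direction
            if pos[2] >= LINK:                   # crossed the receiver plane
                # crossing point approximated by the step endpoint for brevity
                if np.hypot(pos[0], pos[1]) <= APERTURE:
                    received += weight
                break
            weight *= B / C                      # scattering-albedo survival weighting
            # isotropic scattering (a stand-in for a realistic phase function)
            cos_t = 2.0 * rng.random() - 1.0
            phi = 2.0 * np.pi * rng.random()
            sin_t = np.sqrt(1.0 - cos_t**2)
            direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return received

if __name__ == "__main__":
    total, workers = 200_000, 8
    jobs = [(total // workers, s) for s in range(workers)]
    with Pool(workers) as pool:                  # batches run in parallel processes
        power = sum(pool.map(trace_batch, jobs)) / total
    print(f"normalized received power ~ {power:.3e}")
```

Because each photon is statistically independent, the batches require no communication, which is what makes the near-linear speedup over a single-process MC plausible.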
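On the network-compression side, a minimal sketch of magnitude-based weight pruning is given below, assuming "sparsity" denotes the fraction of weights zeroed out; the layer size is hypothetical, and the paper's actual pruning criterion may differ.

```python
# Minimal sketch of magnitude-based pruning of one weight matrix.
# The layer shape is hypothetical; sparsity = fraction of entries zeroed.
import numpy as np

def prune_weights(w, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of entries in w."""
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]  # k-th smallest magnitude
    return w * (np.abs(w) > thresh)

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 32))        # one hidden-layer weight matrix (illustrative)
for s in (0.9, 0.7, 0.5):            # the sparsity levels reported per water type
    pruned = prune_weights(w, s)
    print(f"sparsity {s}: {np.mean(pruned == 0):.2f} of weights are zero")
```

Stored in a sparse format, a matrix pruned at these levels keeps only the surviving weights and their indices, which is consistent with the reported storage compression of at least 40%.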