Due to the absorption and scattering of light propagating through water, underwater images inevitably suffer from severe degradation, such as color casts and loss of detail. Many existing deep learning-based methods have demonstrated superior performance for underwater image enhancement (UIE). However, accurate color correction and detail restoration still present considerable challenges for UIE. In this work, we develop a dual-branch fusion network, dubbed DBFNet, to eliminate the degradation of underwater images. We first design a triple-color channel separation learning branch (TCSLB), which balances the color distribution of underwater images by learning independent features for each channel of the RGB color space. Subsequently, we develop a wavelet domain learning branch (WDLB) and design a discrete wavelet transform-based attention residual dense module to fully exploit the wavelet domain information of the image and restore clear details. Finally, a dual attention-based selective fusion module (DASFM) is designed for the adaptive fusion of the latent features of the two branches, integrating both pleasing colors and diverse details. Extensive quantitative and qualitative evaluations on synthetic and real-world underwater datasets demonstrate that the proposed DBFNet significantly improves visual quality and outperforms the compared methods. Furthermore, ablation experiments demonstrate the effectiveness of each component of DBFNet.
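The wavelet domain learning branch rests on the discrete wavelet transform, which splits an image into a low-frequency approximation subband (carrying color and illumination) and high-frequency detail subbands (carrying edges and texture). As a hedged illustration of that decomposition only, not the authors' implementation, here is a minimal single-level 2-D Haar DWT and its inverse in NumPy; all function names are ours:

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2-D Haar DWT of an H x W array (H, W even).

    Returns the approximation subband LL plus the horizontal,
    vertical, and diagonal detail subbands LH, HL, HH.
    """
    a = img[0::2, 0::2]  # top-left pixel of each 2x2 block
    b = img[0::2, 1::2]  # top-right
    c = img[1::2, 0::2]  # bottom-left
    d = img[1::2, 1::2]  # bottom-right
    ll = (a + b + c + d) / 2.0  # low-frequency: colors, illumination
    lh = (a + b - c - d) / 2.0  # horizontal detail
    hl = (a - b + c - d) / 2.0  # vertical detail
    hh = (a - b - c + d) / 2.0  # diagonal detail
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Inverse Haar DWT; perfect reconstruction up to float error."""
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    out[0::2, 1::2] = (ll + lh - hl - hh) / 2.0
    out[1::2, 0::2] = (ll - lh + hl - hh) / 2.0
    out[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return out
```

Because the transform is invertible, a network can enhance the detail subbands separately and reconstruct the image without losing content, which is the property the WDLB relies on.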