Remote sensing plays an increasingly important role in acquiring ground information. However, the quality of remote-sensing images often suffers from adverse natural conditions, such as intense haze. Recently, convolutional neural networks (CNNs) have been applied to dehazing problems, and some important findings have been obtained. Unfortunately, the performance of these classical CNN-based methods still needs further improvement owing to their limited feature extraction capability. As an important branch of CNN-based approaches, the generative adversarial network (GAN), composed of a generator and a discriminator, has become a hot research topic and is considered a feasible way to solve the dehazing problem. In this study, a novel dehazing GAN is proposed to reconstruct clean images from hazy ones. In the generator network, a color and luminance feature extraction module and a high-frequency feature extraction module are designed to extract multi-scale features and color-space characteristics, which help the network acquire texture, color, and luminance information. Meanwhile, a color loss function based on the hue-saturation-value (HSV) color space is proposed to improve color recovery. In the discriminator network, a parallel structure is designed to strengthen the extraction of texture and background information. Synthetic and real hazy images are used to evaluate the proposed method. The experimental results demonstrate that it significantly improves image quality, with a clear gain in peak signal-to-noise ratio (PSNR). Compared with other popular methods, the dehazing results of the proposed method more closely resemble haze-free images.
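
The abstract names an HSV-based color loss without defining it. As a rough illustration only, the sketch below compares a dehazed output and its ground truth in HSV space with an L1 penalty; the function name, the use of matplotlib's `rgb_to_hsv`, and the L1 form are assumptions for demonstration, not the paper's actual loss definition.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv


def hsv_color_loss(dehazed_rgb: np.ndarray, clean_rgb: np.ndarray) -> float:
    """Mean absolute difference between two images in HSV space.

    Both inputs are H x W x 3 arrays with RGB values in [0, 1].
    (Hypothetical helper for illustration; the paper's loss may differ.)
    """
    # Convert both images from RGB to HSV before comparing,
    # so hue and saturation errors are penalized directly.
    hsv_pred = rgb_to_hsv(np.clip(dehazed_rgb, 0.0, 1.0))
    hsv_true = rgb_to_hsv(np.clip(clean_rgb, 0.0, 1.0))
    # L1 penalty averaged over all pixels and HSV channels.
    return float(np.mean(np.abs(hsv_pred - hsv_true)))
```

Such a term would typically be added to the generator's adversarial and reconstruction losses with a weighting coefficient, though the exact combination used in the paper is not stated in the abstract.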