Electroencephalogram (EEG)-based emotion recognition has attracted considerable attention in Brain–Computer Interface (BCI) research. However, cross-subject emotion recognition models require effectively represented EEG data that capture diverse spatial, frequency, and temporal features. To this end, brain map construction, which computes Power Spectral Density (PSD) and Differential Entropy (DE) features and applies a spatial interpolation algorithm, can enhance the quality of the frequency and spatial representations. In addition, because DEAP is a dimensional dataset, we adopt a multi-task learning method that accounts for its characteristics. We therefore propose a novel hybrid deep learning network for emotion recognition, the multi-task hybrid emotion recognition network (MTHNet), which fuses Bi-LSTM (Bidirectional Long Short-Term Memory) and an improved UNet. In the hybrid network, the improved UNet extracts spatial features, and Bi-LSTM cells then extract temporal features, memorizing changes in the sequential data in both the forward and backward directions. The proposed method is evaluated on five well-known public databases and outperforms the current state of the art.
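As a minimal sketch of the PSD and DE feature extraction mentioned above, the snippet below estimates per-band power with Welch's method and computes DE under a Gaussian assumption (DE = 0.5 ln(2πe·σ²), with band power standing in for the variance). The band limits, sampling rate, and function names are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from scipy.signal import welch

# Illustrative band limits in Hz (assumed, not taken from the paper).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_features(signal, fs=128):
    """Return (psd, de) dicts keyed by band for one EEG channel.

    psd: band power obtained by integrating the Welch PSD estimate.
    de:  differential entropy under a Gaussian assumption,
         0.5 * ln(2 * pi * e * variance), using band power as the variance.
    """
    freqs, pxx = welch(signal, fs=fs, nperseg=fs)  # 1 Hz resolution
    psd, de = {}, {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        band_power = np.trapz(pxx[mask], freqs[mask])  # integrate PSD over band
        psd[name] = band_power
        de[name] = 0.5 * np.log(2 * np.pi * np.e * band_power)
    return psd, de
```

These per-channel, per-band values would then be mapped onto 2-D electrode coordinates and interpolated to form the brain maps that the improved UNet consumes.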