Music is one of the most effective media for conveying emotion. Emotion recognition in music is the process of identifying the emotion expressed in a music clip. In this paper, a novel approach is proposed to recognize emotion by class of musical instrument using deep learning techniques. A music dataset is collected for four instrument types: string, percussion, woodwind, and brass. These instrument types are grouped into four emotions: happy, sad, neutral, and fear. Mel-frequency cepstral coefficients (MFCC), chroma energy normalized statistics (CENS), the chroma short-time Fourier transform (STFT), the spectral features spectral centroid, bandwidth, and rolloff, and the temporal feature zero-crossing rate (ZCR) are extracted from the instrumental music dataset. Recurrent neural networks (RNN) are trained on the extracted features to recognize the emotion, and their performance is compared with baseline machine learning classification algorithms. The results indicate that MFCC features with a deep RNN give better performance for instrument emotion recognition. They also show that the instrument class plays an important role in the emotion induced by the music.
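As an illustrative sketch of the described pipeline (a minimal example, assuming librosa for feature extraction and Keras for the RNN, neither of which the abstract specifies, and with illustrative layer sizes), the frame-level features could be extracted and fed to an LSTM-based classifier as follows:

```python
import numpy as np
import librosa
from tensorflow.keras import layers, models

def extract_features(path, sr=22050, n_mfcc=13):
    """Frame-level features named in the abstract: MFCC, CENS, chroma STFT,
    spectral centroid/bandwidth/rolloff, and ZCR."""
    y, _ = librosa.load(path, sr=sr)
    feats = [
        librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc),
        librosa.feature.chroma_cens(y=y, sr=sr),
        librosa.feature.chroma_stft(y=y, sr=sr),
        librosa.feature.spectral_centroid(y=y, sr=sr),
        librosa.feature.spectral_bandwidth(y=y, sr=sr),
        librosa.feature.spectral_rolloff(y=y, sr=sr),
        librosa.feature.zero_crossing_rate(y),
    ]
    # Stack into a (time_steps, feature_dim) matrix for the RNN.
    return np.vstack(feats).T

def build_rnn(feature_dim, n_classes=4):
    """LSTM classifier over the feature sequence; the four classes correspond
    to the emotions happy, sad, neutral, and fear. The exact architecture
    here is an assumption, not the one reported in the paper."""
    model = models.Sequential([
        layers.Input(shape=(None, feature_dim)),  # variable-length sequences
        layers.LSTM(64, return_sequences=True),
        layers.LSTM(64),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```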