Emotion, as part of the overall sensorimotor, introspective, and affective system, is an essential component of language comprehension within the framework of embodied semantics. Just as emotional state influences semantic and syntactic processing, processing emotional language has been shown to modulate mood. This reciprocal relationship between language and emotion has also proven informative for research on bilingualism. Here we take a relatively under-researched type of bilingual processing, simultaneous interpreting, as a case of extreme bilingualism and investigate the effect of rendering emotional language from the L1 on subjective affect and on prosodic markers of L2 output. Eighteen trainee interpreters were asked to simultaneously interpret into English three Turkish speeches that varied in emotionality, valence (negative, neutral, and positive), and difficulty. Responses to emotional language processing were analysed on the basis of participants’ self-reported positive and negative affect, measured with the Positive and Negative Affect Schedule (PANAS), and three prosodic parameters of the L2 output (intensity, pitch, and fluency). Results showed that interpreting emotionally negative speech increased negative affect, whereas interpreting emotionally positive speech did not change positive affect. Intensity generally reflected cognitive load, whereas pitch and fluency were more sensitive to changes in the valence of the source speech.
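For illustration only, the prosodic measures named above (intensity, pitch, and a fluency proxy) can be extracted from a recorded rendition with standard acoustic tooling. The sketch below uses the praat-parselmouth Python package; the file name, the arithmetic averaging of intensity frames, and the pause-based fluency proxy are illustrative assumptions, not procedures or values reported in the study.

```python
# Minimal sketch (not from the study): extract crude intensity, pitch, and
# fluency measures from one interpreter's recorded L2 output.
import numpy as np
import parselmouth  # praat-parselmouth

snd = parselmouth.Sound("interpreter_output.wav")  # hypothetical recording

# Mean intensity (dB) across analysis frames (simple arithmetic mean).
intensity = snd.to_intensity()
mean_intensity_db = float(np.mean(intensity.values))

# Mean fundamental frequency (Hz), ignoring unvoiced frames (F0 == 0).
pitch = snd.to_pitch()
f0 = pitch.selected_array["frequency"]
mean_pitch_hz = float(np.mean(f0[f0 > 0]))

# Crude fluency proxy: proportion of unvoiced (pause-like) frames.
pause_ratio = float(np.mean(f0 == 0))

print(f"intensity: {mean_intensity_db:.1f} dB, "
      f"pitch: {mean_pitch_hz:.1f} Hz, "
      f"pause ratio: {pause_ratio:.2f}")
```

Such per-rendition summaries could then be compared across the negative, neutral, and positive source-speech conditions; the study's actual fluency measure may differ from the pause-ratio proxy assumed here.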