Language and action are thought to be closely related. Comprehending words or phrases related to actions commonly activates motor and premotor areas, and this comprehension process interacts with action preparation and/or execution. However, it remains unclear whether comprehending action-related language also interacts with action observation. In the current study, we examined whether the observation of tool-use gestures is subject to interaction with language. In an electroencephalography (EEG) study (n = 20), participants were presented with video clips of an actor performing tool-use (TU, e.g., hammering with a fist) and emblematic (EM, e.g., the thumbs-up sign for 'good job') gestures accompanied by either comprehensible German (G) or incomprehensible Russian (R) sentences. Participants performed a semantic judgment task, evaluating whether the co-speech gestures were object-related or socially related. Behavioral results from the semantic task showed faster responses for TU than for EM gestures only in the German condition. In the EEG, TU gestures elicited a beta power decrease (~20 Hz) compared with EM gestures; however, this effect was reduced when the gestures were accompanied by German rather than Russian sentences. We conclude that the processing of action-related sentences may facilitate gesture observation, in the sense that the motor simulation required for TU gestures, as indexed by reduced beta power, was modulated by comprehensible German speech. Our results corroborate the functional role of beta oscillations during the perception of hand gestures and provide novel evidence concerning language-motor interaction.