Interactive activation models of lexical access assume that the presentation of a word activates not only its own lexical representation but also those of words similar in form. Because current theories are based on data from spoken and written languages, signed languages pose a special challenge for existing accounts of word recognition and lexical access: they allow us to ask which properties are genuine fundamentals of human language and which are modality-specific adaptations. The aim of the present study is to determine the electrophysiological correlates and time course of phonological processing in Spanish Sign Language (LSE). Ten deaf native LSE signers and ten deaf non-native but highly proficient LSE signers participated in the experiment. We used ERP methodology and form-based priming in a delayed lexical decision task, manipulating phonological overlap (related prime-target pairs shared either the handshape or the location parameter). Results showed that both parameters modulated brain responses to the stimuli, but in different time windows. Phonological priming of location produced a larger N400 amplitude (300–500 ms window) for signs but not for non-signs. This effect may be explained in terms of initial competition among lexical candidates, and its presence for signs but not for non-signs points to an effect at the lexical level. Handshape overlap produced a later effect (600–800 ms window): in this window, non-signs elicited a more negative-going wave in the related than in the unrelated condition in the native signer group. The findings are discussed in relation to current models of lexical access and word recognition. Finally, differences between native and non-native signers point to a less efficient use of phonological information by non-native signers.