Abstract

Recent research has reported that adding non-existent diacritical marks to a word incurs only a minimal reading cost relative to the intact word. Here we examined whether this minimal reading cost is due to: (1) the resilience of letter detectors to perceptual noise (i.e., the cost should be small and comparable for words and nonwords) or (2) top-down lexical processes that normalize the percept for words (i.e., the cost should be larger for nonwords). We designed a letter detection experiment in which a target stimulus (either a word or a nonword) was presented intact or with extra non-existent diacritics [e.g., amigo (friend) vs. ãmîgô; agimo vs. ãgîmô]. Participants had to decide which of two letters was in the stimulus (e.g., A vs. U). Although the task involved lexical processing, with responses being faster and more accurate for words than for nonwords, we found only a minimal advantage in error rates for intact stimuli over those with non-existent diacritics. This advantage was similar for words and nonwords. The letter detectors in the word recognition system thus appear to be resilient to non-existent diacritics without the need for feedback from higher levels of processing.