Abstract

Lexicography is neither a new science nor a new craft. Dictionaries have existed for hundreds of years and have been compiled to meet very practical needs (for example, those of travelers who needed lists of words in multiple languages). At the same time, dictionaries can also be seen as cultural artifacts conveying a vision of a community’s language. This explains why, in addition to being commercial objects, dictionaries have also been studied by linguists, who find in them a treasure trove of information about language in general and vocabulary in particular. A revival in lexical studies over the past twenty-five years has revolutionized the art of dictionary-making and of dictionary analysis. The advent of computer technology, which makes it possible to manipulate large amounts of textual data and to store and retrieve lexical information in novel ways, has enabled linguists and lexicographers to question assumptions that had been taken for granted for decades. Do word senses really exist, for instance, or are they simply constructs and oversimplifications that come in handy because we tend to work best with clear-cut distinctions and well-defined categories? How should we account for polysemy in dictionaries? What kinds of examples are most effective when trying to show how a word is typically used? Should lexicographers invent their own examples, or should they instead use real sentences excerpted from large bodies of running text? How should definitions be structured and written in learners’ dictionaries and in dictionaries for native speakers? Should the definiendum (the word being defined) be included in a full-sentence definition, or should such definitions be reserved for a limited number of cases? What are collocations, and how can they be identified in corpora? How should collocations be represented in monolingual and in bilingual dictionaries? How can we speed up the compilation of dictionaries and provide lexicographers with tools enabling them to sort the wheat from the chaff and to identify what is common and typical, instead of being blinded by exceptional, anomalous cases? Now that concordances and KWIC (Key Word In Context) lines are routinely made available to dictionary makers, how can we make sure that the lexicographer is not, to quote Church et al. (1994: 153), “as powerless as a person standing underneath Niagara Falls, holding a rainwater gauge while the evidence sweeps by in immeasurable torrents”? Now that the World Wide Web enables us to access billions of words in dozens of languages almost instantaneously, do we still need dictionaries to make sense of words? Do we even still need lexicographers to tap the evidence, extract lexicographically relevant facts, and distill them into dictionaries? And, if we do, how do we make sure that the dictionaries they compile really meet users’ needs? Do we even know how people use dictionaries, and what can we do to help them find their way around these resources?
