Abstract

Grammar assessment has long been a pivotal and regular component of almost all high-stakes English tests in China. In writing grammar items, however, test developers tend to depend largely on their individual knowledge of grammar, or even intuition, without much reference to the typicality or de facto uses of the language. To address this problem in the Chinese assessment context, we demonstrate the benefits of making use of language corpora in developing and validating grammar tests. The present study adopted a corpus-based approach to investigating the content validity of the grammar section of a high-stakes test, the National Matriculation English Test (Shanghai). It was found that the grammar section under study has fairly appropriate content coverage and relevance, which suggests the test content generally assesses the intended grammatical features. In addition, evidence of content significance was present through a strong linkage between the grammar items in the test and grammatical errors produced by learners who had previously taken the test. However, flaws were detected for content typicality, because certain grammar items failed to conform closely to native English speakers’ typical output. On the basis of these results, a procedure is proposed for conducting corpus-based content validation for high-stakes grammar tests.
