Abstract

Despite arriving with the required language qualifications, many international students struggle with the linguistic demands of a university degree. Using the International English Language Testing System (IELTS) as an example, this study explored how short but intensive preparation programmes may affect high-stakes English-proficiency test scores with which students apply for university places. The participants were 89 Chinese speakers of English as a foreign language in Shanghai. They were tested twice, four weeks apart, on IELTS and three other measures of English ability: the Oxford Online Placement Test, a vocabulary test, and the speed and accuracy of sentence comprehension. Between the two testing points, 45 participants underwent test-specific training consisting of previous IELTS papers, offered by a large test-preparation establishment with a network of over 1,000 training centres. The remaining 44 participants did not engage in any test preparation at the time. Teaching to the test led to a half-band rise in IELTS scores beyond the gain from test repetition alone, suggesting that the training was effective. Importantly, the IELTS gain did not generalise to the other measures of English ability; the groups performed similarly on all other language tests at both times. This suggests that test-specific, curriculum-narrowing courses could be inflating the scores with which international students apply for university places, with important consequences for test developers, universities and students.

Highlights

  • In a bid to help their clients achieve the required scores on high-stakes exams in a time-efficient way, many language schools and training centres offer dedicated test-preparation programmes. Do such programmes work? Do they reliably improve scores, and are the scores improved in this way trustworthy? In a context where many international students struggle with the linguistic demands of their programmes (Murray, 2010) and where as a group they experience lower academic success than home students (Morrison et al., 2005), the question of whether, and to what extent, the language test-preparation industry may be subverting the validity of scores is an important one to address

  • Both groups saw some gain in International English Language Testing System (IELTS) scores from Time 1 (T1) to Time 2 (T2). The coached group’s median scores rose by half a band: the Listening, Reading and Overall scores moved from 6.0 to 6.5, and the Writing and Speaking scores moved from 5.5 to 6.0 (Table 3)

  • Using IELTS as an example, our findings indicate that short but intensive curriculum-narrowing courses can reliably improve scores in high-stakes language-proficiency tests, but that such test-specific gains do not readily generalise to other measures of English ability


Introduction

Because the capacity to acquire new knowledge depends on proficiency in the language of instruction (Elder et al., 2007; Daller & Phelan, 2013; Trenkic & Warmington, 2019), most universities require international students to demonstrate their linguistic readiness to study on one of the approved language-proficiency tests. In a context where many international students struggle with the linguistic demands of their programmes (Murray, 2010) and where as a group they experience lower academic success than home students (Morrison et al., 2005), the question of whether, and to what extent, the language test-preparation industry may be subverting the validity of scores is an important one to address. We explore this question in the context of China, a country which currently sends the largest number of students abroad and has a well-established test-preparation industry

