Abstract

To date, the role of gender in speaking tests has received limited attention in language testing research. It is possible in oral interviews, for instance, that both interviewing and rating may be highly gendered processes. In tests like the IELTS interview, where the interviewer also acts as the rater, this poses the question of whether a gender effect, if it exists, stems from the interview itself, the rating decision or a combination of both these ‘events’. The data collected for this study consisted of the audio-taped performances of 8 female and 8 male test-takers who undertook a practice IELTS interview on two different occasions, once with a female interviewer and once with a male interviewer. The interviews were transcribed and analysed in relation to previously identified features of gendered language use, namely overlaps, interruptions and minimal responses. The scores later assigned by 4 raters (2 males and 2 females) to each of the 32 interviews were also examined in relation to the gender of both raters and test-takers using multi-faceted Rasch bias analyses. The results from both the discourse and test score analyses indicated that gender did not have a significant impact on the IELTS interview. These findings are interpreted in relation to more recent thinking about gender in language use.
