In this exploratory mixed‐methods study, we introduce and test our AI‐powered vocabulary learning system, ARCHe, which embeds four AI functions: (1) automatic feedback for pronunciation, (2) automatic feedback for handwriting, (3) automatic scoring of student‐generated sentences and (4) automatic recommendations. Our study of 140 students taught by six teachers in three primary schools in Singapore explores the links between these AI functions and students' learning engagement and outcomes through analysis of pre‐ and post‐tests, post‐surveys, focus group discussions and artefacts created in ARCHe. Results show improved Chinese character and vocabulary test scores after using ARCHe. Students' perceptions of ARCHe's automatic recommendations and pronunciation feedback positively influenced their emotional engagement, and students who perceived the automatic recommendations and handwriting feedback more favourably than others reported greater cognitive engagement. Meanwhile, students whose groups created more sentences during classroom‐based collaborative learning were more likely to show learning gains. This study provides insights for learning designers and educators on AI's potential in language learning, with recommendations for future research directions.

Practitioner notes

What is already known about this topic
- AI‐enabled automatic feedback or recommendations might improve students' learning engagement, scaffold their learning processes and enhance their learning outcomes.
- Students' perceived usefulness of a mobile learning system positively influences their learning engagement.
- Leveraging AI technology and adopting innovative feedback approaches can improve mobile language learning experiences for students with varying needs and preferences.

What this paper adds
- This study introduced and tested a self‐designed AI‐powered vocabulary learning system for young students, ARCHe, which embeds four AI functions (feedback on both pronunciation and handwriting, scoring of sentences and recommendations).
- Students who perceived ARCHe's pronunciation feedback or recommendations as more useful than others showed greater emotional engagement.
- Students who viewed ARCHe's handwriting feedback as more useful than others wrote sentences of greater complexity during group activities in class; by contrast, students who viewed the recommendations as more useful wrote shorter sentences.
- Students in groups that wrote more sentences during class activities were more likely to show learning gains, whereas home‐based individual activities showed no significant effects.

Implications for practice and/or policy
- This study contributes to the existing body of knowledge in AI‐enhanced language learning by showcasing how AI can empower mobile‐based vocabulary learning for young students.
- The study sheds light on the specific AI functions that affect language learning engagement.
- The findings offer specific recommendations for classroom instruction and AI system upgrades, and provide insights for the development of online language learning with AI.