Abstract

The outbreak of the COVID-19 pandemic has transformed the educational landscape in unprecedented ways, with educational institutions worldwide navigating between offline and online learning. Computer-based testing is rapidly replacing paper-and-pencil testing as the dominant mode of assessment. In some settings, computer-based and paper-and-pencil assessments are offered side by side, in which case test developers should provide evidence of equivalence between the two versions. This study aims to establish such equivalence evidence for the delivery modes of the English Competency Test, an English language assessment for civil service officers developed and used by the Human Resources Development Education and Training Center, a civil service training institution under the Ministry of Finance of the Republic of Indonesia. Psychometric analyses were carried out with the Rasch model to estimate unidimensionality, reliability, separation, and standard error of measurement. The findings demonstrate that the paper-and-pencil and computer-based versions of the assessment exhibit comparable psychometric properties, indicating that the computer-based version of the English Competency Test offers a reliable alternative to the paper-and-pencil version.
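For reference, the dichotomous Rasch model mentioned above models the probability that person n answers item i correctly as a function of the person's ability theta_n and the item's difficulty delta_i. The abstract does not specify the exact model variant or estimation software used in the study, so the following is the standard formulation rather than a description of the study's implementation:

P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{e^{\theta_n - \delta_i}}{1 + e^{\theta_n - \delta_i}}

Under this model, the reliability and separation indices reported for persons and items are derived from the spread of the estimated \theta_n and \delta_i values relative to their standard errors of measurement.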
