Abstract

Item analysis is a process in which both students' answers and the test questions are evaluated in order to determine the quality of individual items and of the test as a whole, supporting standardized and objective evaluation of student performance. Evaluation is needed to determine how much participants' learning outcomes have changed from their initial abilities to their abilities after completing the educational process. This research examines item quality from a quantitative standpoint. The aims of this study are to determine the difficulty index, item discrimination, distractor effectiveness, and reliability of the final semester Chemistry test for Class X MIPA at SMAN 1 Wonosegoro, Boyolali Regency. The research was carried out at SMAN 1 Wonosegoro, Boyolali Regency, using a quantitative descriptive technique. The population of this study consists of the response data from all 212 students in class X MIPA Chemistry across the 2019/2020, 2018/2019, and 2017/2018 academic years. Data were collected using documentation techniques and analyzed quantitatively with ANATES version 4.0.9. According to the findings, in three consecutive academic years the difficulty index was in the medium category, which is good because the items are neither too difficult nor too easy. Item discrimination is acceptable, falling into the sufficient, good, and excellent categories. The distractors functioned effectively, and the reliability value was sufficient at 0.45 in the 2017/2018 academic year but high at 0.62 and 0.78 in the 2018/2019 and 2019/2020 academic years, respectively. The findings of this study indicate that item analysis is a crucial process in constructing tests, because the quality of the test affects the accuracy of students' scores.
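
For orientation only, the sketch below shows how the classical item statistics named above (difficulty index, discrimination index using upper/lower score groups, and a KR-20 reliability coefficient) are commonly computed from a 0/1 response matrix. It is a minimal illustration under assumed data and a 27% group split, not the ANATES 4.0.9 procedure or the study's actual dataset.

```python
import numpy as np

def item_analysis(scores: np.ndarray, group_frac: float = 0.27):
    """scores: (n_students, n_items) matrix of 0/1 item responses (illustrative)."""
    n_students, n_items = scores.shape
    totals = scores.sum(axis=1)

    # Difficulty index P: proportion of students answering each item correctly.
    difficulty = scores.mean(axis=0)

    # Discrimination index D: difference in proportion correct between the
    # top and bottom group_frac of students ranked by total score.
    k = max(1, int(round(group_frac * n_students)))
    order = np.argsort(totals)
    lower, upper = scores[order[:k]], scores[order[-k:]]
    discrimination = upper.mean(axis=0) - lower.mean(axis=0)

    # KR-20 reliability for dichotomously scored items.
    p, q = difficulty, 1.0 - difficulty
    var_total = totals.var(ddof=1)
    kr20 = (n_items / (n_items - 1)) * (1.0 - (p * q).sum() / var_total)

    return difficulty, discrimination, kr20

# Purely illustrative example with a small random response matrix.
rng = np.random.default_rng(0)
demo = (rng.random((40, 30)) > 0.4).astype(int)
P, D, r = item_analysis(demo)
print(P.round(2), D.round(2), round(r, 2))
```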
