Students' problem-solving abilities vary widely, ranging from proficiency to difficulty or even inability to solve problems. This study employed a descriptive content-analysis design to examine official documents containing practice exam questions. Data were collected through documentation study and analyzed using assessment sheets and rubrics validated by experts. The descriptive method guided the analysis and interpretation of the data, while quantitative analysis was used to process the research data. The study focused on mathematics practice exam scores and revealed several problematic questions. Of the 25 questions, 11 were acceptable, 8 required correction, and 6 were rejected and needed replacement. The questions requiring correction or replacement had a low discriminating power index and therefore failed to gauge students' proficiency in the practice exams effectively. Moreover, most of the practice test questions were considerably difficult for students: of the 25 questions, 1 was very difficult, 10 were difficult, 10 were moderate, and only 4 were easy. Consequently, the findings underscore the importance of revising practice exam questions so that they accurately assess students' abilities and accommodate a diverse range of skill levels.
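The difficulty and discriminating power measures mentioned above are conventionally computed with classical test theory item analysis. The sketch below is an illustration only: the study's exact formulas and classification cutoffs are not stated in the abstract, so the proportion-based difficulty index, the upper/lower 27% discrimination index, and all threshold values are assumptions.

```python
# Sketch of classical test theory item analysis (assumed method; the
# study's exact formulas and cutoffs are not given in the abstract).

def item_difficulty(responses):
    """Difficulty index P: proportion of correct responses (1/0 coded)."""
    return sum(responses) / len(responses)

def classify_difficulty(p):
    """Assumed cutoffs for the four difficulty categories in the study."""
    if p > 0.70:
        return "easy"
    if p >= 0.30:
        return "moderate"
    if p >= 0.10:
        return "difficult"
    return "very difficult"

def item_discrimination(item_col, total_scores, group_frac=0.27):
    """D = proportion correct in the top-scoring group minus the bottom group."""
    n = len(total_scores)
    k = max(1, int(n * group_frac))
    order = sorted(range(n), key=lambda i: total_scores[i])
    lower, upper = order[:k], order[-k:]
    p_upper = sum(item_col[i] for i in upper) / k
    p_lower = sum(item_col[i] for i in lower) / k
    return p_upper - p_lower

def classify_discrimination(d):
    """Assumed cutoffs mapping D to the accept/correct/reject decisions."""
    if d >= 0.30:
        return "acceptable"
    if d >= 0.20:
        return "needs correction"
    return "rejected"
```

Under this scheme, an item answered correctly mostly by high scorers gets a high D and is kept, while an item that high and low scorers answer alike gets a low D and is flagged for correction or replacement, mirroring the 11/8/6 split reported above.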