Abstract

Automatic Programming Assessment (APA) has recently become a significant method for assisting educators of programming courses in marking and grading students' programs automatically; its manual counterpart is prone to errors and leads to inconsistency. Test data generation plays an important role in performing dynamic testing of students' programs. In the software testing field, diverse automated methods for test data generation exist; unfortunately, APA seldom adopts them. Only a limited number of studies have attempted to integrate APA with test data generation to include more useful features and to provide precise and thorough program testing. Thus, we propose a test data generation framework, called FaSt-Gen, that covers both the functional and structural testing of a program for APA. It aims to help lecturers of programming courses furnish an adequate set of test data for assessing students' programming solutions without requiring specific expertise in test case design. FaSt-Gen integrates positive and negative testing criteria (i.e., reliable and valid test adequacy criteria) to derive the desired test data and a test set schema. The findings of the conducted experiment show that FaSt-Gen improves the reliability and validity of test data adequacy in programming assessment.
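The positive/negative (valid/reliable) test adequacy idea described above can be illustrated with a minimal sketch. Everything here is hypothetical and not taken from FaSt-Gen itself: `reference_grade`, `student_grade`, the boundary-oriented test sets, and the `assess` scorer are invented names used only to show how a mixed test set schema might exercise a student submission against a lecturer's model answer.

```python
def reference_grade(score):
    """Lecturer's model answer: map a 0-100 score to a letter grade."""
    if not isinstance(score, (int, float)) or not 0 <= score <= 100:
        raise ValueError("score must be a number in [0, 100]")
    if score >= 70:
        return "A"
    if score >= 50:
        return "B"
    return "C"

def student_grade(score):
    """A sample student submission (note: it forgets input validation)."""
    if score >= 70:
        return "A"
    if score >= 50:
        return "B"
    return "C"

# Hypothetical test set schema: positive cases check required behaviour
# on valid inputs (validity); negative cases probe whether invalid
# inputs are rejected (reliability).
POSITIVE = [0, 49, 50, 69, 70, 100]   # boundary-oriented valid inputs
NEGATIVE = [-1, 101, "ninety"]        # out-of-range / wrong-type inputs

def assess(submission):
    """Return (passed, total) over the combined positive/negative set."""
    passed = 0
    for x in POSITIVE:
        try:
            if submission(x) == reference_grade(x):
                passed += 1
        except Exception:
            pass  # crashing on valid input counts as a failure
    for x in NEGATIVE:
        try:
            submission(x)  # should raise, as the reference does
        except Exception:
            passed += 1
    return passed, len(POSITIVE) + len(NEGATIVE)

print(assess(student_grade))  # the sample submission misses two negative cases
```

In this sketch the student submission passes every positive case but only one negative case (the type error), which is the kind of gap a purely functional, positive-only test set would not expose.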
