Abstract

The purpose of this study was to determine if there were differences in postsecondary marketing student performance on essay tests based on test format (i.e., computer-based or handwritten). Specifically, the variables of performance, test completion time, and gender were explored for differences based on essay test format. Results of the study indicate that there was no significant difference in postsecondary marketing student scores based on test format. There was, however, a significant difference in test completion time based on essay format. Postsecondary marketing students completed the computer-based essay test significantly faster than they did the handwritten essay test. Implications for postsecondary marketing educators are also discussed.

Highlights

  • The use of computer-based tests for assessing students has a long, established history

  • Research question one sought to determine if there was a significant difference in postsecondary marketing student scores on an essay test based on format

  • A post hoc ANOVA for differences in test scores, F(1, 127) = 0.676, p = 0.413, indicated that there was not a significant difference in postsecondary marketing student test scores based on format
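To illustrate the kind of analysis reported in the highlight above: a one-way ANOVA compares between-group variance to within-group variance to produce an F statistic. The sketch below uses invented scores for two test-format groups (the study's actual data are not reproduced here) and computes F with only the Python standard library.

```python
import statistics

def one_way_anova(*groups):
    """Return (F statistic, df_between, df_within) for a one-way ANOVA.

    Illustrative only; the sample scores below are synthetic, not the
    study's data.
    """
    all_vals = [x for g in groups for x in g]
    grand_mean = statistics.fmean(all_vals)
    k, n = len(groups), len(all_vals)
    # Between-group sum of squares: group size times squared deviation
    # of each group mean from the grand mean.
    ss_between = sum(
        len(g) * (statistics.fmean(g) - grand_mean) ** 2 for g in groups
    )
    # Within-group sum of squares: squared deviations of each score
    # from its own group mean.
    ss_within = sum(
        (x - statistics.fmean(g)) ** 2 for g in groups for x in g
    )
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical scores for computer-based vs. handwritten essay tests.
computer = [78, 85, 82, 90, 74, 88, 81, 79]
handwritten = [80, 83, 77, 86, 75, 84, 82, 78]

f_stat, df_b, df_w = one_way_anova(computer, handwritten)
print(f"F({df_b}, {df_w}) = {f_stat:.3f}")
```

A large p-value associated with the resulting F (as in the study's p = 0.413) would indicate no significant score difference between formats at the conventional alpha = 0.05 level.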


Introduction

The use of computer-based tests for assessing students has a long, established history. The use of these tests is likely influenced by many reported advantages. Goldberg and Pedulla (2002) supported this idea when they stated that “Moves toward computerized testing stem from the advantages it offers over the traditional paper-and-pencil format.” Despite the many advantages of using computer-based tests, there is a concern in the literature regarding student performance equivalence when compared with traditional paper-and-pencil tests. To further compound these mixed results, “Most of the literature regarding computer-based testing has focused on student performance on computer-based tests with objective type questions” (Truell & Davis, 2003, p. 29). Lee (2002) supported this contention by stating “. . . there has been a growing interest in the equivalence of computerized and paper-and-pencil multiple choice tests; little attention has been paid to open-ended tests such as writing assessments” (p. 136).

