Abstract

The major objective of the study was to gain more information about the reasoning skills tapped by the GRE analytical measure by examining how performance on its constituent item types relates to alternative criteria. A second objective was to ascertain the extent to which additional information on examinees' analytical skills might be obtained from further analyses of their writing performance.

The data base for this study consisted of 406 writing samples prepared by 203 students who had recently taken the GRE General Test for admission to institutions of higher education in the United States. The bulk of these data were collected for research funded by the GRE Board and the TOEFL Policy Council, in which the writing samples were scored to reflect writing skills; these scores were related to scores on the GRE General Test and the TOEFL, as well as other measures. To supplement the sample of native speakers, additional writing samples were collected from 77 native speakers of English and 3 native speakers of Arabic who had recently taken the GRE General Test and who were in their first year of graduate education in the United States. The final sample included subsamples of nonnative speakers of English (6 Arabic, 73 Chinese, 35 Spanish) and a subsample of 89 native speakers of English.

The objectives of this study were accomplished by developing several scoring methods that focused on the reasoning skills reflected in these papers. These scores, in addition to the scores for writing skills, were related to item-type subscores derived from the verbal and analytical reasoning sections of the GRE General Test in order to determine whether these item types relate differently to judgments of examinees' thinking and writing skills.

Three scoring methods did not appear to provide additional information beyond what is obtained from the analytical reasoning and verbal sections of the GRE General Test. The Moss scheme, however, yielded scores that were relatively independent of these sections of the GRE. It is possible that these scores tapped verbal reasoning skills that are not assessed by the GRE General Test, but further research is needed to determine whether they represent important developed abilities. Writer's Workbench computerized text analyses suggested that the different writing tasks elicited different kinds of writing performance, and that the writing performance of students representing different native language groups may vary in complex ways in response to these tasks.
