Generalizability of writing scores: an application of structural equation modeling
| Authors | |
|---|---|
| Publication date | 2005 |
| Journal | Language Testing |
| Volume | 22 |
| Issue number | 1 |
| Pages (from-to) | 1-30 |
| Number of pages | 30 |
| Organisations | |
| Abstract | The assessment of writing ability is notoriously difficult. Different facets of the assessment seem to influence its outcome. Besides the writer’s writing proficiency, the topic of the assignment, the features or traits scored (e.g., content or language use) and even the way in which these traits are scored (e.g., holistically or analytically) may contribute to the writer’s score. In this article, the effect of these facets is estimated in a generalizability study using variance-analytic techniques. A structural equation modeling (SEM) approach is used to estimate the variance components in the writing scores. Eighty-nine grade 6 students (aged 11-12 years) wrote four essays, each of which was scored by five raters using two scoring methods (i.e., holistically and analytically) for two traits (i.e., Content and Organization, and Language Use). Analyses of these ratings showed that the generalizability of writing scores and the effects of raters and topics are very much dependent on the way the essays are scored and the trait that is scored. The overall picture is that writing tasks contribute more to the score variance than raters do. |
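The variance-component logic behind a generalizability study can be illustrated with a minimal sketch. The article's full design crosses persons, tasks, raters, scoring methods, and traits and estimates components via SEM; the sketch below instead uses classical ANOVA expected-mean-squares estimators for a simplified persons × raters crossed design. The function name and the toy data are hypothetical, not from the article.

```python
import numpy as np

def g_study_p_x_r(scores):
    """Variance components for a persons x raters crossed (p x r) design.

    ANOVA expected-mean-squares estimators:
      sigma2_p  = (MS_p - MS_res) / n_r      (person, i.e., true-score variance)
      sigma2_r  = (MS_r - MS_res) / n_p      (rater severity)
      sigma2_pr = MS_res                     (person-by-rater interaction,
                                              confounded with residual error)
    Returns (sigma2_p, sigma2_r, sigma2_pr, G), where G is the relative
    generalizability coefficient for a mean over n_r raters.
    """
    scores = np.asarray(scores, dtype=float)
    n_p, n_r = scores.shape
    grand = scores.mean()
    p_means = scores.mean(axis=1)
    r_means = scores.mean(axis=0)

    # Sums of squares for the two main effects and the residual.
    ss_p = n_r * ((p_means - grand) ** 2).sum()
    ss_r = n_p * ((r_means - grand) ** 2).sum()
    ss_res = ((scores - grand) ** 2).sum() - ss_p - ss_r

    ms_p = ss_p / (n_p - 1)
    ms_r = ss_r / (n_r - 1)
    ms_res = ss_res / ((n_p - 1) * (n_r - 1))

    # Negative estimates are conventionally truncated at zero.
    var_p = max((ms_p - ms_res) / n_r, 0.0)
    var_r = max((ms_r - ms_res) / n_p, 0.0)
    var_pr = ms_res

    # Relative error for a mean over n_r raters is var_pr / n_r.
    g = var_p / (var_p + var_pr / n_r)
    return var_p, var_r, var_pr, g

# Toy data: 3 persons rated by 2 raters, additive effects (no interaction).
vp, vr, vpr, g = g_study_p_x_r([[0, 1], [1, 2], [2, 3]])
```

On these additive toy scores the residual component is zero, so the G coefficient is 1.0; with real ratings, interaction and error shrink it, which is exactly the dependence on raters and tasks the abstract describes.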
| Document type | Article |
| Language | English |
| Published at | https://doi.org/10.1191/0265532205lt295oa |
| Permalink to this page | |