Holistic Discourse Coherence Annotation for Noisy Essay Writing

Jill Burstein, Joel Tetreault, Martin Chodorow


This paper reviews annotation schemes used for labeling discourse coherence in well-formed and noisy (essay) data, and it describes a system that we have developed for automated holistic scoring of essay coherence. We review previous related work on unsupervised computational approaches to evaluating discourse coherence, and we present a taxonomy of discourse coherence schemes classified by their goals and the types of data to which they apply. We illustrate how a holistic approach can be used successfully to build systems for noisy essay data across domains and populations. We discuss model features tied to human scoring guide criteria for essay scoring, and the importance of using features relevant to these criteria when generating meaningful scores and feedback for students and test-takers. To demonstrate the effectiveness of a holistic annotation scheme, we present results of system evaluations.

www.dialogue-and-discourse.org · ISSN: 2152-9620 · Journal DOI: 10.5087/dad