2005 Presentations (Communicative Events)
Do Summaries Help? A Task-Based Evaluation of Multi-Document Summarization
We describe a task-based evaluation to determine whether multi-document summaries measurably improve user performance when online news browsing systems are used for directed research. We evaluated the multi-document summaries generated by Newsblaster, a robust news browsing system that clusters online news articles and summarizes multiple articles on each event. Four groups of subjects were asked to perform the same time-restricted fact-gathering tasks, reading news under different conditions: no summaries at all, single-sentence summaries drawn from one of the articles, Newsblaster multi-document summaries, and human-written summaries. Our results show that, compared with source documents only, the quality of reports assembled using Newsblaster summaries was significantly better, and user satisfaction was higher with both Newsblaster and human summaries.
Files
- mckeown_al_05b.pdf (application/pdf, 1.81 MB)
More About This Work
- Academic Units: Computer Science
- Publisher: Proceedings of ICASSP, Special Session on Human Language Technology Applications and Challenges for Speech Processing, 2005
- Published Here: June 12, 2013