Evaluation of the DEFINDER System for Fully Automatic Glossary Construction
In this paper we present a quantitative and qualitative evaluation of DEFINDER, a rule-based system that mines consumer-oriented full-text articles in order to extract definitions and the terms they define. In the quantitative evaluation, measured against human performance, DEFINDER obtained 87% precision and 75% recall. Our basis for comparison is definitions from on-line dictionaries, including the UMLS Metathesaurus; this comparison reveals the incompleteness of existing resources and the ability of DEFINDER to address these gaps. The qualitative evaluation shows that the definitions extracted by our system are ranked higher on user-centered criteria of usability and readability than definitions from on-line specialized dictionaries. The output of DEFINDER can therefore be used to enhance these dictionaries, and it is being incorporated into a system that clarifies technical terms for non-specialist users in understandable, non-technical language.
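The precision and recall figures above are measured against human annotation of the same articles. As a minimal illustration of that kind of evaluation (this is not the authors' evaluation code, and the term sets are hypothetical placeholders), precision is the fraction of system-extracted definitions judged correct by the human annotator, and recall is the fraction of human-identified definitions the system recovered:

```python
# Illustrative sketch: precision/recall of extracted definitions against a
# human gold standard. Term sets below are hypothetical, not from the paper.

def precision_recall(extracted, gold):
    """precision = |extracted & gold| / |extracted|; recall = |extracted & gold| / |gold|."""
    true_positives = len(extracted & gold)
    precision = true_positives / len(extracted) if extracted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

if __name__ == "__main__":
    # Terms whose definitions the system extracted vs. terms a human
    # annotator marked as defined in the same consumer-health articles.
    system_terms = {"angioplasty", "stent", "arrhythmia", "catheter"}
    human_terms = {"angioplasty", "stent", "arrhythmia", "biopsy", "catheter", "lesion"}
    p, r = precision_recall(system_terms, human_terms)
    print(f"precision={p:.2f} recall={r:.2f}")  # precision=1.00 recall=0.67
```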
Files
- klavans_muresan_01a.pdf (application/pdf, 42.6 KB)
More About This Work
- Academic Units: Computer Science
- Publisher: Proceedings of AMIA Symposium 2001
- Published Here: May 3, 2013