Competitively Evolving Decision Trees Against Fixed Training Cases for Natural Language Processing
Competitive fitness functions can generate performance superior to absolute fitness functions [Angeline and Pollack 1993], [Hillis 1992]. This chapter describes a method by which competition can be implemented when training over a fixed (static) set of examples. Since new training cases cannot be generated by mutation or crossover, the probabilistic frequencies by which individual training cases are selected adapt competitively. We evolve decision trees for the problem of word sense disambiguation. The decision trees contain embedded bit strings; bit string crossover is intermingled with subtree-swapping. To address the problem of overlearning, we have implemented a fitness penalty function specialized for decision trees that depends on the partition of the set of training cases induced by a decision tree.
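The abstract's central idea, adapting the selection frequencies of a fixed training set rather than the cases themselves, can be illustrated with a minimal sketch. The code below is not the chapter's algorithm; the function names, the weight-update rule, and the error-rate statistics are all hypothetical assumptions used only to show how "hard" cases could come to be sampled more often as the population of decision trees evolves.

```python
import random

def update_case_weights(weights, error_rates, learning_rate=0.1):
    """Nudge each case's selection weight toward its population error rate.

    error_rates[i] is assumed to be the fraction of decision trees in the
    current population that misclassify training case i, so cases the
    population handles poorly gain weight (a competitive pressure).
    """
    return [
        (1 - learning_rate) * w + learning_rate * e
        for w, e in zip(weights, error_rates)
    ]

def sample_training_cases(cases, weights, k):
    """Draw k fixed training cases for fitness evaluation, in proportion
    to their current competitive weights."""
    return random.choices(cases, weights=weights, k=k)

if __name__ == "__main__":
    cases = [f"case_{i}" for i in range(10)]   # fixed (static) training set
    weights = [1.0] * len(cases)               # start with uniform sampling
    # Illustrative per-case error rates from one hypothetical generation.
    error_rates = [0.1, 0.9, 0.2, 0.8, 0.1, 0.5, 0.3, 0.7, 0.05, 0.6]
    weights = update_case_weights(weights, error_rates)
    print(sample_training_cases(cases, weights, k=5))
```

Under these assumptions, frequently misclassified cases accumulate weight across generations, giving an effect analogous to a coevolving opponent even though the training set itself never changes.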
Files
- siegel_94.pdf (application/pdf, 86.5 KB)
Also Published In
- Title: Advances in Genetic Programming
- Publisher: The MIT Press
More About This Work
- Academic Units: Computer Science
- Published Here: April 26, 2013