Integrated analytical and empirical learning of approximations for intractable theories
A powerful new technique for learning to solve intractable problems is presented in this dissertation. Although computational theories can be formulated for many problem-solving tasks, such theories are often intractable because they require excessive computational resources. This research investigates a strategy of sacrificing the correctness of theories in exchange for tractability. Initially intractable theories are approximated by adopting generic simplifying assumptions. For instance, one generic assumption asserts that the value of a function is the same for all arguments. Another asserts that a random variable is equally likely to manifest any of its legal values. Although such assumptions are not strictly true, they can greatly simplify otherwise intractable computations, and they often result in approximate theories with acceptable levels of accuracy and efficiency.

A program called "POLLYANNA" has been developed to investigate this approach to the intractable theory problem, using the card game Hearts and a job-scheduling problem as testbed domains. The program combines analytic and empirical learning methods in a generate-and-test framework. During the analytic phase, candidate approximate theories are generated by systematically applying generic simplifying assumptions to an initially intractable theory. Analytic generation results show that many candidates are semantically equivalent to informally stated heuristic decision rules, including the Dump High Rank rule in the Hearts domain and the Soonest Deadline First scheduling rule in the job-scheduling domain. During the empirical phase, candidates are tested against teacher-provided training examples. Measurements of accuracy and efficiency are used to guide a search for approximate theories meeting specified accuracy and efficiency goals.

Empirical testing results show that the candidates manifest a gradual tradeoff between accuracy and efficiency. Some approximate theories are efficient but highly prone to errors, while others are more accurate but computationally expensive. Depending on the context in which a computational theory is used, different combinations of accuracy and efficiency are appropriate. This research thus demonstrates a flexible approach to approximate theory formation.
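The generate-and-test framework described in the abstract can be pictured, very roughly, as the following Python sketch. All names (ApproximateTheory, generate_candidates, test_candidates) and the crude wall-clock efficiency measure are illustrative assumptions, not POLLYANNA's actual implementation; the sketch only shows the two phases the abstract names: analytic generation of candidates by composing simplifying assumptions, and empirical filtering against teacher-provided examples under accuracy and efficiency goals.

```python
# Hypothetical sketch of a generate-and-test search over approximate theories.
# Not POLLYANNA's code; names and measures are invented for illustration.
import time
from dataclasses import dataclass
from itertools import combinations
from typing import Callable, List, Tuple


@dataclass
class ApproximateTheory:
    assumptions: Tuple[str, ...]           # names of the simplifying assumptions applied
    decide: Callable[[object], object]     # the (approximate) decision procedure


def generate_candidates(base_decide, assumptions):
    """Analytic phase: apply each subset of generic simplifying assumptions
    (each modeled here as a function that wraps and simplifies a decision
    procedure) to the initially intractable theory."""
    for r in range(len(assumptions) + 1):
        for subset in combinations(assumptions, r):
            decide = base_decide
            for apply_assumption in subset:
                decide = apply_assumption(decide)
            yield ApproximateTheory(tuple(a.__name__ for a in subset), decide)


def test_candidates(candidates, examples, accuracy_goal, time_budget):
    """Empirical phase: measure accuracy and (crudely) efficiency on
    teacher-provided examples, keeping theories that meet both goals."""
    accepted: List[Tuple[ApproximateTheory, float, float]] = []
    for theory in candidates:
        start = time.perf_counter()
        correct = sum(theory.decide(x) == y for x, y in examples)
        elapsed = time.perf_counter() - start
        accuracy = correct / len(examples)
        if accuracy >= accuracy_goal and elapsed <= time_budget:
            accepted.append((theory, accuracy, elapsed))
    return accepted
```

Raising the accuracy goal or tightening the time budget selects different points along the accuracy/efficiency tradeoff the abstract describes, which is the sense in which the approach is "flexible."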
Files
- cucs-511-89.pdf (application/pdf, 405 KB)
More About This Work
- Academic Units: Computer Science
- Publisher: Department of Computer Science, Columbia University
- Series: Columbia University Computer Science Technical Reports, CUCS-511-89
- Published Here: April 26, 2011