2002 Reports

# An Overview of Information-Based Complexity

Computational complexity has two goals: finding the inherent cost of some problem, and finding optimal algorithms for solving this problem. Information-based complexity (IBC) studies the complexity of problems for which the available information is partial, contaminated by error, and priced. Examples of such problems include integration, approximation, ordinary and partial differential equations, integral equations, and the solution of nonlinear problems such as root-finding and optimization. In this talk, we give a brief overview of IBC. We focus mainly on the integration problem (which is a simple, yet important, problem that can be used to illustrate the main ideas of IBC) and the approximation problem (which will be of most interest to specialists in learning theory). One important issue that we discuss is the "curse of dimension" -- the depressing fact that the worst case complexity of many problems depends exponentially on dimension, rendering them intractable. We explore IBC-based techniques for vanquishing the curse of dimension. In particular, we find that randomization beats intractability for the integration problem but not for the approximation problem; on the other hand, both these problems are tractable in the average case setting under a Wiener sheet measure.
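The claim that randomization beats intractability for integration rests on the classical Monte Carlo error rate, which is O(n^{-1/2}) in the number of samples regardless of the dimension d. The sketch below (a minimal illustration, not from the report; the integrand and function names are chosen here for demonstration) estimates an integral over the unit cube in d = 10 dimensions, where a worst-case product grid rule would already need a number of points exponential in d.

```python
import random

def mc_integrate(f, d, n, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [0,1]^d.

    The expected error is O(n^{-1/2}) independent of d; this dimension
    independence is how randomization sidesteps the worst-case curse of
    dimension for the integration problem.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(d)]  # uniform sample in the cube
        total += f(x)
    return total / n

# Hypothetical test integrand: f(x) = x_1 * x_2 * ... * x_d, whose exact
# integral over the unit cube is 2^(-d).
d = 10

def f(x):
    p = 1.0
    for xi in x:
        p *= xi
    return p

estimate = mc_integrate(f, d, n=200_000)
exact = 2.0 ** (-d)
```

Note that the same sample budget n controls the error for any d; only the cost of evaluating f grows with the dimension. This is exactly the contrast the abstract draws with the approximation problem, where randomization does not help.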

## Files

- cucs-022-02.pdf (application/pdf, 107 KB)

## More About This Work

- Academic Units: Computer Science
- Publisher: Department of Computer Science, Columbia University
- Series: Columbia University Computer Science Technical Reports, CUCS-022-02
- Published Here: April 21, 2011