1 - Overview
Published online by Cambridge University Press: 13 January 2010
Summary
In the process of doing scientific computations we always rely on some information. In practice, this information is typically noisy, i.e., contaminated by error. Sources of noise include
previous computations,
inexact measurements,
transmission errors,
arithmetic limitations,
an adversary's lies.
Problems with noisy information have always attracted considerable attention from researchers in many different fields: statisticians, engineers, control theorists, economists, and applied mathematicians. There is also a vast literature, especially in statistics, where noisy information is analyzed from different perspectives.
In this monograph, noisy information is studied in the context of the computational complexity of solving mathematical problems.
Computational complexity focuses on the intrinsic difficulty of problems as measured by the minimal amount of time, memory, or elementary operations necessary to solve them. Information-based complexity (IBC) is a branch of computational complexity that deals with problems for which the available information is
partial,
noisy,
priced.
Information being partial means that the problem is not uniquely determined by the given information. Information is noisy since it may be contaminated by error. Information is priced since we must pay for getting it. These assumptions distinguish IBC from combinatorial complexity, where information is complete, exact, and free.
Since information about the problem is partial and noisy, only approximate solutions are possible. Approximations are obtained by algorithms that use this information.
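As a minimal illustrative sketch (not taken from the monograph), consider approximating the integral of a function over [0, 1]. The information consists of finitely many function values, so it is partial (infinitely many functions share those values), and each value is observed with error, so it is noisy. The function name, noise model, and parameters below are all hypothetical choices made for illustration.

```python
import random

def noisy_midpoint_integral(f, n, noise_level, seed=0):
    """Approximate the integral of f over [0, 1] from n noisy samples.

    Partial information: only n function values are seen.
    Noisy information: each value f(x_i) is contaminated by bounded
    uniform noise of magnitude at most noise_level (an assumed model).
    """
    rng = random.Random(seed)
    total = 0.0
    for i in range(n):
        x = (i + 0.5) / n  # midpoint of the i-th subinterval
        y = f(x) + rng.uniform(-noise_level, noise_level)  # noisy observation
        total += y
    return total / n

# Example: integrate f(x) = x^2 (exact value 1/3) from 1000 noisy samples.
approx = noisy_midpoint_integral(lambda x: x * x, n=1000, noise_level=0.01)
```

Because the information is partial and noisy, the result can only approximate 1/3; in the IBC setting one would additionally charge a cost for each of the n observations and ask how small the error can be made for a given total cost.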
Chapter in: Noisy Information and Computational Complexity, pp. 1-4. Publisher: Cambridge University Press. Print publication year: 1996.