Despite large incentives, correctness in software remains an elusive goal. Declarative programming techniques, in which algorithms are derived from a specification of the desired behavior, offer hope of addressing this problem: programming in terms of specifications rather than algorithms yields a combinatorial reduction in complexity, and arbitrary desired properties can be expressed and enforced directly in the specifications. However, performance limitations have prevented programming with declarative specifications from becoming a mainstream technique for general-purpose programming, because no strategy for deriving algorithms from specifications yet exists that is both efficient and fully general. To address this bottleneck, I propose information-gain computation, a framework in which an adaptive evaluation strategy efficiently performs a search that derives algorithms providing information about a query via the most efficient routes. Within this framework, opportunities to compress the search space present themselves, suggesting that information-theoretic bounds on the performance of such a system might be articulated and a system designed to achieve them. Computing the information measures on which this strategy is based depends crucially on a probabilistic semantics for the relations represented by predicates, which may either already be present in a probabilistic logic language or be superimposed on a pure logic language. I describe a prototype implementation of Fifth, a system that implements these techniques, and a preliminary empirical study of adaptive evaluation on a simple test program. In the test, the evaluation strategy adapts successfully to evaluate efficiently a query with pathological features that would prevent its evaluation by standard general-purpose strategies.