Abstract

A novel framework for the design and analysis of energy-aware algorithms is presented, centered on a deterministic Bit-level (Boltzmann) Random Access Machine, or BRAM, model of computing, as well as its probabilistic counterpart, the RABRAM. Using this framework, it is shown for the first time that probabilistic algorithms can yield asymptotic savings in the energy consumed over their deterministic counterparts. Concretely, we show that the expected energy savings of a probabilistic RABRAM algorithm for solving the distinct vector problem (DVP) introduced here, over any deterministic BRAM algorithm, grow as \(\Theta\left(n\ln\left(\frac{n}{n-\varepsilon\log(n)}\right)\right)\), even though the deterministic and probabilistic algorithms have the same (asymptotic) time complexity. The probabilistic algorithm is guaranteed to be correct with probability \(p \geqslant 1 - \frac{1}{n^{c}}\), where the constant \(c\) is chosen as a design parameter. As usual, \(n\) denotes the length of the input instance of the DVP, measured in bits. These results are derived in the context of a technology-independent complexity measure for energy consumption introduced here, referred to as logical work. In keeping with the theme of the symposium, the introduction to this work is presented in the context of “computational proof” (algorithm) and the “work done” to achieve it (its energy consumption).
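The abstract states only the asymptotic form of the results. As a purely illustrative aid, the short Python sketch below evaluates the expression inside the \(\Theta\) bound, \(n\ln\left(\frac{n}{n-\varepsilon\log(n)}\right)\), together with the correctness bound \(1 - \frac{1}{n^{c}}\), for a few input sizes. The values of \(\varepsilon\) and \(c\), and the choice of base-2 logarithm for \(\log\), are assumptions made here for illustration; they are not taken from the paper.

```python
import math

def savings_expression(n: int, eps: float = 1.0) -> float:
    """Evaluate n * ln(n / (n - eps * log2 n)), the expression inside the
    Theta bound quoted in the abstract.

    eps is an illustrative constant and log is taken base 2 here; both are
    assumptions, not values specified by the paper.
    """
    return n * math.log(n / (n - eps * math.log2(n)))

def correctness_bound(n: int, c: float = 2.0) -> float:
    """Lower bound 1 - 1/n^c on the probability that the RABRAM algorithm
    answers correctly; c is a design parameter (value chosen here arbitrarily)."""
    return 1.0 - 1.0 / (n ** c)

if __name__ == "__main__":
    # Show how the claimed savings expression and the correctness bound
    # behave as the input length n (in bits) grows.
    for n in (2**10, 2**15, 2**20):
        print(f"n = {n:>8d}  savings expr ~ {savings_expression(n):8.3f}  "
              f"correctness >= {correctness_bound(n):.12f}")
```

For small \(\varepsilon\log(n)/n\), the expression behaves like \(\varepsilon\log(n)\), which the sketch makes visible numerically; this is only a sanity check on the stated bound, not a reconstruction of the paper's analysis.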
