Abstract

We introduce a formal framework for studying the time and space complexity of computing with faulty memory. In the fault-free case, time and space complexity were studied using the “pebbling game” model. We extend this model to the faulty case, in which the contents of memory cells may be erased. The model captures notions such as “checkpoints” (keeping multiple copies of intermediate results) and “recovery” (partial recomputation after a failure). Using this model, we derive tight bounds on the time and/or space overhead inflicted by faults. As a lower bound, we exhibit cases where f worst-case faults necessitate an Ω(f) multiplicative overhead in computational resources (time, space, or their product), regardless of the computing and recomputing strategy employed. A matching upper-bound algorithm establishes that an O(f) multiplicative overhead always suffices. For the special class of binary tree computations, we show that f faults necessitate only a Θ(f) additive overhead in space.
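To make the model concrete, here is a minimal sketch, in Python, of a pebbling game with adversarial erasures. It is an illustration only, not the paper's formal rules: the DAG, the fault schedule, and the naive recovery strategy (recursively recomputing any erased predecessor) are all hypothetical. Time counts pebble placements; space is the peak number of pebbles held at once.

```python
def pebble(dag, order, faults):
    """Pebble a DAG in topological `order`, recomputing after faults.

    dag:    maps node -> list of predecessor nodes
    order:  a topological order in which to place pebbles
    faults: set of (step, node) pairs; after `step`, the adversary
            erases the pebble (memory cell) on `node`, if present
    """
    pebbled = set()
    time, space = 0, 0

    def place(node):
        nonlocal time, space
        for pred in dag[node]:
            if pred not in pebbled:
                place(pred)          # recovery: recompute erased inputs
        pebbled.add(node)            # one pebble placement = one time step
        time += 1
        space = max(space, len(pebbled))

    for step, node in enumerate(order):
        place(node)
        for s, n in faults:
            if s == step:
                pebbled.discard(n)   # a fault erases a memory cell

    return time, space


# Hypothetical example: a path v0 -> v1 -> ... -> v4, with one fault
# erasing v2 right after step 2, forcing partial recomputation.
dag = {f"v{i}": ([f"v{i-1}"] if i else []) for i in range(5)}
order = [f"v{i}" for i in range(5)]
print(pebble(dag, order, faults={(2, "v2")}))  # -> (6, 5)
```

This sketch keeps every intermediate pebble as a checkpoint, so recovery is cheap in time but space is maximal; the bounds in the abstract quantify how much any strategy must pay in time, space, or their product when trading checkpoints against recomputation.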
