Abstract
Data refinement is a common approach to reasoning about programs, based on establishing that a concrete program satisfies all the properties required by an intended abstract pattern. Reasoning about programs in this setting becomes complex when pointers are used, and a well-known method for proving data refinement, namely the forward simulation method, becomes unsound in the presence of pointers. The reason for unsoundness is the failure of the “lifting theorem” for simulations: that a simulation between abstract and concrete modules can be lifted to all client programs. The result is that simulation does not imply that a concrete module can replace an abstract one in all contexts. Our diagnosis of this problem is that the unsoundness is due to interference from the client programs. Rather than blame a module for the unsoundness of lifting simulations, our analysis places the blame on the client programs that cause the interference: when interference is not present, soundness is recovered. Technically, we present a novel instrumented semantics which is capable of detecting interference between a module and its client. Using special simulation relations, namely growing relations, and interpreting the simulation method in the instrumented semantics, we obtain a lifting theorem. We then show situations under which simulation does indeed imply refinement.
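As an illustration (not taken from the paper), the kind of client interference at issue can be sketched in a Java-like setting; all class and method names below are hypothetical. The concrete module keeps its state in a heap cell that a client can reach through a leaked pointer, so the client can change the concrete state in a way no abstract operation matches, breaking the simulation relation for that client.

```java
// Hypothetical sketch of pointer interference breaking forward simulation.

class Cell { int val; }

// Abstract module: a counter with no exposed internal state.
class AbstractCounter {
    private int count = 0;
    void inc() { count++; }
    int get() { return count; }
}

// Concrete module: stores the count in a heap-allocated cell and (unsafely)
// hands out a pointer to it. A forward simulation would relate cell.val to
// the abstract count.
class ConcreteCounter {
    private final Cell cell = new Cell();
    void inc() { cell.val++; }
    int get() { return cell.val; }
    Cell expose() { return cell; }  // representation exposure
}

// An interfering client: writing through the leaked pointer bypasses the
// module interface, so the concrete run has no matching abstract run and
// simulation no longer lifts to this client.
class InterferingClient {
    static int run(ConcreteCounter c) {
        Cell leaked = c.expose();
        leaked.val = 42;            // interference via the heap
        return c.get();             // 42; AbstractCounter can never return 42 here
    }
}
```

A non-interfering client, by contrast, only calls `inc` and `get`, and for such clients the simulation argument goes through; this is the distinction the instrumented semantics is designed to detect.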