This paper proves the correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective technique for strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules in two respects: our correctness proof is based on a functional core language and a contextual semantics, thus proving correct a wider range of strictness-based optimizations, and our method fully accounts for the cycle detection rules, which contribute substantially to the strength of Nöcker's strictness analysis.

Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a functional core language LR. This is a higher-order call-by-need lambda calculus with case, constructors, letrec, and seq, which is extended during strictness analysis by set constants such as Top or Inf, denoting sets of expressions that indicate different evaluation demands. New set constants can also be defined by recursive equations with a greatest fixpoint semantics. The operational semantics of LR is a small-step semantics, and equality of expressions is defined by a contextual semantics that observes the termination of expressions. In essence, SAL is a nontermination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, rests mainly on an exact analysis of the lengths of evaluations, i.e., of normal-order reduction sequences to WHNF; the main measure is the number of "essential" reductions in an evaluation.

Our tools and results provide new insights into call-by-need lambda calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
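To make the certified property concrete, the following is a minimal Haskell sketch (illustrative only, not the paper's formal definitions): a function is strict when it has no WHNF whenever its argument has none, and a correct strictness analysis licenses forcing such an argument before the call, e.g. with seq. The names len and apStrict are hypothetical.

```haskell
-- A strict function: it must inspect its list argument to reach a WHNF,
-- so it diverges whenever the argument's spine diverges.
len :: [a] -> Int
len []       = 0
len (_ : xs) = 1 + len xs

-- The kind of optimization a strictness result justifies: evaluate a
-- strict argument position eagerly, here expressed with seq. By
-- strictness of f, this preserves the contextual semantics.
apStrict :: ([a] -> Int) -> [a] -> Int
apStrict f xs = xs `seq` f xs

main :: IO ()
main = print (apStrict len [1, 2, 3 :: Int])  -- prints 3, same as len [1,2,3]
```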