Abstract

Mining simulation data from the golden model is an effective approach to assertion generation in hardware design verification. Because simulation data are inherently incomplete, the truth values of the mined assertions must be evaluated. This paper presents an approach to evaluating and constraining hardware assertions in the presence of absent scenarios. A Belief-failRate metric is proposed to predict whether a generated assertion is true or false. By considering both the occurrences of free-variable assignments and the conflicts of absent scenarios, the metric ranks true assertions higher and false assertions lower. Our Belief-failRate guided assertion constraining method improves the quality of the generated assertions. Experimental results show that the Belief-failRate framework outperforms existing methods. In addition, the assertion evaluating and constraining procedure finds more assertions that cover new design functionality than previous methods.
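
To illustrate the ranking idea sketched in the abstract, the following minimal Python example orders candidate assertions by a combined score built from how often their free-variable assignments occur in the simulation traces and how many absent scenarios conflict with them. The names (Candidate, score, rank) and the weighting scheme are placeholders chosen for illustration under stated assumptions; they are not the paper's actual Belief-failRate definition.

    # Illustrative sketch only: the scoring below is a placeholder, not the
    # paper's actual Belief-failRate metric.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Candidate:
        expr: str            # textual form of the mined assertion
        occurrences: int     # how often its free-variable assignment appears in traces
        conflicts: int       # how many absent scenarios conflict with it

    def score(c: Candidate, total_assignments: int) -> float:
        """Higher score = more likely to be a true assertion (hypothetical weighting)."""
        belief = c.occurrences / max(total_assignments, 1)              # support from observed data
        fail_rate = c.conflicts / max(c.conflicts + c.occurrences, 1)   # penalty from absent-scenario conflicts
        return belief * (1.0 - fail_rate)

    def rank(candidates: List[Candidate], total_assignments: int) -> List[Candidate]:
        """Sort candidates so likely-true assertions come first."""
        return sorted(candidates, key=lambda c: score(c, total_assignments), reverse=True)

    if __name__ == "__main__":
        cands = [
            Candidate("a && b |-> c", occurrences=120, conflicts=2),
            Candidate("a |-> d", occurrences=15, conflicts=40),
        ]
        for c in rank(cands, total_assignments=200):
            print(f"{c.expr}: {score(c, 200):.3f}")

In this sketch, assertions well supported by observed assignments and rarely contradicted by absent scenarios rise to the top of the ranking, mirroring the intent of sorting true assertions higher and false assertions lower.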
