Low-level analysis of Worst-Case Execution Time (WCET) is an important field for real-time system validation. It stands between computer architecture and mathematics, as it relies strongly on variants of abstract interpretation. One of the features that causes the largest uncertainty in WCET evaluation for low-level analysis of sequential execution on a single processor is correctly accounting for Cache Memory-related Delays (CMRD) and Cache-related Preemption Delays (CRPD). Research from the 1990s provides a good basic framework for this problem as long as a task runs without preemption. When preemption of tasks is allowed, however, the existing formalisms have lower predictive power, and the usual approaches reduce to NP-hard problems. In this article, we show some potential advantages of using a formalism inspired by Quantum Computing (QC) to evaluate CMRDs under preemption while avoiding the underlying NP-hard problem. Experimental results with a classical (non-quantum) numerical implementation on a selection of Mälardalen benchmark programs show very good accuracy, while the complexity of the evaluation is a low-order polynomial in the number of memory accesses. Although this is not yet a fully parallel quantum algorithm, we provide a first roadmap toward that objective.