Abstract

This paper reviews the seemingly inevitable trend that software tools are no longer merely a means of supporting the design, construction, and analysis of (large-scale) systems, but have become so complex that each turns into a reality of its own, with its own "physics", that needs to be studied in its own right. The true effects of combining methodologies as diverse as classical static analysis, model checking, SAT and SMT solving, and dynamic methods such as simulation, runtime verification, testing, and learning, together with their dedicated optimizations in terms of, e.g., BDD encodings, parallelization, and various forms of abstraction and reduction, depend heavily on the particular tools and are typically hard to predict. Corresponding experimental investigations, today often supported by diverse and frequent tool challenges, provide interesting indications about the applied technology, but typically fail to provide sufficient evidence for transferring results to other settings and tools. Moreover, implementation-specific details often dominate the observed effects, which thereby become invalid for drawing conceptual conclusions. On the other hand, requiring a rigorous in-depth analysis of every experimental observation in order to pinpoint its underlying conceptual causes before publication would slow down scientific exchange and also hinder scientific progress. This paper analyzes the situation of today's software tools from a global perspective in terms of a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis, identifies challenges, and establishes a global vision for overcoming current weaknesses.
