Abstract

Automated model-based test generation presents a viable alternative to the costly manual test creation currently employed for regression testing of web apps. However, existing model inference techniques rely on threshold-based whole-page comparison to establish state equivalence, which cannot reliably identify near-duplicate web pages in modern web apps. Consequently, existing techniques produce inadequate models and fragile test oracles for dynamic web apps, rendering the generated regression test suites ineffective. We propose a model-based test generation technique, FRAGGEN, that eliminates the need for thresholds by employing a novel state abstraction based on page fragmentation to establish state equivalence. FRAGGEN also uses fine-grained page fragment analysis to diversify state exploration and generate reliable test oracles. Our evaluation shows that FRAGGEN outperforms existing whole-page techniques by detecting more near-duplicates, inferring better web app models, and generating test suites that are better suited for regression testing. On a dataset of 86,165 state pairs, FRAGGEN detected 123% more near-duplicates on average than whole-page techniques. The crawl models inferred by FRAGGEN have 62% higher precision and 70% higher recall on average. FRAGGEN also generates reliable regression test suites whose test actions achieve a nearly 100% success rate on the same version of the web app, even when the execution environment is varied. The test oracles generated by FRAGGEN detect 98.7% of the visible changes in web pages while being highly robust, making them suitable for regression testing.
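
To illustrate the core idea of threshold-free, fragment-based state equivalence, the sketch below is a minimal illustration only, not FRAGGEN's implementation. It assumes a page has already been segmented into DOM fragments (given here as plain HTML strings); the helper names and the tag-structure normalization are assumptions made for the example.

```python
# Minimal sketch of fragment-based state equivalence (illustrative only).
# Assumes page segmentation into DOM fragments has already been done;
# FRAGGEN's actual fragmentation and equivalence rules are more involved.
import hashlib
from html.parser import HTMLParser


class TagSequenceExtractor(HTMLParser):
    """Collects a fragment's tag structure, ignoring text and attributes."""

    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)


def fragment_fingerprint(fragment_html: str) -> str:
    """Hash a fragment's normalized tag structure so cosmetic text changes
    (dates, counters, user names) do not affect equivalence."""
    parser = TagSequenceExtractor()
    parser.feed(fragment_html)
    structure = ">".join(parser.tags)
    return hashlib.sha1(structure.encode("utf-8")).hexdigest()


def states_equivalent(page_a_fragments, page_b_fragments) -> bool:
    """Two states are treated as equivalent when their fragment fingerprints
    match as sets -- no whole-page similarity score or tuned threshold."""
    fp_a = {fragment_fingerprint(f) for f in page_a_fragments}
    fp_b = {fragment_fingerprint(f) for f in page_b_fragments}
    return fp_a == fp_b


# Two product pages differing only in text map to the same abstract state;
# a page with an extra fragment does not.
page1 = ["<div><h2>Phone</h2><p>In stock</p></div>", "<ul><li>Home</li></ul>"]
page2 = ["<div><h2>Laptop</h2><p>Sold out</p></div>", "<ul><li>Home</li></ul>"]
page3 = page1 + ["<form><input></form>"]

print(states_equivalent(page1, page2))  # True
print(states_equivalent(page1, page3))  # False
```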
