Abstract

Previous studies have demonstrated the usefulness of automated static analysis tools (ASATs) and techniques for detecting security bugs in software systems. However, these studies typically evaluate tool effectiveness on open-source C/C++ code, and selecting the most suitable tool for bug detection in Java software remains a relatively unexplored problem. To address this gap, this study empirically evaluates eight widely used ASATs, namely FindBugs, PMD, YASCA, LAPSE+, JLint, Bandera, ESC/Java, and Java Pathfinder, on the Juliet Test Suite (v1.2). We assess the detection capabilities of these tools using robust performance measures: precision, recall, the Youden index, and the OWASP web benchmark evaluation (WBE). The experimental results show that the tools achieve precision values ranging from 83% to 90.7% on the studied datasets. Specifically, Java Pathfinder achieves the best precision score of 90.7%, followed by YASCA and Bandera with precision scores of 88.7% and 83%, respectively. Similarly, Bandera, ESC/Java, and Java Pathfinder obtain a Youden index of 0.8, which indicates the effectiveness of these tools in detecting security bugs in Java source code.
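The metrics cited above follow the standard confusion-matrix definitions. The sketch below is illustrative only and is not taken from the paper; the counts are made up, and it simply shows how precision, recall, and the Youden index would typically be computed for a single tool evaluated against a labeled test suite such as Juliet.

```java
// Illustrative sketch with hypothetical counts (not the paper's data):
// computing precision, recall, and the Youden index from confusion-matrix counts.
public class BugDetectionMetrics {
    public static void main(String[] args) {
        // Hypothetical results for one tool on a labeled benchmark.
        int tp = 907;  // flagged test cases that really contain the seeded bug
        int fp = 93;   // flagged test cases that are actually clean
        int fn = 120;  // buggy test cases the tool missed
        int tn = 880;  // clean test cases correctly left unflagged

        double precision   = (double) tp / (tp + fp);     // fraction of warnings that are true bugs
        double recall      = (double) tp / (tp + fn);     // sensitivity: fraction of true bugs found
        double specificity = (double) tn / (tn + fp);     // fraction of clean cases not flagged
        double youden      = recall + specificity - 1.0;  // Youden index J, in [-1, 1]

        System.out.printf("precision = %.3f%n", precision);
        System.out.printf("recall    = %.3f%n", recall);
        System.out.printf("Youden J  = %.3f%n", youden);
    }
}
```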
