
handle: 10831/113263
Various static code analysis tools have been designed to automatically detect software faults and security vulnerabilities. This paper aims to (1) conduct an empirical evaluation assessing the performance of five free, state-of-the-art static analysis tools in detecting Java security vulnerabilities, using a well-defined and repeatable approach, and (2) report on the vulnerabilities that are best and worst detected by static Java analyzers. We used the Juliet benchmark test suite in a controlled experiment to assess the effectiveness of five widely used Java static analysis tools. The vulnerabilities that were detected were found by one, two, or three tools; only one vulnerability was detected by four tools. The tools missed 13% of the Java vulnerability categories appearing in our experiment. More critically, none of the five tools could identify all the vulnerabilities in our experiment. We conclude that, despite recent improvements in their methodologies, current state-of-the-art static analysis tools are still not effective at identifying the security vulnerabilities in even a small-scale, artificial test suite.
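As a rough illustration of how this kind of benchmark-based evaluation is typically scored (not the authors' exact procedure), the sketch below compares a tool's reported warnings against Juliet-style ground truth and derives recall, precision, and the share of vulnerability categories missed entirely. The `Finding` record, the CWE categories, and the test-case names are hypothetical and used only for illustration.

```java
import java.util.*;

// Hedged sketch: scoring one static analysis tool's findings against
// benchmark ground truth. The data model and example data are assumptions,
// not the paper's actual dataset or tooling.
public class BenchmarkScoring {

    // A flaw, identified by its CWE category and the test case it occurs in.
    record Finding(String cweCategory, String testCase) {}

    public static void main(String[] args) {
        // Ground truth: flaws the benchmark deliberately contains (hypothetical).
        Set<Finding> expected = Set.of(
                new Finding("CWE-89 SQL Injection", "CWE89_s01_v1.java"),
                new Finding("CWE-78 OS Command Injection", "CWE78_s02_v3.java"),
                new Finding("CWE-327 Broken Crypto", "CWE327_s01_v2.java"));

        // Warnings reported by one tool (hypothetical; second one is a false positive).
        Set<Finding> reported = Set.of(
                new Finding("CWE-89 SQL Injection", "CWE89_s01_v1.java"),
                new Finding("CWE-89 SQL Injection", "CWE89_s09_v5.java"));

        long truePositives = reported.stream().filter(expected::contains).count();
        long falsePositives = reported.size() - truePositives;
        long falseNegatives = expected.size() - truePositives;

        double recall = (double) truePositives / (truePositives + falseNegatives);
        double precision = (double) truePositives / (truePositives + falsePositives);

        // Share of vulnerability categories the tool missed entirely.
        Set<String> expectedCategories = new HashSet<>();
        expected.forEach(f -> expectedCategories.add(f.cweCategory()));
        Set<String> detectedCategories = new HashSet<>();
        reported.stream().filter(expected::contains)
                .forEach(f -> detectedCategories.add(f.cweCategory()));
        double missedCategoryShare =
                1.0 - (double) detectedCategories.size() / expectedCategories.size();

        System.out.printf("recall=%.2f precision=%.2f missed categories=%.0f%%%n",
                recall, precision, missedCategoryShare * 100);
    }
}
```

Aggregating such per-tool scores across all Juliet test cases is one common way to compare detectors and to identify the vulnerability categories that are best and worst covered.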
Keywords: Juliet test suite, empirical evaluation, performance metrics, static analysis tools, security vulnerabilities, Java
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources | 3 |
| Popularity | Current attention in the research community, based on the underlying citation network | Top 10% |
| Influence | Overall impact in the research community, based on the underlying citation network (diachronically) | Average |
| Impulse | Initial momentum directly after publication, based on the underlying citation network | Average |
