Research Article

Comparing Detection Ratio of Three Static Analysis Tools

by Hanmeet Kaur Brar, Puneet Jai Kaur
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 124 - Number 13
Year of Publication: 2015
Authors: Hanmeet Kaur Brar, Puneet Jai Kaur
DOI: 10.5120/ijca2015905749

Hanmeet Kaur Brar, Puneet Jai Kaur. Comparing Detection Ratio of Three Static Analysis Tools. International Journal of Computer Applications 124, 13 (August 2015), 35-40. DOI=10.5120/ijca2015905749

@article{ 10.5120/ijca2015905749,
author = { Hanmeet Kaur Brar, Puneet Jai Kaur },
title = { Comparing Detection Ratio of Three Static Analysis Tools },
journal = { International Journal of Computer Applications },
issue_date = { August 2015 },
volume = { 124 },
number = { 13 },
month = { August },
year = { 2015 },
issn = { 0975-8887 },
pages = { 35-40 },
numpages = { 6 },
url = { https://ijcaonline.org/archives/volume124/number13/22167-2015905749/ },
doi = { 10.5120/ijca2015905749 },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Journal Article
%A Hanmeet Kaur Brar
%A Puneet Jai Kaur
%T Comparing Detection Ratio of Three Static Analysis Tools
%J International Journal of Computer Applications
%@ 0975-8887
%V 124
%N 13
%P 35-40
%D 2015
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Static code analysis is a software verification activity in which source code is scrutinized for quality and security. In the software development life cycle, early detection of flaws is valuable, and static analysis tools make it possible to find flaws well before testing begins. Both commercial and open-source static analysis tools are available today; because user requirements and tool capabilities vary widely, a comparison between tools is needed. This paper evaluates three open-source static analysis tools for security: Cppcheck, RATS and Flawfinder. The tools are studied and compared to each other on the basis of detection ratio. To obtain the detection ratio, vulnerabilities were categorized and intentionally seeded into demonstration code, and each tool was run over the same seeded code.
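
As a concrete sketch of this methodology, the snippet below shows the kind of deliberately flawed C file that can be seeded for such a measurement; the file name demo.c, the two flaws chosen and the comments are illustrative assumptions, not material taken from the paper itself.

    /* demo.c: a deliberately flawed test file (illustrative, not from the paper). */
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char *argv[])
    {
        char buf[16];

        if (argc > 1) {
            /* Seeded flaw 1: unbounded copy into a fixed-size buffer
             * (buffer overflow, CWE-120). */
            strcpy(buf, argv[1]);

            /* Seeded flaw 2: externally controlled format string (CWE-134). */
            printf(buf);
        }
        return 0;
    }

Running each tool over the same file (for example, flawfinder demo.c, rats demo.c and cppcheck --enable=all demo.c) and counting which of the seeded flaws each tool reports yields the detection ratio: flaws reported divided by flaws seeded, here out of 2. Flawfinder and RATS match risky calls such as strcpy and printf by name, while Cppcheck reasons about the code itself and may report a different subset.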

References
  1. G. McGraw and J. Viega, "Building Secure Software," in RTO/NATO Real-Time Intrusion Detection Symposium, 2002.
  2. R. Jetley and B. Chelf, "Diagnosing Medical Device Software Defects Using Static Analysis," Coverity, published in MD&DI, 2009.
  3. H. K. Brar and P. J. Kaur, "Differentiating Integration Testing and Unit Testing," in 2015 2nd International Conference on Computing for Sustainable Global Development (INDIACom), IEEE, pp. 796-798.
  4. J. A. Wang, H. Wang, M. Guo and M. Xia, "Security metrics for software system," in Proceedings of the 47th Annual Southeast Regional Conference, p. 47, New York: ACM, 2009.
  5. P. Li and B. Cui, "A comparative study on software vulnerability static analysis techniques and tools," in 2010 IEEE International Conference on Information Theory and Information Security (ICITIS), pp. 521-524, IEEE, 2010.
  6. M. Mantere, I. Uusitalo and J. Röning, "Comparison of static code analysis tools," in 2009 Third International Conference on Emerging Security Information, Systems and Technologies, pp. 15-22, IEEE, 2009.
  7. M. Howard and S. Lipner, "The Security Development Lifecycle: SDL: A Process for Developing Demonstrably More Secure Software," Microsoft Press, 2006, ISBN-13: 978-0735622142.
  8. B. Chess and J. West, "Secure Programming with Static Analysis," Addison-Wesley, 2007, ISBN-13: 978-0321424778.
  9. "On Analyzing Static Analysis Tools," National Security Agency Center for Assured Software, July 26, 2011, pp. 1-13.
  10. A. German, "Software static code analysis lessons learned," Crosstalk, vol. 16, no. 11, 2003.
  11. V. Ciriello, G. Carrozza and S. Rosati, "Practical experience and evaluation of continuous code static analysis with C++test," in Proceedings of the 2013 International Workshop on Joining AcadeMiA and Industry Contributions to testing Automation, pp. 19-22, ACM New York, 2013.
  12. S. Lipner, "The trustworthy computing security development lifecycle," in Proceedings of the 20th Annual Computer Security Applications Conference (ACSAC), 2004.
  13. S. C. Johnson, "Lint, a C Program Checker," Computer Science Technical Report 65, Bell Laboratories, 1978.
  14. P. Hellström, "Tools for static code analysis: A survey," Department of Computer and Information Science, Linköping University, 2009.
  15. D. Cornell, "Static analysis techniques for testing application security," OWASP San Antonio, 2008.
  16. M. Zitser, R. Lippmann and T. Leek, "Testing static analysis tools using exploitable buffer overflows from open source code," ACM New York, 2004.
  17. S. Wagner, J. Jürjens, C. Koller and P. Trischberger, "Comparing bug finding tools with reviews and tests," Lecture Notes in Computer Science 3502, 2005, pp. 40-55.
  18. H. H. AlBreiki and Q. H. Mahmoud, "Evaluation of static analysis tools for software security," in 2014 10th International Conference on Innovations in Information Technology (INNOVATIONS), pp. 93-98, IEEE, 2014.
  19. RATS information website. URL: https://code.google.com/p/rough-auditing-tool-for-security/
  20. Flawfinder website. URL: http://www.dwheeler.com/flawfinder/
  21. Cppcheck 1.69 manual. URL: http://cppcheck.sourceforge.net/manual.pdf
  22. M. A. A. Mamun, A. Khanam, H. Grahn and R. Feldt, "Comparing four static analysis tools for Java concurrency bugs," in Third Swedish Workshop on Multi-Core Computing (MCC-10), 2010.
  23. Common Weakness Enumeration (CWE) website. URL: https://cwe.mitre.org/
Index Terms

Computer Science
Information Sciences

Keywords

Software development life cycle, static analysis, static analysis tools, detection ratio, vulnerabilities, security assessment