Research Article

Applicability of Inter Project Validation for Determination of Change Prone Classes

by Ruchika Malhotra, Vrinda Gupta, Megha Khanna
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 97 - Number 8
Year of Publication: 2014
Authors: Ruchika Malhotra, Vrinda Gupta, Megha Khanna
10.5120/17024-7313

Ruchika Malhotra, Vrinda Gupta, Megha Khanna. Applicability of Inter Project Validation for Determination of Change Prone Classes. International Journal of Computer Applications 97, 8 (July 2014), 1-8. DOI=10.5120/17024-7313

@article{ 10.5120/17024-7313,
author = { Ruchika Malhotra, Vrinda Gupta, Megha Khanna },
title = { Applicability of Inter Project Validation for Determination of Change Prone Classes },
journal = { International Journal of Computer Applications },
issue_date = { July 2014 },
volume = { 97 },
number = { 8 },
month = { July },
year = { 2014 },
issn = { 0975-8887 },
pages = { 1-8 },
numpages = {8},
url = { https://ijcaonline.org/archives/volume97/number8/17024-7313/ },
doi = { 10.5120/17024-7313 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Ruchika Malhotra
%A Vrinda Gupta
%A Megha Khanna
%T Applicability of Inter Project Validation for Determination of Change Prone Classes
%J International Journal of Computer Applications
%@ 0975-8887
%V 97
%N 8
%P 1-8
%D 2014
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Research in the field of defect and change proneness prediction of software has gained considerable momentum over the past few years. Effective prediction models can help software practitioners detect the change prone modules of a software system, allowing them to optimize the resources used for software testing. However, the development of the prediction models used to determine change prone classes is dependent on the availability of historical data from the concerned software, which can pose a challenge in the development of effective change prediction models. The aim of this paper is to address this limitation by using data from models based on similar projects to predict the change prone classes of the concerned software. This inter project technique can facilitate the development of generalized models which can be used to ascertain change prone classes for multiple software projects. It would also lead to optimization of critical time and resources in the testing and maintenance phases. This work evaluates the effectiveness of statistical and machine learning techniques for developing such models using receiver operating characteristic analysis. The observations of the study indicate varied results for the different techniques used.
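
To make the inter project validation setup concrete, the sketch below trains a change proneness classifier on object oriented metrics from one project and evaluates it on a different project using ROC analysis. It is a minimal illustration in Python with scikit-learn; the file names, the "changed" column and the choice of logistic regression are assumptions made for the example, not the paper's exact setup.

    # Minimal sketch of inter project change proneness prediction (assumed setup).
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler
    from sklearn.metrics import roc_auc_score

    # Hypothetical datasets: one project supplies the training data and a
    # different project supplies the test data (inter project validation).
    train = pd.read_csv("project_a_metrics.csv")   # assumed file of OO metrics
    test = pd.read_csv("project_b_metrics.csv")    # assumed file of OO metrics

    features = [c for c in train.columns if c != "changed"]  # "changed" = class label

    scaler = StandardScaler()
    X_train = scaler.fit_transform(train[features])
    X_test = scaler.transform(test[features])      # reuse the training project's scaling

    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, train["changed"])

    # ROC analysis: area under the ROC curve on the unseen project.
    scores = model.predict_proba(X_test)[:, 1]
    print("Inter project ROC AUC:", roc_auc_score(test["changed"], scores))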

References
  1. S. Watanabe, H. Kaiya and K. Kaijiri, "Adapting a Fault Prediction Model to Allow Inter Language Reuse," Proceedings of the 4th International Workshop on Predictor Models in Software Engineering (PROMISE 2008), (2008) pp. 19-24.
  2. W. Li and S. Henry, "Object Oriented Metrics that Predict Maintainability," Journal of Systems and Software, vol. 23, (1993) pp. 111-122.
  3. L. Briand, J. Wust and H. Lounis, "Replicated Case Studies for Investigating Quality Factors in Object Oriented Designs," Empirical Software Engineering: An International Journal, vol. 6, (2001) pp. 11-58.
  4. V. R. Basili, L. C. Briand and W. L. Melo, "A Validation of Object-Oriented Design Metrics as Quality Indicators," IEEE Transactions on Software Engineering, vol. 22, no. 10, (1996) pp. 751-761.
  5. Y. Singh, A. Kaur and R. Malhotra. "Empirical Validation of Object-Oriented Metrics for Predicting Fault Proneness," Software Quality Journal, vol. 18, no. 1, (2010) pp. 3-35.
  6. R. Kohavi and D. Sommerfield, "Targeting Business Users with Decision Table Classifiers," Proceedings of IEEE Symposium on Information Visualization, (1998) pp. 102-105.
  7. K. Michalak and H. Kwasnicka, "Correlation-based Feature Selection Strategy in Neural Classification," Sixth International Conference on Intelligent Systems Design and Applications, vol. 1, (2006) pp. 741-746.
  8. B. Kitchenham and L. Mendes, "A Systematic Review of Cross- vs. Within-Company Cost Estimation Studies," IEEE Transactions on Software Engineering, vol. 33, no. 5, (2007) pp. 316-329.
  9. T. Zimmermann, N. Nagappan, H. Gall, E. Giger and B. Murphy, "Cross-project Defect Prediction: A Large Scale Experiment on Data vs. Domain vs. Process," Proceedings of the 7th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering, (2009) pp. 91-100.
  10. G. Canfora and A. D. Lucia, "Multi-objective cross project defect prediction," 2013 IEEE Sixth International Conference on Software Testing, Verification and Validation (ICST), (2013) pp. 252-261.
  11. Z. He and F. Shu, "An investigation on the feasibility of cross-project defect prediction," Automated Software Engineering, vol. 19, no. 2, (2012) pp. 167-199.
  12. A. R. Han, S. Jeon, D. Bae and J. Hong, "Behavioral Dependency Measurement for Change Proneness Prediction in UML 2.0 Design Models," 32nd Annual IEEE International Computer Software and Applications Conference, (2008) pp. 76-83.
  13. O. Elish and Al Rahman, "A suite of metrics for quantifying historical changes to predict future change-prone classes in object-oriented software," Journal of Software: Evolution and Process, vol. 25, (2013) pp. 407-437.
  14. Y. Zhou, H. Leung and B. Xu, "Examining the potentially confounding effect of class size on the associations between object oriented metrics and change proneness," IEEE Transactions on Software Engineering, vol. 35, no. 5, (2009) pp. 607-623.
  15. H. Lu, Y. Zhou, B. Xu, H. Leung and L. Chen, "The ability of object-oriented metrics to predict change-proneness: a meta-analysis," Empirical Software Engineering, vol. 17, no. 3, (2012) pp. 200-242.
  16. M. D'Ambros, M. Lanza and R. Robbes, "On the Relationship Between Change Coupling and Software Defects," 16th Working Conference on Reverse Engineering, (2009) pp. 135-144.
  17. R. Malhotra and M. Khanna, "Investigation of Relationship between Object-oriented Metrics and Change Proneness," International Journal of Machine Learning and Cybernetics, Springer, vol. 4, no. 4, (2013) pp. 273-286.
  18. R. Malhotra and M. Khanna, "Inter Project Validation for Change Proneness Prediction using Object Oriented Metrics," Software Engineering: An International Journal, vol. 3, no. 3, (2012) pp. 21-31.
  19. F. N. Kerlinger, Foundations of Behavioral Research (3rd ed.), Fort Worth: Holt, Rinehart and Winston, Inc., (1986).
  20. S. R. Chidamber and C. F. Kemerer, "A Metrics Suite for Object Oriented Design," IEEE Transactions on Software Engineering, vol. 20, no. 6, (1994) pp. 476-493.
  21. K. K. Aggarwal, Y. Singh, A. Kaur and R. Malhotra, "Empirical Analysis for Investigating the Effect of Object-Oriented Metrics on Fault Proneness: A Replicated Case Study," Software Process: Improvement and Practice, vol. 16, no. 1, (2009) pp. 39-62.
  22. V. R. Basili, L. C. Briand and W. L. Melo, "A validation of object oriented design metrics as quality indicators," IEEE Transactions on Software Engineering, vol. 22, no. 10, (1996) pp. 751-761.
  23. M. A. Hall, "Correlation-based feature selection for discrete and numeric class machine learning," Proceedings of the Seventeenth International Conference on Machine Learning, (2000) pp. 359-366.
  24. D. Hosmer and S. Lemeshow, Applied Logistic Regression, Wiley, New York, (1989).
  25. K. Machová, F. Barčák and P. Bednár, "A Bagging Technique Using Decision Trees in the Role of Base Classifiers," (2006).
  26. Y. Freund and R. E. Schapire, "A Short Introduction to Boosting," (1999).
  27. I. Ben-Gal, "Bayesian Networks," in F. Ruggeri, F. Faltin and R. Kenett (Eds.), Encyclopedia of Statistics in Quality and Reliability, Wiley & Sons, (2007).
  28. C. Catal and B. Diri, "An Artificial Immune System Approach for Fault Prediction in Object-Oriented Software," 2nd International Conference on Dependability of Computer Systems (DepCoS-RELCOMEX '07), (2007) pp. 238-245.
  29. J. Brownlee, "Artificial Immune Recognition System (AIRS): A Review and Analysis," Technical Report 1-02, Swinburne University of Technology, Australia, (2005).
  30. A. Khalid and H. M. Abdul, "Artificial Immune Clonal Selection Classification Algorithms for Classifying Malware and Benign Processes Using API Call Sequences," IJCSNS, (2010) pp. 31-39.
  31. L. Briand, J. Wust, J. Daly and D. V. Porter, "Exploring the Relationships between Design Measures and Software Quality in Object-oriented Systems," Journal of Systems and Software, vol. 51, no. 3, (2000) pp. 245-273.
  32. M. Stone, "Cross-validatory choice and assessment of statistical predictions," Journal of the Royal Statistical Society, Series B (Methodological), vol. 36, no. 2, (1974) pp. 111-147.
Index Terms

Computer Science
Information Sciences

Keywords

Change proneness, Inter project validation, Object oriented metrics, Receiver operating characteristic analysis.