Research Article

An Optimized Classifier Frame Work based on Rough Set and Random Tree

by Nidhi Patel
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 160 - Number 9
Year of Publication: 2017
Authors: Nidhi Patel
10.5120/ijca2017912844

Nidhi Patel. An Optimized Classifier Frame Work based on Rough Set and Random Tree. International Journal of Computer Applications. 160, 9 (Feb 2017), 1-7. DOI=10.5120/ijca2017912844

@article{ 10.5120/ijca2017912844,
author = { Nidhi Patel },
title = { An Optimized Classifier Frame Work based on Rough Set and Random Tree },
journal = { International Journal of Computer Applications },
issue_date = { Feb 2017 },
volume = { 160 },
number = { 9 },
month = { Feb },
year = { 2017 },
issn = { 0975-8887 },
pages = { 1-7 },
numpages = { 7 },
url = { https://ijcaonline.org/archives/volume160/number9/27098-2017912844/ },
doi = { 10.5120/ijca2017912844 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Nidhi Patel
%T An Optimized Classifier Frame Work based on Rough Set and Random Tree
%J International Journal of Computer Applications
%@ 0975-8887
%V 160
%N 9
%P 1-7
%D 2017
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Over the past two decades, machine learning has become one of the mainstays of information technology. Machine learning is concerned with developing algorithms that, among other tasks, perform optimized classification over sets of attributes. Decision-tree classification is a widely used predictive approach in data mining, and decision-tree induction algorithms are among the most common ways to build classifiers. This work proposes an optimized classifier framework based on rough set theory and the random tree classifier: rough set theory is first used to reduce the attributes of the decision system, and the resulting reduct is then used as the input to the decision tree. The random tree algorithm increases the accuracy of the result. Experiments with this approach show that the rough set with random tree classifier achieves higher accuracy and lower time consumption than the rough set based J48 classifier.
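The pipeline described above, computing a rough-set reduct and feeding the reduced decision system to a tree learner, can be sketched with a greedy, QuickReduct-style search over the dependency degree (fraction of rows in the positive region). This is a minimal illustrative sketch, not the paper's implementation (which used MATLAB and WEKA); the toy decision table and the `quickreduct` helper are assumptions made for the example.

```python
# Toy decision table: each row is (conditional attributes..., decision).
# Conditional attributes are indexed 0..N_ATTRS-1; the last column is
# the decision class.
TABLE = [
    (1, 0, 1, 0, 'yes'),
    (1, 1, 1, 0, 'yes'),
    (0, 0, 1, 1, 'no'),
    (0, 1, 0, 1, 'no'),
    (1, 0, 0, 1, 'yes'),
    (0, 1, 1, 0, 'no'),
]
N_ATTRS = 4

def partition(rows, attrs):
    """Group row indices into equivalence classes under the given attributes."""
    blocks = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        blocks.setdefault(key, []).append(i)
    return blocks.values()

def dependency(rows, attrs):
    """Dependency degree: fraction of rows in the positive region, i.e.
    rows whose equivalence class (w.r.t. attrs) has a single decision."""
    pos = 0
    for block in partition(rows, attrs):
        decisions = {rows[i][-1] for i in block}
        if len(decisions) == 1:
            pos += len(block)
    return pos / len(rows)

def quickreduct(rows, n_attrs):
    """Greedy QuickReduct: repeatedly add the attribute that most raises
    the dependency degree until it matches that of the full attribute set."""
    full = dependency(rows, range(n_attrs))
    reduct = []
    while dependency(rows, reduct) < full:
        best = max((a for a in range(n_attrs) if a not in reduct),
                   key=lambda a: dependency(rows, reduct + [a]))
        reduct.append(best)
    return sorted(reduct)

reduct = quickreduct(TABLE, N_ATTRS)
print("reduct:", reduct)  # the columns kept after attribute reduction
```

The columns named in the returned reduct would then serve as the input features for the tree learner (e.g., WEKA's RandomTree in the paper's setup), while the discarded attributes are dropped from the decision system.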

References
  1. Z. Pawlak: Rough Sets. International Journal of Computer and Information Sciences, vol. 11, pp. 341–356 (1982).
  2. Dash, M., & Liu, H.: Consistency-based search in feature selection. Artificial Intelligence, vol. 151, pp. 155–176 (2003).
  3. Dai, J. H.: Rough set approach to incomplete data. Information Sciences, vol. 241, pp. 43–57 (2013).
  4. I. Düntsch, G. Gediga: Rough Set Data Analysis. In: A. Kent & J. G. Williams (Eds.), Encyclopedia of Computer Science and Technology, vol. 43, pp. 281–301 (2000).
  5. H. Sever: The status of research on rough sets for knowledge discovery in databases. In: Proceedings of the Second International Conference on Nonlinear Problems in Aviation and Aerospace (ICNPAA98), vol. 2, pp. 673–680 (1998).
  6. Ahmad, A., & Dey, L.: A feature selection technique for classificatory analysis. Pattern Recognition Letters, vol. 26(1), pp. 43–56 (2005).
  7. Chai, J. Y., & Liu, J. N. C.: A novel believable rough set approach for supplier selection. Expert Systems with Applications, vol. 41, issue 1, pp. 92–104 (January 2014).
  8. A. Skowron, Z. Pawlak, J. Komorowski, L. Polkowski: A rough set perspective on data and knowledge. Handbook of Data Mining and Knowledge Discovery, pp. 134–149, Oxford University Press (2002).
  9. A. Skowron, S. K. Pal: Special issue: Rough sets, pattern recognition and data mining. Pattern Recognition Letters, vol. 24, pp. 829–933 (2003).
  10. Hu, Q. H., Zhao, H., Xie, Z. X., & Yu, D. R.: Consistency based attribute reduction. In: Z.-H. Zhou, H. Li, & Q. Yang (Eds.), PAKDD, LNCS (LNAI) vol. 4426, pp. 96–107. Springer, Heidelberg (2007).
  11. Deng, T. Q., Yang, C. D., & Wang, X. F.: A reduct derived from feature selection. Pattern Recognition Letters, vol. 33, pp. 1628–1646 (2012).
  12. Z. Pawlak: Rough Sets: Theoretical Aspects of Reasoning About Data. Kluwer Academic Publishing (1991).
  13. Kai Zheng, Jie Hu, Zhenfei Zhan, Jin Ma: An enhancement for heuristic attribute reduction algorithm in rough set, vol. 41, pp. 6748–6754 (2014).
  14. https://archive.ics.uci.edu/ml/datasets.html.
  15. Loh, W. Y.: Regression by parts: fitting visually interpretable models with GUIDE. In: Chen, C., Härdle, W., Unwin, A. (Eds.), Handbook of Data Visualization. Springer, New York, vol. 28, pp. 447–469 (2008).
  16. Chaudhuri, P., Lo, W. D., Loh, W. Y., Yang, C. C.: Generalized regression trees. vol. 5, pp. 641–666 (1995).
  17. Loh, W. Y.: Regression tree models for designed experiments. IMS Lecture Notes–Monograph Series, vol. 49, pp. 210–228 (2006).
  18. Xiang Zhuoyuan and Zhang Lei: Research on an optimized C4.5 algorithm based on rough set theory. International Conference on Management of e-Commerce and e-Government, Beijing, pp. 272–274, DOI: 10.1109/ICMeCG.2012.74 (2012).
  19. L. Breiman: Random forests. Machine Learning, vol. 45, pp. 5–32 (2001).
  20. Andy Liaw and Matthew Wiener: Classification and Regression by randomForest. Vol. 2/3 (December 2002).
  21. Aeberhard's second ref. above, or email to stefan '@' coral.cs.jcu.edu.au.
  22. G. Gong (Carnegie-Mellon University) via Bojan Cestnik, Jozef Stefan Institute, Jamova 39, 61000 Ljubljana, Yugoslavia (tel.: (38)(+61) 214-399 ext. 287).
  23. Igor Kononenko, University E. Kardelj, Faculty for Electrical Engineering, Trzaska 25, 61000 Ljubljana (tel.: (38)(+61) 265-161).
  24. Purva Sewaiwar, Kamal Kant Verma: Comparative Study of Various Decision Tree Classification Algorithms Using WEKA. International Journal of Emerging Research in Management & Technology, vol. 4, ISSN: 2278-9359 (2015).
Index Terms

Computer Science
Information Sciences

Keywords

Data mining, Rough Set, Reduce Attributes, Decision Tree, Random Tree, MATLAB, WEKA