Research Article

A Two-Stage Tree based Meta-Classifier using Stack-Generalization

by B. Kalpana, Dr. V. Saravanan, Dr. K. Vivekanandan
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 36 - Number 3
Year of Publication: 2011
DOI: 10.5120/4472-6270

B. Kalpana, Dr. V. Saravanan, Dr. K. Vivekanandan. A Two-Stage Tree based Meta-Classifier using Stack-Generalization. International Journal of Computer Applications 36, 3 (December 2011), 25-28. DOI=10.5120/4472-6270

@article{ 10.5120/4472-6270,
author = { B. Kalpana, Dr. V. Saravanan, Dr. K. Vivekanandan },
title = { A Two-Stage Tree based Meta-Classifier using Stack-Generalization },
journal = { International Journal of Computer Applications },
issue_date = { December 2011 },
volume = { 36 },
number = { 3 },
month = { December },
year = { 2011 },
issn = { 0975-8887 },
pages = { 25-28 },
numpages = { 4 },
url = { https://ijcaonline.org/archives/volume36/number3/4472-6270/ },
doi = { 10.5120/4472-6270 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
Abstract

Given a choice of classifiers, each performing differently on different datasets, the best option is an ensemble of classifiers. A conventional ensemble uses a single learning algorithm; in this paper we propose a two-stage stacking method with the C4.5 decision tree as the meta-classifier. The base classifiers are Naïve Bayes, KNN, and the C4.5 tree. In the first stage, after feature selection, the decision tree learns from the classification outputs produced by the base classifiers on the training data. In the second stage, the meta-classifier labels the test data. We show that our algorithm provides better classification accuracy on UCI datasets.
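The two-stage scheme described above can be sketched with scikit-learn; this is an illustrative approximation, not the authors' implementation: scikit-learn's CART-style DecisionTreeClassifier stands in for C4.5, the Iris UCI dataset and the univariate feature-selection step are assumptions, and StackingClassifier supplies the cross-validated base-classifier outputs that the meta-level tree is trained on.

```python
# A minimal sketch of two-stage stacking, assuming scikit-learn.
# DecisionTreeClassifier (CART) stands in for C4.5, which scikit-learn
# does not implement directly.
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)  # one of the UCI datasets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Stage 1: feature selection, then base classifiers whose cross-validated
# predictions on the training data become the meta-level features.
base = [
    ("nb", GaussianNB()),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
    ("tree", DecisionTreeClassifier(random_state=0)),
]
stack = make_pipeline(
    SelectKBest(f_classif, k=2),
    StackingClassifier(
        estimators=base,
        final_estimator=DecisionTreeClassifier(random_state=0)),
)
stack.fit(X_train, y_train)

# Stage 2: the decision-tree meta-classifier labels the held-out test data.
print(round(stack.score(X_test, y_test), 2))
```

The key design point mirrored here is that the meta-classifier never sees the raw attributes at training time: it learns only from the base classifiers' (cross-validated) class predictions, which is what distinguishes stacked generalization from a simple vote.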

Index Terms

Computer Science
Information Sciences

Keywords

Data mining, Classification, Feature selection, Stack generalization