Research Article

Modified Stacked Generalization with Sequential learning

Published in March 2012 by Bhoomi Trivedi, Neha Kapadia
International Conference and Workshop on Emerging Trends in Technology
Foundation of Computer Science USA
ICWET2012 - Number 13
March 2012
Authors: Bhoomi Trivedi, Neha Kapadia

Bhoomi Trivedi, Neha Kapadia. Modified Stacked Generalization with Sequential learning. International Conference and Workshop on Emerging Trends in Technology. ICWET2012, 13 (March 2012), 38-43.

@article{trivedi2012modified,
author = { Bhoomi Trivedi and Neha Kapadia },
title = { Modified Stacked Generalization with Sequential learning },
journal = { International Conference and Workshop on Emerging Trends in Technology },
issue_date = { March 2012 },
volume = { ICWET2012 },
number = { 13 },
month = { March },
year = { 2012 },
issn = { 0975-8887 },
pages = { 38-43 },
numpages = { 6 },
url = { /proceedings/icwet2012/number13/5412-1103/ },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Proceeding Article
%1 International Conference and Workshop on Emerging Trends in Technology
%A Bhoomi Trivedi
%A Neha Kapadia
%T Modified Stacked Generalization with Sequential learning
%J International Conference and Workshop on Emerging Trends in Technology
%@ 0975-8887
%V ICWET2012
%N 13
%P 38-43
%D 2012
%I International Journal of Computer Applications
Abstract

Machine learning techniques can nowadays be successfully applied to data mining tasks. In inductive machine learning, combining several classifiers is a very active field and has shown favorable results compared with single expert systems in a variety of scenarios. In this paper one of the ensemble learning methods, stacked generalization, is modified to obtain better predictive accuracy. In stacking, diverse base classifiers are combined by a learnable combiner that knows each classifier's area of expertise, so the combiner can generalize their errors. Since diversity is an important aspect of ensemble learning, sequential learning of the base classifiers is experimented with here as a way to obtain it. To evaluate the performance of the proposed method, data sets such as IONOSPHERE, HYPOTHYROID and WAVEFORM are used. The experiments demonstrate the efficiency of the proposed model in terms of accuracy and time, yielding higher accuracy and lower runtime than the conventional stacked generalization method.
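The stacked architecture described in the abstract can be made concrete with a short sketch. The code below is an illustration only, not the authors' implementation: it trains a few diverse base classifiers one after another, builds meta-level features from their out-of-fold predictions, and fits a logistic-regression combiner. The choice of scikit-learn, the particular base classifiers, the 5-fold split, and the breast-cancer dataset (standing in for IONOSPHERE, HYPOTHYROID and WAVEFORM, which are not bundled with scikit-learn) are all assumptions made for the demonstration.

# Illustrative sketch of stacked generalization (not the paper's exact method).
# Base classifiers are trained one after another and their cross-validated
# predictions form the meta-level features for a learnable combiner.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split, cross_val_predict
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Stand-in dataset; the paper uses IONOSPHERE, HYPOTHYROID and WAVEFORM.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Diverse level-0 (base) classifiers.
base_learners = [
    DecisionTreeClassifier(random_state=0),
    GaussianNB(),
    KNeighborsClassifier(n_neighbors=5),
]

# Level-1 training data: out-of-fold predictions of each base classifier,
# produced one classifier at a time.
meta_train = np.column_stack([
    cross_val_predict(clf, X_train, y_train, cv=5) for clf in base_learners
])

# Fit each base classifier on the full training set, in sequence, so it can
# label the test set; stack those labels into the meta-level test features.
for clf in base_learners:
    clf.fit(X_train, y_train)
meta_test = np.column_stack([clf.predict(X_test) for clf in base_learners])

# Learnable combiner (level-1 model) generalizes the base classifiers' errors.
combiner = LogisticRegression()
combiner.fit(meta_train, y_train)
print("stacked accuracy:", accuracy_score(y_test, combiner.predict(meta_test)))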

References
  1. David H. Wolpert, "Stacked Generalization", Complex Systems Group, Theoretical Division, and Center for Non-linear Studies, Los Alamos.
  2. Ting, K. M., & Witten, I. H., "Issues in stacked generalization", Journal of Artificial Intelligence Research.
  3. Todorovski, L., & Džeroski, S., "Combining multiple models with meta decision trees". In Proceedings of the Fourth European Conference on Principles of Data Mining and Knowledge Discovery, Berlin: Springer.
  4. Ženko, B., & Džeroski, S., "Stacking with an extended set of meta-level attributes and MLR". In Proceedings of the Thirteenth European Conference on Machine Learning, Berlin: Springer.
  5. Ženko, B., Todorovski, L., & Džeroski, S., "A comparison of stacking with MDTs to bagging, boosting, and other stacking methods". In Proceedings of the First IEEE International Conference on Data Mining, Los Alamitos: IEEE Computer Society.
  6. Agapito Ledezma, Ricardo Aler and Daniel Borrajo, "Empirical Study of a Stacking State-space", Universidad Carlos III de Madrid, Avda. de la Universidad 30, 28911 Leganés, Madrid (Spain).
  7. Sašo Džeroski, Bernard Ženko, "Is Combining Classifiers with Stacking Better than Selecting the Best One?", Department of Knowledge Technologies, Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana, Slovenia.
  8. Alexander K. Seewald, "Exploring the parameter state space of stacking", Austrian Research Institute for Artificial Intelligence, Schottengasse 3, A-1010 Wien, Austria.
  9. Croux, C., Joossens, K., & Lemmens, A., "Bagging a stacked classifier".
  10. S. B. Kotsiantis and D. Kanellopoulos, "Combining Bagging, Boosting and Dagging for Classification Problems", Educational Software Development Laboratory, Department of Mathematics, University of Patras.
  11. Martin Sewell, "Ensemble Learning", Department of Computer Science, University College London, April 2007.
  12. Robi Polikar, "Ensemble based systems in decision making".
  13. Mete Ozay and Fatos Tunay Yarman Vural, "On the Performance of Stacked Generalization Classifiers" (2008).
  14. Christopher J. Merz, "Using Correspondence Analysis to Combine Classifiers", Department of Information and Computer Science, University of California, Irvine.
  15. Alexander K. Seewald, "Towards Understanding Stacking: Studies of a General Ensemble Learning Scheme", PhD thesis.
  16. David B. Skalak, "Prototype Selection for Composite Nearest Neighbor Classifiers", Department of Computer Science, University of Massachusetts, Amherst, Massachusetts 01003-4160.
Index Terms

Computer Science
Information Sciences

Keywords

Stacked Generalization
sequential stacked generalization
ensemble learning
multiple classifier system