Research Article

Cost Effective Approach on Feature Selection using Genetic Algorithms and LS-SVM Classifier

Published in 2010 by E.P.Ephzibah
Evolutionary Computation for Optimization Techniques
Foundation of Computer Science USA
ECOT - Number 1
2010
Authors: E.P.Ephzibah

E.P.Ephzibah. Cost Effective Approach on Feature Selection using Genetic Algorithms and LS-SVM Classifier. Evolutionary Computation for Optimization Techniques. ECOT, 1 (2010), 16-20.

@article{ ephzibah2010cost,
author = { E.P.Ephzibah },
title = { Cost Effective Approach on Feature Selection using Genetic Algorithms and LS-SVM Classifier },
journal = { Evolutionary Computation for Optimization Techniques },
issue_date = { 2010 },
volume = { ECOT },
number = { 1 },
year = { 2010 },
issn = { 0975-8887 },
pages = { 16-20 },
numpages = { 5 },
url = { /specialissues/ecot/number1/1532-135/ },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Special Issue Article
%1 Evolutionary Computation for Optimization Techniques
%A E.P.Ephzibah
%T Cost Effective Approach on Feature Selection using Genetic Algorithms and LS-SVM Classifier
%J Evolutionary Computation for Optimization Techniques
%@ 0975-8887
%V ECOT
%N 1
%P 16-20
%D 2010
%I International Journal of Computer Applications
Abstract

This work addresses the problem of diagnosing disease at an early stage by applying a feature selection technique based on genetic algorithms and a least square support vector machine (LS-SVM) classifier. The evaluation considers both the accuracy of the classifier and the cost effectiveness of the implementation: the technique helps diagnose the disease using a limited number of tests that can be performed at minimal expense. We use evolutionary computation, a subfield of artificial intelligence (computational intelligence) concerned with combinatorial optimization problems. Evolutionary computation proceeds iteratively, maintaining a population of candidate solutions that is evolved through selection in a guided random search, often exploiting parallel processing, until the desired goal is reached; such processes are inspired by the biological mechanisms of evolution. The results obtained with the genetic algorithm approach show that the proposed method finds an appropriate feature subset and that the SVM classifier achieves better results than other methods.
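As a minimal, illustrative sketch of the wrapper approach described above, the following Python example uses a genetic algorithm to search over feature subsets, scoring each subset by cross-validated SVM accuracy minus a penalty on the total "cost" of the selected tests; the penalty mirrors the goal of reaching a diagnosis with a small, inexpensive set of tests. scikit-learn's SVC stands in for an LS-SVM here, and the dataset, per-feature costs, and GA parameters are assumptions made for the example, not values taken from the paper.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]
test_cost = rng.uniform(1.0, 5.0, n_features)   # hypothetical per-test costs
cost_weight = 0.01                              # accuracy-versus-cost trade-off

def fitness(mask):
    # Cross-validated accuracy on the selected features, penalised by the
    # total cost of the tests those features require.
    if not mask.any():
        return -np.inf
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    acc = cross_val_score(clf, X[:, mask], y, cv=3).mean()
    return acc - cost_weight * test_cost[mask].sum()

def evolve(pop_size=20, generations=10, p_mut=0.05):
    # Each chromosome is a boolean mask over the feature set.
    pop = rng.integers(0, 2, (pop_size, n_features)).astype(bool)
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # Binary tournament selection.
        picks = rng.integers(0, pop_size, (pop_size, 2))
        parents = pop[np.where(scores[picks[:, 0]] >= scores[picks[:, 1]],
                               picks[:, 0], picks[:, 1])]
        # Single-point crossover on consecutive parent pairs.
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_features)
            children[i, cut:] = parents[i + 1, cut:]
            children[i + 1, cut:] = parents[i, cut:]
        # Bit-flip mutation.
        children ^= rng.random(children.shape) < p_mut
        pop = children
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()], scores.max()

best_mask, best_score = evolve()
print("selected features:", np.flatnonzero(best_mask))
print("fitness (accuracy minus cost penalty):", round(best_score, 3))

The weight balancing accuracy against test cost would be tuned for the application at hand; a larger weight drives the search toward cheaper, smaller feature subsets at some expense in classification accuracy.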

Index Terms

Computer Science
Information Sciences

Keywords

Feature selection, Genetic Algorithm, Simulated Annealing, Least Square Support Vector Machines, Classification