Research Article

Literature Review on Feature Selection Methods for High-Dimensional Data

by D. Asir Antony Gnana Singh, S. Appavu Alias Balamurugan, E. Jebamalar Leavline
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 136 - Number 1
Year of Publication: 2016
Authors: D. Asir Antony Gnana Singh, S. Appavu Alias Balamurugan, E. Jebamalar Leavline
DOI: 10.5120/ijca2016908317

D. Asir Antony Gnana Singh, S. Appavu Alias Balamurugan, E. Jebamalar Leavline. Literature Review on Feature Selection Methods for High-Dimensional Data. International Journal of Computer Applications 136, 1 (February 2016), 9-17. DOI=10.5120/ijca2016908317

@article{10.5120/ijca2016908317,
  author = {D. Asir Antony Gnana Singh and S. Appavu Alias Balamurugan and E. Jebamalar Leavline},
  title = {Literature Review on Feature Selection Methods for High-Dimensional Data},
  journal = {International Journal of Computer Applications},
  issue_date = {February 2016},
  volume = {136},
  number = {1},
  month = {February},
  year = {2016},
  issn = {0975-8887},
  pages = {9-17},
  numpages = {9},
  url = {https://ijcaonline.org/archives/volume136/number1/24116-2016908317/},
  doi = {10.5120/ijca2016908317},
  publisher = {Foundation of Computer Science (FCS), NY, USA},
  address = {New York, USA}
}
%0 Journal Article
%A D. Asir Antony Gnana Singh
%A S. Appavu Alias Balamurugan
%A E. Jebamalar Leavline
%T Literature Review on Feature Selection Methods for High-Dimensional Data
%J International Journal of Computer Applications
%@ 0975-8887
%V 136
%N 1
%P 9-17
%D 2016
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Feature selection plays a significant role in improving the performance of machine learning algorithms: it reduces the time required to build the learning model and increases the accuracy of the learning process. Researchers therefore pay considerable attention to feature selection as a means of enhancing machine learning performance. Identifying a suitable feature selection method is essential for any machine learning task on high-dimensional data. Hence, a study of the various feature selection methods is needed by the research community, especially by those developing feature selection methods for enhancing the performance of machine learning tasks on high-dimensional data. To fulfill this objective, this paper presents a comprehensive literature review of feature selection methods for high-dimensional data.
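To make the time/accuracy trade-off concrete, the following minimal sketch (not from the paper) shows a filter-style feature selection step on synthetic high-dimensional data, using scikit-learn's SelectKBest with mutual information as the ranking criterion. The dataset sizes, the choice of logistic regression as the learner, and k=50 are illustrative assumptions.

import time

from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic high-dimensional data: 2,000 features, only 20 informative (illustrative sizes).
X, y = make_classification(n_samples=500, n_features=2000, n_informative=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def fit_and_score(train, test):
    """Train a classifier and report test accuracy plus training time."""
    clf = LogisticRegression(max_iter=1000)
    start = time.perf_counter()
    clf.fit(train, y_tr)
    elapsed = time.perf_counter() - start
    return accuracy_score(y_te, clf.predict(test)), elapsed

# Baseline: learn from all 2,000 features.
acc_all, time_all = fit_and_score(X_tr, X_te)

# Filter method: rank features by mutual information with the class label,
# keep the top 50, then learn from the reduced representation.
selector = SelectKBest(mutual_info_classif, k=50).fit(X_tr, y_tr)
acc_sel, time_sel = fit_and_score(selector.transform(X_tr), selector.transform(X_te))

print(f"all 2000 features: accuracy={acc_all:.3f}, fit time={time_all:.3f}s")
print(f"top-50 by MI     : accuracy={acc_sel:.3f}, fit time={time_sel:.3f}s")

On data like this, the reduced model typically trains faster; whether accuracy also improves depends on how many irrelevant features the filter removes, which is exactly the trade-off the methods surveyed in this paper navigate.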

References
  1. Saeys, Y, Inza, I & Larrañaga, P 2007, ‘A review of feature selection techniques in bioinformatics’, Bioinformatics, vol. 23, no. 19, pp. 2507-2517.
  2. Bolón-Canedo, V, Sánchez-Maroño, N & Alonso-Betanzos, A, 2013, ‘A review of feature selection methods on synthetic data’, Knowledge and information systems, vol. 34, no.3, pp.483-519.
  3. Hall, MA 1999, Correlation-based feature selection for machine learning, Ph.D. thesis, The University of Waikato, New Zealand.
  4. Liu, H & Setiono, R 1996, ‘A probabilistic approach to feature selection - a filter solution’, Proceedings of the Thirteenth International Conference on Machine Learning, Italy, pp. 319-327.
  5. Lisnianski, A, Frenkel, I & Ding, Y, 2010, ‘Multi-state system reliability analysis and optimization for engineers and industrial managers’, Springer, New York.
  6. Lin, SW, Tseng, TY, Chou, SY & Chen, SC 2008, ‘A simulated-annealing-based approach for simultaneous parameter optimization and feature selection of back-propagation networks’, Expert Systems with Applications, vol. 34, no. 2, pp. 1491-1499.
  7. Meiri, R & Zahavi, J 2006, ‘Using simulated annealing to optimize the feature selection problem in marketing applications’, European Journal of Operational Research, vol.171, no.3, pp.842-858.
  8. Zhang, H & Sun, G 2002, ‘Feature selection using tabu search method’, Pattern recognition, vol. 35, no.3, pp.701-711.
  9. Tahir, MA, Bouridane, A & Kurugollu, F 2007, ‘Simultaneous feature selection and feature weighting using Hybrid Tabu Search/K-nearest neighbor classifier’, Pattern Recognition Letters, vol. 28, no. 4, pp. 438-446.
  10. Aghdam, MH, Ghasem-Aghaee, N & Basiri, ME 2009, ‘Text feature selection using ant colony optimization’, Expert systems with applications, vol. 36, no.3, pp.6843-6853.
  11. Kanan, HR & Faez, K 2008, ‘An improved feature selection method based on ant colony optimization (ACO) evaluated on face recognition system’, Applied Mathematics and Computation, vol. 205, no.2, pp.716-725.
  12. Sivagaminathan, RK & Ramakrishnan, S 2007, ‘A hybrid approach for feature subset selection using neural networks and ant colony optimization’, Expert Systems with Applications, vol. 33, no. 1, pp. 49-60.
  13. Sreeja, NK & Sankar, A 2015, ‘Pattern Matching based Classification using Ant Colony Optimization based Feature Selection’, Applied Soft Computing, vol. 31, pp.91-102.
  14. Welikala, RA, Fraz, MM, Dehmeshki, J, Hoppe, A, Tah, V, Mann, S, Williamson, TH & Barman, SA 2015, ‘Genetic algorithm based feature selection combined with dual classification for the automated detection of proliferative diabetic retinopathy’, Computerized Medical Imaging and Graphics, vol. 43, pp. 64-77.
  15. Erguzel, TT, Ozekes, S, Tan, O & Gultekin, S 2015, ‘Feature Selection and Classification of Electroencephalographic Signals: An Artificial Neural Network and Genetic Algorithm Based Approach’, Clinical EEG and Neuroscience, vol. 46, no. 4, pp. 321-326.
  16. Oreski, S & Oreski, G 2014, ‘Genetic algorithm-based heuristic for feature selection in credit risk assessment’, Expert Systems with Applications, vol. 41, no. 4, pp. 2052-2064.
  17. Li, S, Wu, H, Wan, D & Zhu, J, 2011, ‘An effective feature selection method for hyperspectral image classification based on genetic algorithm and support vector machine’, Knowledge-Based Systems, vol. 24, no.1, pp.40-48.
  18. Das, N, Sarkar, R, Basu, S, Kundu, M, Nasipuri, M & Basu, DK 2012, ‘A genetic algorithm based region sampling for selection of local features in handwritten digit recognition application’, Applied Soft Computing, vol. 12, no. 5, pp. 1592-1606.
  19. Wang, Y, Chen, X, Jiang, W, Li, L, Li, W, Yang, L, Liao, M, Lian, B, Lv, Y, Wang, S & Wang, S 2011, ‘Predicting human microRNA precursors based on an optimized feature subset generated by GA–SVM’, Genomics, vol. 98, no.2, pp.73-78.
  20. Xue, B, Zhang, M & Browne, WN 2013, ‘Particle swarm optimization for feature selection in classification: A multi-objective approach’, IEEE Transactions on Cybernetics, vol. 43, no.6, pp.1656-1671.
  21. Chen, LF, Su, CT, Chen, KH & Wang, PC 2012, ‘Particle swarm optimization for feature selection with application in obstructive sleep apnea diagnosis’, Neural Computing and Applications, vol. 21, no. 8, pp. 2087-2096.
  22. Yang, H, Du, Q & Chen, G 2012, ‘Particle swarm optimization-based hyperspectral dimensionality reduction for urban land cover classification’, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 5, no. 2, pp. 544-554.
  23. Lin, SW, Ying, KC, Chen, SC & Lee, ZJ 2008, ‘Particle swarm optimization for parameter determination and feature selection of support vector machines’, Expert systems with applications, vol. 35, no. 4, pp.1817-1824.
  24. Liu, H & Setiono, R 1995, ‘Chi2: Feature selection and discretization of numeric attributes’, Proceedings of the IEEE Seventh International Conference on Tools with Artificial Intelligence, Washington DC, USA, pp. 388-391.
  25. Dash, M & Liu, H 1997, ‘Feature selection for classification’, Intelligent data analysis, vol. 1, no.1, pp.131-156.
  26. Kohavi, R & John, GH 1997, ‘Wrappers for feature subset selection’, Artificial intelligence, vol. 97, no.1, pp.273-324.
  27. Inza, I, Larrañaga, P, Etxeberria, R & Sierra, B 2000, ‘Feature subset selection by Bayesian network-based optimization’, Artificial intelligence, vol. 123, no. 1, pp.157-184.
  28. Grimaldi, M, Cunningham, P & Kokaram, A 2003, ‘An evaluation of alternative feature selection strategies and ensemble techniques for classifying music’, Proceedings of the Fourteenth European Conference on Machine Learning and the Seventh European Conference on Principles and Practice of Knowledge Discovery in Databases, Dubrovnik, Croatia.
  29. Dy, JG & Brodley, CE 2000, ‘Feature subset selection and order identification for unsupervised learning’, Proceedings of the Seventeenth International Conference on Machine Learning, pp. 247-254.
  30. Aha, DW & Bankert, RL 1996, ‘A Comparative Evaluation of Sequential Feature Selection Algorithms’, Springer, New York.
  31. Maldonado, S & Weber, R 2009, ‘A wrapper method for feature selection using support vector machines’, Information Sciences, vol. 179, no. 13, pp. 2208-2217.
  32. Gütlein, M, Frank, E, Hall, M & Karwath, A 2009, ‘Large-scale attribute selection using wrappers’, Proceedings of the IEEE Symposium on Computational Intelligence and Data Mining, Nashville, TN, USA, pp. 332-339.
  33. Kabir, MM, Islam, MM & Murase, K 2010, ‘A new wrapper feature selection approach using neural network’, Neurocomputing, vol. 73, no. 16, pp.3273-3283.
  34. Stein, G, Chen, B, Wu, AS & Hua, KA 2005, ‘Decision tree classifier for network intrusion detection with GA-based feature selection’, Proceedings of the Forty-Third ACM Annual Southeast Regional Conference, Kennesaw, GA, USA, vol. 2, pp. 136-141.
  35. Zhuo, L, Zheng, J, Li, X, Wang, F, Ai, B & Qian, J 2008, ‘A genetic algorithm based wrapper feature selection method for classification of hyperspectral images using support vector machine’, Proceedings of Geoinformatics and Joint Conference on GIS and Built Environment: Classification of Remote Sensing Images, pp. 71471J-71471J.
  36. Quinlan, JR 2014, ‘C4.5: programs for machine learning’, Morgan Kaufmann Publishers, San Mateo, California.
  37. Koistinen, P & Holmström, L 1991, ‘Kernel regression and backpropagation training with noise’, Proceedings of the IEEE International Joint Conference on Neural Networks, pp. 367-372.
  38. Baluja, S 1994, ‘Population-based incremental learning a method for integrating genetic search based function optimization and competitive learning’, Technical Report No. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa.
  39. Buntine, W 1991, ‘Theory refinement on Bayesian networks’, Proceedings of the Seventh Conference on Uncertainty in Artificial Intelligence, pp. 52-60.
  40. Loughrey, J & Cunningham, P 2005, ‘Overfitting in wrapper-based feature subset selection: The harder you try the worse it gets’, Proceedings of Research and Development in Intelligent Systems, Springer, London, pp. 33-43.
  41. Freeman, C, Kulić, D & Basir, O 2015, ‘An evaluation of classifier-specific filter measure performance for feature selection’, Pattern Recognition, vol. 48, no. 5, pp. 1812-1826.
  42. Chandrashekar G & Sahin, F 2014, ‘A survey on feature selection methods’, Computers & Electrical Engineering, vol. 40, no.1, pp.16-28.
  43. Guyon, I, Weston, J, Barnhill, S & Vapnik, V 2002, ‘Gene selection for cancer classification using support vector machines’, Machine Learning, vol. 46, no. 1-3, pp. 389-422.
  44. Quinlan, JR 1986, ‘Induction of decision trees’, Machine Learning, vol. 1, no. 1, pp. 81-106.
  45. Tibshirani, R, Saunders, M, Rosset, S, Zhu, J & Knight, K 2005, ‘Sparsity and smoothness via the fused lasso’, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 67, no. 1, pp. 91-108.
  46. Ma, S & Huang, J 2008, ‘Penalized feature selection and classification in bioinformatics’, Briefings in Bioinformatics, vol. 9, no. 5, pp. 392-403.
  47. Neumann, J, Schnörr, C & Steidl, G 2004, ‘SVM-based feature selection by direct objective minimisation’, Proceedings of the Twenty-Sixth DAGM Symposium on Pattern Recognition, Germany, pp. 212-219.
  48. Xiao, Z, Dellandrea, E, Dou, W & Chen, L 2008, ‘ESFS: A new embedded feature selection method based on SFS’, Rapports de recherche.
  49. Maldonado, S, Weber, R & Famili, F 2014, ‘Feature selection for high-dimensional class-imbalanced data sets using Support Vector Machines’, Information Sciences, vol. 286, pp.228-246.
  50. Kira, K & Rendell, LA 1992, ‘A practical approach to feature selection’, Proceedings of the Ninth International Workshop on Machine Learning, Aberdeen, Scotland, UK, pp. 249-256.
  51. Kononenko, I 1994, ‘Estimating attributes: analysis and extensions of RELIEF’, Proceedings of the European Conference on Machine Learning, Catania, Italy, pp. 171-182.
  52. Holte, RC 1993, ‘Very simple classification rules perform well on most commonly used datasets’, Machine Learning, vol. 11, no. 1, pp. 63-90.
  53. Yang, HH & Moody, JE 1999, ‘Data Visualization and Feature Selection: New Algorithms for Nongaussian Data’, Advances in Neural Information Processing Systems, vol. 12, pp. 687-693.
  54. Peng, H, Long, F & Ding C 2005, ‘Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy’, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no.8, pp.1226-1238.
  55. Battiti, R 1994, ‘Using mutual information for selecting features in supervised neural net learning’, IEEE Transactions on Neural Networks, vol. 5, no. 4, pp. 537-550.
  56. Fleuret, F 2004, ‘Fast binary feature selection with conditional mutual information’, The Journal of Machine Learning Research, vol. 5, pp.1531-1555.
  57. Meyer, PE & Bontempi, G 2006, ‘On the use of variable complementarity for feature selection in cancer classification’, Applications of Evolutionary Computing. pp. 91-102.
  58. Lin, D & Tang, X 2006, ‘Conditional infomax learning: an integrated framework for feature extraction and fusion’, Proceedings of the Ninth European Conference on Computer Vision, Graz, Austria, pp. 68-82.
  59. Brown, G, Pocock, A, Zhao, MJ & Luján, M 2012, ‘Conditional likelihood maximisation: a unifying framework for information theoretic feature selection’, The Journal of Machine Learning Research, vol.13, no.1, pp.27-66.
  60. Song, Q, Ni, J & Wang, G 2013, ‘A fast clustering-based feature subset selection algorithm for high-dimensional data’, IEEE Transactions on Knowledge and Data Engineering, vol. 25, no.1, pp.1-14.
  61. Dhillon, IS, Mallela, S & Kumar, R 2003, ‘A divisive information theoretic feature clustering algorithm for text classification’, The Journal of Machine Learning Research, vol. 3, pp.1265-1287.
  62. Li, Y, Luo, C & Chung, SM 2008, ‘Text clustering with feature selection by using statistical data’, IEEE Transactions on Knowledge and Data Engineering, vol. 20, no. 5, pp. 641-652.
  63. Cai, D, Zhang, C & He, X 2010, ‘Unsupervised feature selection for multi-cluster data’, Proceedings of the sixteenth ACM SIGKDD international conference on Knowledge discovery and data mining, Washington, pp. 333-342.
  64. Chow, TW & Huang, D 2005, ‘Estimating optimal feature subsets using efficient estimation of high-dimensional mutual information’, IEEE Transactions on Neural Networks, vol.16, no.1, pp.213-224.
  65. Mitra, S & Acharya, T 2005, ‘Data mining: multimedia, soft computing, and bioinformatics’, John Wiley & Sons, New Jersey.
  66. Sotoca, JM & Pla, F 2010, ‘Supervised feature selection by clustering using conditional mutual information-based distances’, Pattern Recognition, vol. 43, no. 6, pp. 2068-2081.
  67. Freeman, C, Kulić, D & Basir, O 2015, ‘An evaluation of classifier-specific filter measure performance for feature selection’, Pattern Recognition, vol. 48, no. 5, pp. 1812-1826.
  68. Frénay, B, Doquire, G & Verleysen, M 2014, ‘Estimating mutual information for feature selection in the presence of label noise’, Computational Statistics & Data Analysis, vol. 71, pp. 832-848.
  69. Tabakhi, S, Moradi, P & Akhlaghian, F 2014, ‘An unsupervised feature selection algorithm based on ant colony optimization’, Engineering Applications of Artificial Intelligence, vol. 32, pp.112-123.
  70. Bermejo, P, Gámez, J & Puerta, J 2008, ‘On incremental wrapper-based attribute selection: experimental analysis of the relevance criteria’, Proceedings of the International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems, France, pp. 638-645.
  71. Ruiz, R, Riquelme, JC & Aguilar-Ruiz, JS 2006, ‘Incremental wrapper-based gene selection from microarray data for cancer classification’, Pattern Recognition, vol. 39, no. 12, pp. 2383-2392.
  72. Xie, J, Xie, W, Wang, C & Gao, X 2010, ‘A Novel Hybrid Feature Selection Method Based on IFSFFS and SVM for the Diagnosis of Erythemato-Squamous Diseases’, Proceedings of Workshop on Applications of Pattern Analysis, Cumberland Lodge, Windsor, UK, pp. 142-151.
  73. Kannan, SS & Ramaraj, N 2010, ‘A novel hybrid feature selection via Symmetrical Uncertainty ranking based local memetic search algorithm’, Knowledge-Based Systems, vol. 23, no. 6, pp.580-585.
  74. Xie, J, Lei, J, Xie, W, Shi, Y & Liu, X 2013, ‘Two-stage hybrid feature selection algorithms for diagnosing erythemato-squamous diseases’, Health Information Science and Systems, vol.1, no.10, pp.2-14.
  75. Naseriparsa, M, Bidgoli, AM & Varaee, T 2013, ‘A Hybrid Feature Selection method to improve performance of a group of classification algorithms’, International Journal of Computer Applications, vol. 69, no. 17.
  76. Huda, S, Yearwood, J & Stranieri, A 2011, ‘Hybrid wrapper-filter approaches for input feature selection using maximum relevance-minimum redundancy and artificial neural network input gain measurement approximation (ANNIGMA)’, Proceedings of the Thirty-Fourth Australasian Computer Science Conference, Australia, vol. 113, pp. 43-52.
  77. Gunal, S 2012, ‘Hybrid feature selection for text classification’, Turkish Journal of Electrical Engineering and Computer Sciences, vol. 20, no.2, pp.1296-1311.
  78. D'Alessandro, M, Esteller, R, Vachtsevanos, G, Hinson, A, Echauz, J & Litt, B 2003, ‘Epileptic seizure prediction using hybrid feature selection over multiple intracranial EEG electrode contacts: a report of four patients’, IEEE Transactions on Biomedical Engineering, vol. 50, no.5, pp.603-615.
  79. Yang, CS, Chuang, LY, Ke, CH & Yang, CH 2008, ‘A hybrid feature selection method for microarray classification’, IAENG International Journal of Computer Science, vol. 35, no. 3, pp. 1-3.
  80. Bermejo, P, Gámez, JA & Puerta, JM 2011, ‘A GRASP algorithm for fast hybrid (filter-wrapper) feature subset selection in high-dimensional datasets’, Pattern Recognition Letters, vol. 32, no.5, pp.701-711.
  81. Foithong, S, Pinngern, O & Attachoo, B 2012, ‘Feature subset selection wrapper based on mutual information and rough sets’, Expert Systems with Applications, vol. 39, no.1, pp.574-584.
  82. Coates, A & Ng, AY 2012, ‘Learning feature representations with K-means’, in Montavon, G, Orr, GB & Müller, KR (eds), Neural Networks: Tricks of the Trade, Springer, Berlin, Heidelberg, pp. 561-580. doi: 10.1007/978-3-642-35289-8_30.
Index Terms

Computer Science
Information Sciences

Keywords

Introduction to variable and feature selection, information gain-based feature selection, gain ratio-based feature selection, symmetric uncertainty-based feature selection, subset-based feature selection, ranking-based feature selection, wrapper-based feature selection, embedded-based feature selection, filter-based feature selection, hybrid feature selection, selecting feature from high-dimensional data.