Research Article

Analysis of Randomized Performance of Bias Parameters and Activation Function of Extreme Learning Machine

by Prafull Pandey, Ram Govind Singh
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 135 - Number 1
Year of Publication: 2016
DOI: 10.5120/ijca2016908274

Prafull Pandey, Ram Govind Singh. Analysis of Randomized Performance of Bias Parameters and Activation Function of Extreme Learning Machine. International Journal of Computer Applications 135, 1 (February 2016), 23-28. DOI=10.5120/ijca2016908274

@article{ 10.5120/ijca2016908274,
author = { Prafull Pandey, Ram Govind Singh },
title = { Analysis of Randomized Performance of Bias Parameters and Activation Function of Extreme Learning Machine },
journal = { International Journal of Computer Applications },
issue_date = { February 2016 },
volume = { 135 },
number = { 1 },
month = { February },
year = { 2016 },
issn = { 0975-8887 },
pages = { 23-28 },
numpages = { 6 },
url = { https://ijcaonline.org/archives/volume135/number1/24014-2016908274/ },
doi = { 10.5120/ijca2016908274 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Prafull Pandey
%A Ram Govind Singh
%T Analysis of Randomized Performance of Bias Parameters and Activation Function of Extreme Learning Machine
%J International Journal of Computer Applications
%@ 0975-8887
%V 135
%N 1
%P 23-28
%D 2016
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In artificial intelligence, classification is the process of assigning entities to classes on the basis of information provided by a dataset. The Extreme Learning Machine (ELM) is one of the more efficient classifiers. An ELM is formed of interconnected layers, each containing many nodes (neurons). The input layer is connected to the hidden layer through random weights, and the output layer is produced with the help of an activation function (transfer function). Activation functions are non-linear, and different activation functions may produce different outputs on the same dataset; not every activation function is suited to every type of classification problem. This paper shows how average test accuracy varies with different activation functions, and also how much performance varies due to the selection of random bias parameters between the input and hidden layers of the ELM.
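The abstract's description of an ELM (random input weights and biases, a non-linear hidden layer, and an analytically solved output layer) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names `elm_fit` and `elm_predict`, the uniform [-1, 1] initialization range, and the hidden-layer size are assumptions for the sketch.

```python
import numpy as np

def elm_fit(X, T, n_hidden=50, activation=np.tanh, seed=None):
    """Train a single-hidden-layer ELM.

    Input weights W and biases b are drawn at random (never trained);
    only the output weights beta are computed, via the Moore-Penrose
    pseudoinverse of the hidden-layer output matrix H.
    """
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random hidden biases
    H = activation(X @ W + b)                                # hidden-layer output
    beta = np.linalg.pinv(H) @ T                             # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta, activation=np.tanh):
    """Forward pass: hidden activations times the learned output weights."""
    return activation(X @ W + b) @ beta
```

Because `activation` is a parameter, the same sketch can be rerun with, say, `np.tanh` versus a logistic sigmoid (and with different random seeds for the biases) to reproduce the kind of comparison the paper describes: same dataset, different activation function or bias draw, different test accuracy.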

Index Terms

Computer Science
Information Sciences

Keywords

Extreme Learning Machine, feedforward network, neural network, classification