Research Article

Efficiency of Multilayer Perceptron Neural Networks Powered by Multi-Verse Optimizer

by Abdullah M. Shoeb, Mohammad F. Hassanin
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 178 - Number 42
Year of Publication: 2019
Authors: Abdullah M. Shoeb, Mohammad F. Hassanin
DOI: 10.5120/ijca2019919340

Abdullah M. Shoeb, Mohammad F. Hassanin. Efficiency of Multilayer Perceptron Neural Networks Powered by Multi-Verse Optimizer. International Journal of Computer Applications 178, 42 (Aug 2019), 48-55. DOI=10.5120/ijca2019919340

@article{10.5120/ijca2019919340,
  author     = {Abdullah M. Shoeb and Mohammad F. Hassanin},
  title      = {Efficiency of Multilayer Perceptron Neural Networks Powered by Multi-Verse Optimizer},
  journal    = {International Journal of Computer Applications},
  issue_date = {Aug 2019},
  volume     = {178},
  number     = {42},
  month      = {Aug},
  year       = {2019},
  issn       = {0975-8887},
  pages      = {48-55},
  numpages   = {8},
  url        = {https://ijcaonline.org/archives/volume178/number42/30821-2019919340/},
  doi        = {10.5120/ijca2019919340},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Abdullah M. Shoeb
%A Mohammad F. Hassanin
%T Efficiency of Multilayer Perceptron Neural Networks Powered by Multi-Verse Optimizer
%J International Journal of Computer Applications
%@ 0975-8887
%V 178
%N 42
%P 48-55
%D 2019
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Artificial neural network models are applied to many problems because of their computational power, and the multi-layer perceptron (MLP) is a widely used paradigm. An MLP must be trained before use, and the training phase is an obstacle in building the solution model. The back-propagation algorithm, among other approaches, has been used for training; its disadvantage is the possibility of falling into a local minimum of the training error instead of reaching the global minimum. Recently, many metaheuristic methods have been developed to overcome this problem. In this work, an approach to training the MLP by the Multi-Verse Optimizer (MVO) is proposed. Implementing this approach on seven datasets and comparing the obtained results with six other metaheuristic techniques shows that MVO outperforms the other competitors in training the MLP.
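The MVO-based training described in the abstract replaces gradient descent with a population search over the network's flattened weight vector: each "universe" is a candidate weight set, its training error plays the role of the inflation rate, and universes exchange components through white/black-hole selection and wormhole jumps toward the best universe. The following is a minimal illustrative sketch only, not the authors' implementation; the network size, the XOR task, and all search parameters (population, iterations, WEP/TDR schedules, bounds) are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP: 2 inputs, 4 hidden units, 1 output; all weights in one flat vector.
D_IN, D_HID, D_OUT = 2, 4, 1
DIM = D_IN * D_HID + D_HID + D_HID * D_OUT + D_OUT  # 17 parameters

def mlp_forward(w, X):
    """Unpack a flat weight vector and run one forward pass."""
    i = 0
    W1 = w[i:i + D_IN * D_HID].reshape(D_IN, D_HID); i += D_IN * D_HID
    b1 = w[i:i + D_HID]; i += D_HID
    W2 = w[i:i + D_HID * D_OUT].reshape(D_HID, D_OUT); i += D_HID * D_OUT
    b2 = w[i:i + D_OUT]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)  # XOR targets

def mse(w):
    return float(np.mean((mlp_forward(w, X) - y) ** 2))

# Simplified MVO loop: universes = candidate weight vectors.
N, T, LB, UB = 30, 300, -5.0, 5.0
U = rng.uniform(LB, UB, (N, DIM))
for t in range(1, T + 1):
    fit = np.array([mse(u) for u in U])   # inflation rate ~ training error
    order = np.argsort(fit)
    U, fit = U[order], fit[order]         # best universe first (elitism)
    best = U[0].copy()
    norm = fit / fit.sum()                # normalised inflation rates
    WEP = 0.2 + t * (1.0 - 0.2) / T       # wormhole existence probability
    TDR = 1.0 - (t / T) ** 0.5            # travelling distance rate
    for i in range(1, N):
        for j in range(DIM):
            if rng.random() < norm[i]:    # black hole receives an object
                # roulette wheel: fitter universes are likelier white holes
                k = rng.choice(N, p=(1 - norm) / (1 - norm).sum())
                U[i, j] = U[k, j]
            if rng.random() < WEP:        # wormhole jump around the best universe
                step = TDR * rng.uniform(LB, UB) * rng.random()
                U[i, j] = best[j] + step if rng.random() < 0.5 else best[j] - step
    U = np.clip(U, LB, UB)

print("best XOR MSE:", mse(best))
```

Because the best universe is never perturbed, its error is non-increasing across iterations; the shrinking travelling distance rate moves the search from exploration toward local refinement around the current best weights, which is how the paper's approach avoids the local minima that trap back-propagation.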

References
  1. Kůrková, V. 1992. Kolmogorov's theorem and multilayer neural networks. Neural Networks, 5(3), 501–506.
  2. Mirjalili, S. 2015. How effective is the Grey Wolf optimizer in training multi-layer perceptrons. Applied Intelligence, 43(1), 150–161.
  3. Hassanin, M. F., Shoeb, A. M. and Hassanien, A.E., 2016. Grey wolf optimizer-based back-propagation neural network algorithm. In 2016 12th International Computer Engineering Conference (ICENCO) (pp. 213-218). IEEE.
  4. Prasad, C., Mohanty, S., Naik, B., Nayak, J., and Behera, H. S. 2015. An efficient PSO-GA based back propagation learning-MLP (PSO-GA-BP-MLP) for classification. In Computational Intelligence in Data Mining (vol. 1, pp. 517-527). Springer India.
  5. Das, G., Pattnaik, P. K., and Padhy, S. K. 2014. Artificial Neural Network trained by Particle Swarm Optimization for non-linear channel equalization. Expert Systems with Applications, 41(7), 3491–3496.
  6. Leung, F. H. F., Lam, H. K., Ling, S. H., and Tam, P. K. S. 2003. Tuning of the structure and parameters of a neural network using an improved genetic algorithm. IEEE Transactions on Neural Networks, 14(1), 79–88.
  7. Mirjalili, S. 2015. The ant lion optimizer. Advances in Engineering Software, 83, 80–98.
  8. Nawi, N. M., Khan, A., and Rehman, M. Z. 2013. A new back-propagation neural network optimized with cuckoo search algorithm. In International Conference on Computational Science and Its Applications (pp. 413-426). Springer Berlin Heidelberg.
  9. Nawi, N. M., Khan, A., Rehman, M. Z., Herawan, T., and Deris, M. M. 2014. Comparing performances of Cuckoo Search based Neural Networks. In Recent Advances on Soft Computing and Data Mining (pp. 163–172). Springer International Publishing.
  10. Beyer, H. G., and Schwefel, H. P. 2002. Evolution strategies – A comprehensive introduction. Natural Computing, 1(1), 3–52.
  11. Pereira, L. A., Rodrigues, D., Ribeiro, P. B., Papa, J. P., and Weber, S. A. 2014. Social-Spider Optimization-Based Artificial Neural Networks Training and Its Applications for Parkinson’s Disease Identification. In 2014 IEEE 27th International Symposium on Computer-Based Medical Systems (pp. 14-17). IEEE.
  12. Mirjalili, S., Mirjalili, S. M., and Hatamlou, A. 2016. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Computing & Applications, 27(2), 495–513.
  13. Blake, C., and Merz, C. J. 1998. UCI Repository of machine learning databases. Academic Press.
  14. Mirjalili, S., Mirjalili, S. M., and Lewis, A. 2014. Let a biogeography-based optimizer train your multi-layer perceptron. Information Sciences, 269, 188–209.
Index Terms

Computer Science
Information Sciences

Keywords

Training neural networks, back propagation, multi-verse optimizer, classification