Research Article

Analysis of Hybrid Neural Network for Improved Performance

by Zainab Khalid Awan, Aamir Khan, Anam Iftikhar, Sadia Zahid, Anam Malik
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 50 - Number 1
Year of Publication: 2012
DOI: 10.5120/7733-0681

Zainab Khalid Awan, Aamir Khan, Anam Iftikhar, Sadia Zahid, Anam Malik. Analysis of Hybrid Neural Network for Improved Performance. International Journal of Computer Applications. 50, 1 (July 2012), 8-17. DOI=10.5120/7733-0681

@article{10.5120/7733-0681,
  author     = {Zainab Khalid Awan and Aamir Khan and Anam Iftikhar and Sadia Zahid and Anam Malik},
  title      = {Analysis of Hybrid Neural Network for Improved Performance},
  journal    = {International Journal of Computer Applications},
  issue_date = {July 2012},
  volume     = {50},
  number     = {1},
  month      = {July},
  year       = {2012},
  issn       = {0975-8887},
  pages      = {8-17},
  numpages   = {9},
  url        = {https://ijcaonline.org/archives/volume50/number1/7733-0681/},
  doi        = {10.5120/7733-0681},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
Abstract

In this paper we take a close look at the hybrid neural network model, which is obtained by combining two Artificial Neural Networks (ANNs): the first network performs feature extraction and the second performs prediction. The paper explores the classifying ability of the proposed hybrid model. We analyze the performance of the model, a compound characteristic of which prediction accuracy is the most important component; increasing the prediction accuracy therefore enhances the overall performance of the model. The model falls under the umbrella of pattern recognition and incorporates several data mining techniques. Kernel Principal Component Analysis (KPCA) is applied in the pre-processing stage to ease subsequent analysis. By the end of the paper, the key factors that enhance the accuracy of the model are identified and their roles explained. It is also shown that a single ANN model's performance deteriorates on an unseen problem much more than the hybrid model's does. The aim is to provide a model with better performance and accuracy. The paper also focuses on real-world applications of the model.
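As a rough, illustrative sketch of the pipeline described above (KPCA pre-processing, a first ANN used as a feature extractor, a second ANN used as the predictor), the Python code below chains these stages with scikit-learn. The dataset, layer sizes, kernel choice, and other hyper-parameters are assumptions made for the example only and are not taken from the paper.

# Minimal sketch of a hybrid pipeline: KPCA pre-processing, a first ANN
# reused as a feature extractor, and a second ANN trained as the predictor.
# Dataset and hyper-parameters below are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)          # stand-in binary classification task
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# 1. Pre-processing stage: scale the inputs and project them with Kernel PCA.
scaler = StandardScaler().fit(X_train)
kpca = KernelPCA(n_components=10, kernel="rbf").fit(scaler.transform(X_train))
Z_train = kpca.transform(scaler.transform(X_train))
Z_test = kpca.transform(scaler.transform(X_test))

# 2. First ANN: trained on the KPCA output, then reused as a feature extractor.
extractor = MLPClassifier(hidden_layer_sizes=(32,), activation="relu",
                          max_iter=2000, random_state=0).fit(Z_train, y_train)

def hidden_features(mlp, Z):
    """Forward-pass Z through the hidden layers of a fitted MLP and return
    the last hidden activations as learned features."""
    a = Z
    for W, b in zip(mlp.coefs_[:-1], mlp.intercepts_[:-1]):
        a = np.maximum(a @ W + b, 0.0)               # ReLU to match activation="relu"
    return a

F_train = hidden_features(extractor, Z_train)
F_test = hidden_features(extractor, Z_test)

# 3. Second ANN: the predictor, trained only on the extracted features.
predictor = MLPClassifier(hidden_layer_sizes=(16,), activation="relu",
                          max_iter=2000, random_state=0).fit(F_train, y_train)

print("hybrid accuracy on the held-out set:", predictor.score(F_test, y_test))

In this setup the second network never sees the raw inputs; it is trained only on the hidden-layer activations produced by the first network, which mirrors the feature-extraction/prediction split described in the abstract.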

Index Terms

Computer Science
Information Sciences

Keywords

Unseen Test Set (UTS), Seen Test Set (STS)