Research Article

Weight Optimize by Automatic Unsupervised Clustering using Computation Intelligence

by C. Lowongtrakool, N. Hiransakolwong
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 50 - Number 21
Year of Publication: 2012
Authors: C. Lowongtrakool, N. Hiransakolwong
DOI: 10.5120/7930-1261

C. Lowongtrakool, N. Hiransakolwong. Weight Optimize by Automatic Unsupervised Clustering using Computation Intelligence. International Journal of Computer Applications 50, 21 (July 2012), 37-41. DOI=10.5120/7930-1261

@article{ 10.5120/7930-1261,
author = { C. Lowongtrakool, N. Hiransakolwong },
title = { Weight Optimize by Automatic Unsupervised Clustering using Computation Intelligence },
journal = { International Journal of Computer Applications },
issue_date = { July 2012 },
volume = { 50 },
number = { 21 },
month = { July },
year = { 2012 },
issn = { 0975-8887 },
pages = { 37-41 },
numpages = {5},
url = { https://ijcaonline.org/archives/volume50/number21/7930-1261/ },
doi = { 10.5120/7930-1261 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A C. Lowongtrakool
%A N. Hiransakolwong
%T Weight Optimize by Automatic Unsupervised Clustering using Computation Intelligence
%J International Journal of Computer Applications
%@ 0975-8887
%V 50
%N 21
%P 37-41
%D 2012
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Several techniques are applied to unsupervised clustering analysis, where the input is a dataset without answer classes and both the initial weights and the number of cluster groups must be defined. Among these factors (the clustering method, the weights, and the number of clusters), the most important parameter is the determination of the initial weights for the system: if the weights are well chosen at the starting point, the system can track and converge on the answers more rapidly and precisely. This paper therefore proposes a method that optimizes the system's weights with a computational intelligence technique for unsupervised clustering analysis. The experiment first finds the initial weights and then processes sample datasets from the UCI Machine Learning Repository, namely iris, balance, and wine. The results show that the classification efficiency rises to 99.3%, 83.6%, and 47.0%, respectively, and that the initial number of clusters k is found automatically. Consequently, the approach also reduces the number of candidate clusters that must be tried to reach an approximate answer.
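The paper's computational-intelligence weight optimizer is not described in this abstract, but the two ideas it names can be illustrated with a minimal, assumption-laden sketch in Python: how the choice of initial weights (here, cluster centroids) affects how quickly and how well an unsupervised clusterer fits the UCI iris data, and how a number of clusters k can be selected automatically. The use of k-means, scikit-learn, the adjusted Rand index, and a silhouette-score heuristic are all illustrative assumptions, not the authors' method.

# Illustrative sketch only -- not the authors' optimizer. It shows, on the UCI
# iris data: (1) how the initial weights (centroids) affect convergence speed
# and agreement with the true classes, and (2) choosing k automatically, here
# with a silhouette-score heuristic picked for the example.
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score, silhouette_score

X, y = load_iris(return_X_y=True)

# (1) Two initial-weight strategies for the same clusterer.
for init in ("random", "k-means++"):
    km = KMeans(n_clusters=3, init=init, n_init=10, random_state=0).fit(X)
    print(f"init={init:9s}  iterations={km.n_iter_:2d}  "
          f"agreement with true labels (ARI)={adjusted_rand_score(y, km.labels_):.3f}")

# (2) Score several candidate cluster counts and keep the best one.
scores = {k: silhouette_score(X, KMeans(n_clusters=k, n_init=10,
                                        random_state=0).fit_predict(X))
          for k in range(2, 7)}
best_k = max(scores, key=scores.get)
print(f"automatically selected k = {best_k}")

A better starting point typically cuts the iteration count and raises agreement with the known classes, which is the effect the paper quantifies on iris, balance, and wine.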

References
  1. A. Abraham, Meta learning evolutionary artificial neural networks, Neurocomputing, 2004, Vol. 56, pp. 1–38.
  2. A. J. Al-Shareef and M. F. Abbod, Neural networks initial weights optimisation, in Proceedings of the 12th International Conference on Modelling and Simulation (UKSim '10), 2010, pp. 57–61.
  3. A. Patrikainen and M. Meila, Comparing subspace clusterings, IEEE Transactions on Knowledge and Data Engineering, 2006, 18(7), pp. 902–916.
  4. Center for Machine Learning and Intelligent Systems, UCI Machine Learning Repository (2011), http://archive.ics.uci.edu/ml/.
  5. C. Zhang, H. Shao, Y. Li, Particle swarm optimization for evolving artificial neural network, IEEE Intl. Conf. on Systems, 2000, Vol. 4, pp. 2487–2490.
  6. E. Atashpaz Gargari, C. Lucas, Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition, IEEE Congress on Evolutionary Computation, 2007, pp. 4661–4667.
  7. S.-J. Han, S.-B. Cho, Evolutionary neural networks for anomaly detection based on the behavior of a program, IEEE Trans. Systems, 2006, Vol. 36, pp. 559–570.
  8. E. Muller, S. Gunnemann, I. Assent, and T. Seidl, Evaluating clustering in subspace projections of high dimensional data, PVLDB, 2009, 2(1), pp. 1270–1281.
  9. G. Yen and P. Meesad, Pattern classification by an incremental learning fuzzy neural network, in Proc. IJCNN, 1999, pp. 3230–3235.
  10. G. Yen and P. Meesad, Constructing a fuzzy expert system using the ILFN network and the genetic algorithm, in Proc. IEEE Int. Conf. Syst., Man, Cybern., 2000, pp. 1917–1922.
  11. A. Kizilay and S. Makal, A neural network solution for identification and classification of cylindrical targets above perfectly conducting flat surfaces, J. of Electromagn. Waves and Appl. 21 (2007) 2147–2156, doi:10.1163/156939307783152759.
  12. T. Kohonen and P. Somervuo, Self-Organizing Maps of Symbol Strings with Application to Speech Recognition, Neurocomputing, 1998, Vol. 21, No. 1-3, pp. 19–30.
  13. K. Miao, F. Chen and Z. G. Zhao, Stock price forecast based on bacterial colony RBF neural network, J. QingDao University 20 (2007) 50–54, doi: CNKI:SUN:QDDD.0.2007-02-011.
  14. M. C. P. de Souto, A. Yamazaki and T. B. Ludermir, Optimization of neural network weights and architecture for odor recognition using simulated annealing, in Proc. 2002 Intl. Joint Conf. on Neural Networks, 2002, Vol. 1, pp. 547–552.
  15. M. A. Mohamed, E. A. Soliman, and M. A. El-Gamal, Optimization and characterization of electromagnetically coupled patch antennas using RBF neural networks, J. of Electromagn. Waves and Appl. 20 (2006) 1101–1114.
  16. M. Yang, D. Kriegman, and N. Ahuja, Detecting Faces in Images: A Survey, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, vol. 24, no. 1, pp. 34–58.
  17. R. Poli, J. Kennedy, and T. Blackwell, Particle swarm optimization, Swarm Intelligence, 2007, vol. 1, pp. 33–37.
  18. T. Kohonen, Self-Organizing Maps, Springer Verlag (Berlin, 2001).
  19. W. M. Jenkins, Neural network weight training by mutation, J. Computers and Structures 84 (2006) 2107–2112, doi:10.1016/j.compstruc.2006.08.066.
  20. X. He, J. Zeng, J. Jie, Artificial neural network weights optimization design based on MEC algorithm, Conf. on Machine Learning and Cybernetics, 2004, Vol. 6, pp. 3361–3364.
  21. Y. Lee, S. H. Oh, and M. W. Kim, The effect of initial weights on premature saturation in back-propagation learning, in Proceedings of the International Joint Conference on Neural Networks, 1991, pp. 765–770.
Index Terms

Computer Science
Information Sciences

Keywords

Weight Optimize; Unsupervised Clustering; Computation Intelligence