Research Article

Similarity Learning in Many Core Architecture

by Arjeta Selimi-Rexha, Ali Mustafa Qamar
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 181 - Number 11
Year of Publication: 2018
DOI: 10.5120/ijca2018917697

Arjeta Selimi-Rexha, Ali Mustafa Qamar. Similarity Learning in Many Core Architecture. International Journal of Computer Applications 181, 11 (Aug 2018), 1-5. DOI=10.5120/ijca2018917697

@article{ 10.5120/ijca2018917697,
author = { Arjeta Selimi-Rexha, Ali Mustafa Qamar },
title = { Similarity Learning in Many Core Architecture },
journal = { International Journal of Computer Applications },
issue_date = { Aug 2018 },
volume = { 181 },
number = { 11 },
month = { Aug },
year = { 2018 },
issn = { 0975-8887 },
pages = { 1-5 },
numpages = { 5 },
url = { https://ijcaonline.org/archives/volume181/number11/29813-2018917697/ },
doi = { 10.5120/ijca2018917697 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Arjeta Selimi-Rexha
%A Ali Mustafa Qamar
%T Similarity Learning in Many Core Architecture
%J International Journal of Computer Applications
%@ 0975-8887
%V 181
%N 11
%P 1-5
%D 2018
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Many recent research works have pointed out that metric learning is far better than using default metrics such as the Euclidean distance or cosine similarity. Moreover, similarity learning based on cosine similarity has been shown to work better on many data sets that are not necessarily textual in nature. Nevertheless, similarity learning in nearest-neighbor algorithms has been inherently slow, owing to its O(d³) complexity. This shortcoming is addressed in this research, and a similarity learning algorithm for many-core architectures is proposed, whereby the Similarity Learning Algorithm (SiLA) is parallelized. The resulting algorithm is faster than the traditional one on many data sets because of its parallel nature. The results are confirmed on UCI data sets.
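As a rough illustration of why this workload suits many-core hardware, the sketch below scores a single query against every training example with one CUDA thread per example. It assumes a generalized-cosine similarity of the form s_A(x, y) = xᵀAy / (‖x‖ ‖y‖), with A set to the identity for simplicity; the exact similarity, normalization, and kernel layout used by the authors are not given on this page, so all names and parameters here are illustrative, not the paper's implementation.

// Minimal sketch (not the authors' implementation): score one query against all
// training points in parallel, one CUDA thread per training example.
// Assumes s_A(x, y) = x^T A y / (||x|| ||y||); SiLA's exact normalization may differ.
#include <cstdio>
#include <cmath>
#include <vector>
#include <cuda_runtime.h>

__global__ void similarity_scores(const float* train,  // n x d training matrix, row-major
                                  const float* z,      // z = A^T x, precomputed on the host
                                  float x_norm,        // ||x||
                                  float* scores,       // output, length n
                                  int n, int d) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    const float* y = train + (size_t)i * d;
    float dot = 0.0f, ynorm2 = 0.0f;
    for (int k = 0; k < d; ++k) {
        dot    += z[k] * y[k];    // (A^T x) . y  ==  x^T A y
        ynorm2 += y[k] * y[k];
    }
    scores[i] = dot / (x_norm * sqrtf(ynorm2) + 1e-12f);
}

int main() {
    const int n = 8, d = 4;
    std::vector<float> train(n * d), x(d, 1.0f), A(d * d, 0.0f);
    for (int i = 0; i < n * d; ++i) train[i] = (float)(i % 5) / 5.0f;  // toy data
    for (int k = 0; k < d; ++k) A[k * d + k] = 1.0f;  // A = I reduces to plain cosine

    // Precompute z = A^T x and ||x|| once on the host; the per-example scoring,
    // where a nearest-neighbor query spends its time, is what gets offloaded.
    std::vector<float> z(d, 0.0f);
    float x_norm = 0.0f;
    for (int k = 0; k < d; ++k) {
        for (int j = 0; j < d; ++j) z[j] += A[k * d + j] * x[k];
        x_norm += x[k] * x[k];
    }
    x_norm = std::sqrt(x_norm);

    float *d_train, *d_z, *d_scores;
    cudaMalloc(&d_train, n * d * sizeof(float));
    cudaMalloc(&d_z, d * sizeof(float));
    cudaMalloc(&d_scores, n * sizeof(float));
    cudaMemcpy(d_train, train.data(), n * d * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_z, z.data(), d * sizeof(float), cudaMemcpyHostToDevice);

    similarity_scores<<<(n + 255) / 256, 256>>>(d_train, d_z, x_norm, d_scores, n, d);

    std::vector<float> scores(n);
    cudaMemcpy(scores.data(), d_scores, n * sizeof(float), cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i) printf("s_A(x, y_%d) = %.4f\n", i, scores[i]);

    cudaFree(d_train); cudaFree(d_z); cudaFree(d_scores);
    return 0;
}

Precomputing z = Aᵀx turns each per-example score into a single dot product, so the device-side work is embarrassingly parallel across training examples, which is the property the abstract's many-core parallelization exploits.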

Index Terms

Computer Science
Information Sciences

Keywords

Similarity learning, SiLA algorithm, Parallel computing, Supervised machine learning, CUDA programming