Research Article

Experimental Comparison of Methods for Multi-label Classification in different Application Domains

by Passent El Kafrawy, Amr Mausad, Heba Esmail
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 114 - Number 19
Year of Publication: 2015
Authors: Passent El Kafrawy, Amr Mausad, Heba Esmail
DOI: 10.5120/20083-1666

Passent El Kafrawy, Amr Mausad, Heba Esmail. Experimental Comparison of Methods for Multi-label Classification in different Application Domains. International Journal of Computer Applications. 114, 19 (March 2015), 1-9. DOI=10.5120/20083-1666

@article{ 10.5120/20083-1666,
author = { Passent El Kafrawy, Amr Mausad, Heba Esmail },
title = { Experimental Comparison of Methods for Multi-label Classification in different Application Domains },
journal = { International Journal of Computer Applications },
issue_date = { March 2015 },
volume = { 114 },
number = { 19 },
month = { March },
year = { 2015 },
issn = { 0975-8887 },
pages = { 1-9 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume114/number19/20083-1666/ },
doi = { 10.5120/20083-1666 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Passent El Kafrawy
%A Amr Mausad
%A Heba Esmail
%T Experimental Comparison of Methods for Multi-label Classification in different Application Domains
%J International Journal of Computer Applications
%@ 0975-8887
%V 114
%N 19
%P 1-9
%D 2015
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Real-world applications have begun to adopt the multi-label paradigm. Multi-label classification adds an extra dimension: each example may be associated with several labels (several possible classes) rather than a single class, as in binary or multi-class classification. Although the number of potential multi-label applications keeps growing across domains, little effort has been devoted to comparing the available multi-label methods on data from different domains; a comprehensive overview of methods and metrics is therefore needed. In this study, we experimentally evaluate 11 methods for multi-label learning using 6 evaluation measures over seven benchmark datasets. The comparison reveals that, for both the example-based and the label-based evaluation measures, the best performing method on all measures is ECC (Ensemble of Classifier Chains) when the C4.5 decision tree is used as the single-label base learner.
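
The reported best configuration, ECC with a C4.5 base learner, can be set up with the Mulan library for multi-label learning in Java, which builds on Weka's J48 implementation of C4.5. The sketch below is a minimal, hypothetical example under that assumption; the dataset file names (emotions.arff, emotions.xml), the number of chains, and the number of cross-validation folds are placeholders, not the exact experimental setup of this paper, and the class names and constructor arguments should be checked against the installed Mulan version.

// Minimal sketch: cross-validating an Ensemble of Classifier Chains (ECC)
// with a C4.5 (J48) base learner using Mulan on top of Weka.
import mulan.classifier.transformation.EnsembleOfClassifierChains;
import mulan.data.MultiLabelInstances;
import mulan.evaluation.Evaluator;
import mulan.evaluation.MultipleEvaluation;
import weka.classifiers.trees.J48;

public class EccExample {
    public static void main(String[] args) throws Exception {
        // A multi-label dataset in Mulan format: an ARFF file plus an XML file
        // listing which attributes are labels (file names are placeholders).
        MultiLabelInstances dataset =
                new MultiLabelInstances("emotions.arff", "emotions.xml");

        // ECC with 10 chains, using confidences and sampling with replacement,
        // and J48 (C4.5) as the single-label base classifier.
        EnsembleOfClassifierChains ecc =
                new EnsembleOfClassifierChains(new J48(), 10, true, true);

        // 10-fold cross-validation; the result aggregates example-based and
        // label-based measures such as Hamming loss and micro/macro F1.
        Evaluator evaluator = new Evaluator();
        MultipleEvaluation results = evaluator.crossValidate(ecc, dataset, 10);
        System.out.println(results);
    }
}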

Index Terms

Computer Science
Information Sciences

Keywords

Multi-label classification
Multi-label learning
Data Mining