Research Article

Enhancing Recommendations of Items by Making Some Changes in Layers of BERT Model

by Ashima Malik, S. Srinivasan, Piyush Prakash
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 186 - Number 3
Year of Publication: 2024
DOI: 10.5120/ijca2024923366

Ashima Malik, S. Srinivasan, Piyush Prakash. Enhancing Recommendations of Items by Making Some Changes in Layers of BERT Model. International Journal of Computer Applications. 186, 3 (Jan 2024), 28-36. DOI=10.5120/ijca2024923366

@article{10.5120/ijca2024923366,
  author     = {Ashima Malik and S. Srinivasan and Piyush Prakash},
  title      = {Enhancing Recommendations of Items by Making Some Changes in Layers of BERT Model},
  journal    = {International Journal of Computer Applications},
  issue_date = {Jan 2024},
  volume     = {186},
  number     = {3},
  month      = {Jan},
  year       = {2024},
  issn       = {0975-8887},
  pages      = {28-36},
  numpages   = {9},
  url        = {https://ijcaonline.org/archives/volume186/number3/33054-2024923366/},
  doi        = {10.5120/ijca2024923366},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
Abstract

Building on a bidirectional approach to modeling user behavior sequences in recommendation systems, this paper enhances the BERT4Rec model with advanced data preprocessing, sentiment analysis, and optimized embedding layers. The original BERT4Rec model applies a bidirectional self-attention network trained with a Cloze task, inspired by Bidirectional Encoder Representations from Transformers (BERT), to improve product recommendations on e-commerce websites. Traditional unidirectional models limit the power of hidden representations and impose a rigid order on historical user interactions; the bidirectional BERT4Rec model instead draws context from both directions. This paper improves on that foundation by integrating a fine-tuned BERT sentiment model to filter reviews and a cosine similarity module to strengthen collaborative filtering. Comprehensive experiments on Amazon review datasets show that the enhanced model achieves a recommendation accuracy of 93.2%, significantly outperforming the original model's 86%. These improvements establish a new benchmark for recommendation systems and point toward future research on explicit user modeling and the incorporation of item features.

References
  1. Akhtyamova, L., 2020, April. Named entity recognition in Spanish biomedical literature: Short review and BERT model. In 2020 26th Conference of Open Innovations Association (FRUCT) (pp. 1-7). IEEE. https://arrow.tudublin.ie/cgi/viewcontent.cgi?article=1016&context=ittscicon
  2. Balázs Hidasi and Alexandros Karatzoglou. 2018. Recurrent Neural Networks with Top-k Gains for Session-based Recommendations. In Proceedings of CIKM. ACM, New York, NY, USA, 843–852.
  3. Ciniselli, M., Cooper, N., Pascarella, L., Poshyvanyk, D., Di Penta, M. and Bavota, G., 2021, May. An empirical study on the usage of BERT models for code completion. In 2021 IEEE/ACM 18th International Conference on Mining Software Repositories (MSR) (pp. 108-119). IEEE. https://arxiv.org/pdf/2103.07115
  4. F. Maxwell Harper and Joseph A. Konstan. 2015. The MovieLens Datasets: History and Context. ACM Trans. Interact. Intell. Syst. 5, 4, Article 19 (Dec. 2015), 19 pages.
  5. Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. 2015. Distilling the knowledge in a neural network. In Deep Learning and Representation Learning Workshop.
  6. Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. 2016. Deep Residual Learning for Image Recognition. In Proceedings of CVPR. 770–778.
  7. Lee, J.S. and Hsiang, J., 2019. Patentbert: Patent classification with fine-tuning a pre-trained bert model. arXiv preprint arXiv:1906.02124. https://arxiv.org/pdf/1906.02124
  8. Lin, J., Liu, Y., Zeng, Q., Jiang, M. and Cleland-Huang, J., 2021, May. Traceability transformed: Generating more accurate links with pre-trained bert models. In 2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE) (pp. 324-335). IEEE. https://arxiv.org/pdf/2102.04411
  9. Lu, W., Jiao, J. and Zhang, R., 2020, October. TwinBERT: Distilling knowledge to twin-structured compressed BERT models for large-scale retrieval. In Proceedings of the 29th ACM International Conference on Information & Knowledge Management (pp. 2645-2652). https://arxiv.org/pdf/2002.06275
  10. Mozafari, M., Farahbakhsh, R. and Crespi, N., 2020. Hate speech detection and racial bias mitigation in social media based on BERT model. PloS one, 15(8), p.e0237861. https://doi.org/10.1371/journal.pone.0237861
  11. Nagy, A., Bial, B. and Ács, J., 2021. Automatic punctuation restoration with BERT models. arXiv preprint arXiv:2101.07343. https://arxiv.org/pdf/2101.07343
  12. Nozza, D., Bianchi, F. and Hovy, D., 2020. What the [mask]? making sense of language-specific BERT models. arXiv preprint arXiv:2003.02912. https://arxiv.org/pdf/2003.02912
  13. Petrov, A. and Macdonald, C., 2022, September. A Systematic Review and Replicability Study of BERT4Rec for Sequential Recommendation. In Proceedings of the 16th ACM Conference on Recommender Systems (pp. 436-447). https://arxiv.org/pdf/2207.07483
  14. Qiao, Y., Zhu, X. and Gong, H., 2022. BERT-Kcr: prediction of lysine crotonylation sites by a transfer learning method with pre-trained BERT models. Bioinformatics, 38(3), pp. 648-654. http://structpred.life.tsinghua.edu.cn/pdf/10.1093_bioinformatics_btab712.pdf
  15. Risch, J. and Krestel, R., 2020, May. Bagging BERT models for robust aggression identification. In Proceedings of the Second Workshop on Trolling, Aggression and Cyberbullying (pp. 55-61). https://aclanthology.org/2020.trac-1.9.pdf
  16. Rogers, A., Kovaleva, O. and Rumshisky, A., 2020. A primer in BERTology: What we know about how BERT works. Transactions of the Association for Computational Linguistics, 8, pp. 842-866. https://direct.mit.edu/tacl/article-pdf/doi/10.1162/tacl_a_00349/1923281/tacl_a_00349.pdf
  17. Ruining He and Julian McAuley. 2016. Fusing Similarity Models with Markov Chains for Sparse Sequential Recommendation. In Proceedings of ICDM. 191–200.
  18. Ruining He, Wang-Cheng Kang, and Julian McAuley. 2017. Translation-based Recommendation. In Proceedings of RecSys. ACM, New York, NY, USA, 161–169.
  19. Sepp Hochreiter and Jürgen Schmidhuber. 1997. Long Short-Term Memory. Neural Computation 9, 8 (Nov. 1997), 1735–1780.
  20. Shi, P. and Lin, J., 2019. Simple bert models for relation extraction and semantic role labeling. arXiv preprint arXiv:1904.05255. https://arxiv.org/pdf/1904.05255
  21. Sun, F., Liu, J., Wu, J., Pei, C., Lin, X., Ou, W. and Jiang, P., 2019, November. BERT4Rec: Sequential recommendation with bidirectional encoder representations from transformer. In Proceedings of the 28th ACM International Conference on Information and Knowledge Management (pp. 1441-1450). https://arxiv.org/pdf/1904.06690.pdf
  22. Sun, S., Cheng, Y., Gan, Z. and Liu, J., 2019. Patient knowledge distillation for bert model compression. arXiv preprint arXiv:1908.09355. https://arxiv.org/pdf/1908.09355
  23. Tsai, H., Riesa, J., Johnson, M., Arivazhagan, N., Li, X. and Archer, A., 2019. Small and practical BERT models for sequence labeling. arXiv preprint arXiv:1909.00100. https://arxiv.org/pdf/1909.00100
  24. Wang, Z., Ng, P., Ma, X., Nallapati, R. and Xiang, B., 2019. Multi-passage BERT: A globally normalized BERT model for open-domain question answering. arXiv preprint arXiv:1908.08167. https://arxiv.org/pdf/1908.08167.pdf
  25. Xiangnan He, Lizi Liao, Hanwang Zhang, Liqiang Nie, Xia Hu, and Tat-Seng Chua. 2017. Neural Collaborative Filtering. In Proceedings of WWW. 173–182.
Index Terms

Computer Science
Information Sciences

Keywords

Product Recommendation, BERT, BERT4Rec, SAS