Research Article

Enhancing E-Commerce Product Page Recommendations using Large Language Models

by Kailash Thiyagarajan
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 186 - Number 68
Year of Publication: 2025
Authors: Kailash Thiyagarajan
DOI: 10.5120/ijca2025924529

Kailash Thiyagarajan. Enhancing E-Commerce Product Page Recommendations using Large Language Models. International Journal of Computer Applications 186, 68 (Feb 2025), 49-54. DOI=10.5120/ijca2025924529

@article{10.5120/ijca2025924529,
author = {Kailash Thiyagarajan},
title = {Enhancing E-Commerce Product Page Recommendations using Large Language Models},
journal = {International Journal of Computer Applications},
issue_date = {Feb 2025},
volume = {186},
number = {68},
month = {Feb},
year = {2025},
issn = {0975-8887},
pages = {49-54},
numpages = {6},
url = {https://ijcaonline.org/archives/volume186/number68/enhancing-e-commerce-product-page-recommendations-using-large-language-models/},
doi = {10.5120/ijca2025924529},
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
Abstract

This paper outlines a comprehensive framework for integrating Large Language Models (LLMs) into e-commerce product page recommendation systems. It begins by presenting the problem statement, highlighting how traditional recommendation approaches struggle to capture the rich semantic nuances embedded in diverse textual sources such as product descriptions, user reviews, and Q&A content. It then reviews relevant advancements in recommender systems and natural language processing, setting the stage for the proposed solution. The architecture features a transformer-based embedding pipeline that encodes text into meaningful representations, an LLM-driven recommendation core that enhances personalization and relevance, and real-time inference strategies that ensure low-latency responses. To provide a practical view, the paper walks through the end-to-end flow of a user query, from the moment it is entered to the final personalized product suggestions displayed. It concludes with an analysis of key challenges, including scalability, bias, and privacy, along with future directions for optimizing LLM-based recommendation systems toward robust real-world performance.
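The core retrieval step the abstract describes — encoding product text into embeddings and ranking items by semantic closeness to a user query — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the random vectors stand in for transformer-produced embeddings, and `recommend` is a hypothetical helper name.

```python
import numpy as np

def cosine_similarity(query, items):
    # Cosine similarity between one query vector and a matrix of item vectors.
    query = query / np.linalg.norm(query)
    items = items / np.linalg.norm(items, axis=1, keepdims=True)
    return items @ query

def recommend(query_embedding, product_embeddings, product_ids, top_k=3):
    # Rank products by semantic similarity to the query representation
    # and return the top_k (id, score) pairs, highest score first.
    scores = cosine_similarity(query_embedding, product_embeddings)
    order = np.argsort(scores)[::-1][:top_k]
    return [(product_ids[i], float(scores[i])) for i in order]

# Toy example: random "embeddings" stand in for transformer encoder outputs.
rng = np.random.default_rng(0)
products = ["sku-1", "sku-2", "sku-3", "sku-4"]
item_vecs = rng.normal(size=(4, 8))
query_vec = item_vecs[2] + 0.05 * rng.normal(size=8)  # query near sku-3

print(recommend(query_vec, item_vecs, products, top_k=2))
```

In a production system of the kind the paper discusses, the LLM-driven core would re-rank or enrich this candidate set, and approximate nearest-neighbor indexes would replace the brute-force similarity scan to meet the low-latency requirement.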

Index Terms

Computer Science
Information Sciences
Recommender Systems
Natural Language Processing
Machine Learning
Retail Technology

Keywords

LLM, Transformer Models, Personalized E-commerce, Product Recommendations, Semantic Analysis