    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/159298


    Title: 由交易紀錄建立客戶輪廓以進行大型語言模型之少樣本商品推薦
    Leveraging User Personas from Transactions for Few-shot Item Recommendation with Large Language Models
    Authors: 張祐誠
    Chang, Yu-Cheng
    Contributors: 沈錳坤
    Shan, Man-Kwan
    張祐誠
    Chang, Yu-Cheng
    Keywords: 大型語言模型 (Large Language Models)
    客戶輪廓 (Customer Persona)
    推薦系統 (Recommendation System)
    直接推薦 (Direct Recommendation)
    少樣本學習 (Few-shot Learning)
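    The title and keywords above describe the general approach: condense a customer's transaction records into a persona description, then use that persona in a few-shot prompt asking a large language model to recommend items directly from a candidate list. The Python sketch below illustrates only this general idea; the data layout, prompt wording, and function names are hypothetical assumptions for illustration and are not taken from the thesis.

        # Hypothetical sketch: build a persona string from transactions and
        # assemble a few-shot direct-recommendation prompt. The prompt format
        # and field names are illustrative assumptions, not the thesis's design.
        from collections import Counter

        def build_persona(transactions):
            """Summarize (item, category) purchases into a short persona string."""
            counts = Counter(category for _, category in transactions)
            top_categories = [category for category, _ in counts.most_common(3)]
            recent_items = [item for item, _ in transactions[-5:]]
            return (f"Frequently buys {', '.join(top_categories)}; "
                    f"recent purchases: {', '.join(recent_items)}.")

        def few_shot_prompt(persona, examples, candidates):
            """Assemble a few-shot prompt from (persona, candidates, answer) demos."""
            shots = "\n\n".join(
                f"Persona: {p}\nCandidates: {', '.join(c)}\nRecommended: {r}"
                for p, c, r in examples
            )
            return (f"{shots}\n\n"
                    f"Persona: {persona}\n"
                    f"Candidates: {', '.join(candidates)}\n"
                    f"Recommended:")

        if __name__ == "__main__":
            history = [("espresso beans", "coffee"), ("moka pot", "coffee"),
                       ("milk frother", "kitchen"), ("pour-over kettle", "coffee")]
            demo = [("Frequently buys running gear; recent purchases: trail socks.",
                     ["yoga mat", "running cap", "novel"], "running cap")]
            prompt = few_shot_prompt(build_persona(history), demo,
                                     ["cold brew bottle", "desk lamp", "hand grinder"])
            print(prompt)

    A real system would send the assembled prompt to an LLM and parse the reply (structured-output constraints, as in reference [9], are a common way to keep that reply machine-readable); the model call is omitted here so the sketch stays self-contained.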
    Date: 2025
    Issue Date: 2025-09-01 16:19:48 (UTC+8)
    Reference: [1] Tom Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared D Kaplan, et al., Language Models are Few-Shot Learners, Advances in Neural Information Processing Systems, 2020
    [2] Yashar Deldjoo, Zhankui He, Julian J. McAuley, Anton Korikov, Scott Sanner, et al., A Review of Modern Recommender Systems Using Generative Models (Gen-RecSys), In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2024
    [3] Wei Jin, Haitao Mao, Zheng Li, Haoming Jiang, Chen Luo, et al., Amazon-M2: A Multilingual Multi-locale Shopping Session Dataset for Recommendation and Text Generation, In Proceedings of the 37th International Conference on Neural Information Processing Systems, 2023
    [4] Jinhyuk Lee, Feiyang Chen, Sahil Dua, Daniel Cer, Madhuri Shanbhogue, et al., Gemini Embedding: Generalizable Embeddings from Gemini, CoRR, 2025
    [5] Jianghao Lin, Xinyi Dai, Yunjia Xi, Weiwen Liu, Bo Chen, et al., How Can Recommender Systems Benefit from Large Language Models: A Survey, ACM Transactions on Information Systems, Vol. 43, No. 2, 2025
    [6] Dong-Ho Lee, Adam Kraft, Long Jin, Nikhil Mehta, Taibai Xu, et al., STAR: A Simple Training-free Approach for Recommendations using Large Language Models, CoRR, 2024
    [7] Junling Liu, Chao Liu, Peilin Zhou, Qichen Ye, Dading Chong, et al., LLMRec: Benchmarking Large Language Models on Recommendation Task, CoRR, 2023
    [8] Junling Liu, Chao Liu, Peilin Zhou, Renjie Lv, Kang Zhou, et al., Is ChatGPT a Good Recommender? A Preliminary Study, In Proceedings of the 1st Workshop on Recommendation with Generative Models, co-located with the 32nd ACM International Conference on Information and Knowledge Management, 2023
    [9] Michael Xieyang Liu, Frederick Liu, Alexander J. Fiannaca, Terry Koo, Lucas Dixon, et al., We Need Structured Output: Towards User-centered Constraints on Large Language Model Output, Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 2024
    [10] Sebastian Lubos, Thi Ngoc Trang Tran, Alexander Felfernig, Seda Polat Erdeniz, and Viet-Man Le, LLM-generated Explanations for Recommender Systems, Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 2024
    [11] Yueqing Liang, Liangwei Yang, Chen Wang, Xiongxiao Xu, Philip S. Yu, et al., Taxonomy-Guided Zero-Shot Recommendations with LLMs, In Proceedings of the 31st International Conference on Computational Linguistics, 2025
    [12] Humza Naveed, Asad Ullah Khan, Shi Qiu, Muhammad Saqib, Saeed Anwar, et al., A Comprehensive Overview of Large Language Models, ACM Transactions on Intelligent Systems and Technology, 2025
    [13] Wenqi Sun, Ruobing Xie, Junjie Zhang, Wayne Xin Zhao, Leyu Lin, et al., Generative Next-Basket Recommendation, In Proceedings of the 17th ACM Conference on Recommender Systems, 2023
    [14] Lei Li, Yongfeng Zhang, Dugang Liu, and Li Chen, Large Language Models for Generative Recommendation: A Survey and Visionary Discussions, Joint 30th International Conference on Computational Linguistics and 14th International Conference on Language Resources and Evaluation, 2024
    [15] Joni Salminen, Chang Liu, Wenjing Pian, Jianxing Chi, Essi Häyhänen, et al., Deus Ex Machina and Personas from Large Language Models: Investigating the Composition of AI-Generated Persona Descriptions, In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 2024
    [16] Zhufeng Shao, Shoujin Wang, Qian Zhang, Wenpeng Lu, Zhao Li, An Empirical Study of Next-basket Recommendations, CoRR, 2023
    [17] Hanbing Wang, Xiaorui Liu, Wenqi Fan, Xiangyu Zhao, Venkataramana Kini, et al., Rethinking Large Language Model Architectures for Sequential Recommendations, CoRR, 2024
    [18] Likang Wu, Zhi Zheng, Zhaopeng Qiu, Hao Wang, Hongchao Gu, et al., A Survey on Large Language Models for Recommendation, World Wide Web, Vol. 27, No. 5, 2024
    [19] Shuyuan Xu, Wenyue Hua, and Yongfeng Zhang, OpenP5: An Open-Source Platform for Developing, Training, and Evaluating LLM-based Recommender Systems, In Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2024
    [20] Fan Yang, Zheng Chen, Ziyan Jiang, Eunah Cho, Xiaojiang Huang, et al., PALR: Personalization Aware LLMs for Recommendation, arXiv:2305.07622, 2023
    [21] Joyce Zhou, Yijia Dai, and Thorsten Joachims, Language-Based User Profiles for Recommendation, CoRR, 2024
    Description: Master's degree
    National Chengchi University
    In-service Master's Program, Department of Computer Science (資訊科學系碩士在職專班)
    112971013
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0112971013
    Data Type: thesis
    Appears in Collections: [In-service Master's Program, Department of Computer Science] Theses

    Files in This Item:

    File          Size      Format
    101301.pdf    1789 KB   Adobe PDF


    All items in 政大典藏 (NCCU Institutional Repository) are protected by copyright, with all rights reserved.



    Copyright Policy Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository and is provided free of charge for non-commercial, public-interest uses such as academic research and public education. Please use the content of this site in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

    2. This website has been produced with every effort to avoid infringing the rights of copyright owners. If you nevertheless find that any digital content on this site infringes a copyright owner's rights, please notify the site maintainers (nccur@nccu.edu.tw), who will immediately take remedial measures such as removing the work.