Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/130955
Title: | Convergence of Stochastic Gradient Descent for Ordinal Regression Model and Applications for Recommender Systems
Authors: | 陳冠廷 Chen, Kuan-Ting |
Contributors: | 翁久幸 Weng, Chiu-Hsing; 陳冠廷 Chen, Kuan-Ting
Keywords: | Matrix Factorization; Ordinal Regression; Stochastic Gradient Descent; Mini-Batch Stochastic Gradient Descent; Average Estimate
Date: | 2020 |
Issue Date: | 2020-08-03 17:31:12 (UTC+8) |
Abstract: | Matrix factorization is a popular Collaborative Filtering (CF) method. Koren and Sill (2011) proposed an ordinal regression model combined with a matrix factorization CF method. Compared with traditional matrix factorization, this approach is advantageous in applications because it outputs a full probability distribution of the user-item ratings rather than a single predicted score. Although their experiments showed superior accuracy, no publicly available software implements the method. In this thesis, we implement the model with Stochastic Gradient Descent (SGD) and discuss the numerical issues encountered. Since the approach involves ordinal regression models, we also study the convergence of the SGD parameter estimates for ordinal regression models.
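The abstract refers to SGD for ordinal regression without giving details. As a rough, self-contained illustration of the kind of update involved, the sketch below fits McCullagh's cumulative-logit (proportional odds) model [9] by plain per-observation SGD. It is not the thesis implementation (the abstract notes no software is publicly available); the function name sgd_proportional_odds, the learning rate, the threshold initialization, and the toy data are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    # clip to avoid overflow in exp for extreme linear predictors
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def sgd_proportional_odds(X, y, n_levels, lr=0.05, n_epochs=20, seed=0):
    """Plain per-observation SGD for a cumulative-logit (proportional odds) model:

        P(y <= k | x) = sigmoid(theta_k - x @ beta),  k = 0, ..., K-2,

    fit by minimizing the negative log-likelihood one observation at a time.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    theta = np.linspace(-1.0, 1.0, n_levels - 1)      # cut points, kept increasing
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            x, k = X[i], int(y[i])                    # class label k in {0, ..., K-1}
            eta = x @ beta
            # cumulative probabilities padded with F(-inf) = 0 and F(+inf) = 1
            F = np.concatenate(([0.0], sigmoid(theta - eta), [1.0]))
            f = F * (1.0 - F)                         # logistic density at the cut points
            p_k = max(F[k + 1] - F[k], 1e-12)         # P(y = k | x), floored for stability
            # gradient steps on -log p_k, using d/dz sigmoid(z) = sigmoid(z)(1 - sigmoid(z))
            beta -= lr * (f[k + 1] - f[k]) / p_k * x
            if k <= n_levels - 2:
                theta[k] -= lr * (-f[k + 1] / p_k)    # upper cut point of class k
            if k >= 1:
                theta[k - 1] -= lr * (f[k] / p_k)     # lower cut point of class k
            theta.sort()                              # crude guard to keep cut points ordered
    return beta, theta

# toy usage with hypothetical data: 5 rating levels, 3 covariates
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
latent = X @ np.array([1.0, -0.5, 0.3]) + rng.logistic(size=500)
y = np.digitize(latent, bins=[-2.0, -0.5, 0.5, 2.0])  # labels 0..4
beta_hat, theta_hat = sgd_proportional_odds(X, y, n_levels=5)
print(beta_hat, theta_hat)
```

In practice one would also track a running average of the iterates, in the spirit of the averaged estimate of Polyak and Juditsky [10] and Ruppert [12] listed among the keywords, to stabilize the final parameter estimate.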
Reference: |
[1] Léon Bottou, Frank E. Curtis, and Jorge Nocedal. Optimization Methods for Large-Scale Machine Learning. SIAM Review, 60(2):223–311, 2018.
[2] Yixin Fang, Jinfeng Xu, and Lei Yang. Online Bootstrap Confidence Intervals for the Stochastic Gradient Descent Estimator. The Journal of Machine Learning Research, 19(1):3053–3073, 2018.
[3] Simon Funk. Netflix Update: Try This at Home, 2006.
[4] F. Maxwell Harper and Joseph A. Konstan. The MovieLens Datasets: History and Context. ACM Transactions on Interactive Intelligent Systems (TIIS), 5(4):1–19, 2015.
[5] Jack Kiefer and Jacob Wolfowitz. Stochastic Estimation of the Maximum of a Regression Function. The Annals of Mathematical Statistics, 23(3):462–466, 1952.
[6] Yehuda Koren. Factorization Meets the Neighborhood: A Multifaceted Collaborative Filtering Model. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 426–434, 2008.
[7] Yehuda Koren, Robert Bell, and Chris Volinsky. Matrix Factorization Techniques for Recommender Systems. Computer, 42(8):30–37, 2009.
[8] Yehuda Koren and Joe Sill. OrdRec: An Ordinal Model for Predicting Personalized Item Rating Distributions. In Proceedings of the 5th ACM Conference on Recommender Systems, pages 117–124, 2011.
[9] Peter McCullagh. Regression Models for Ordinal Data. Journal of the Royal Statistical Society: Series B (Methodological), 42(2):109–127, 1980.
[10] Boris T. Polyak and Anatoli B. Juditsky. Acceleration of Stochastic Approximation by Averaging. SIAM Journal on Control and Optimization, 30(4):838–855, 1992.
[11] Herbert Robbins and Sutton Monro. A Stochastic Approximation Method. The Annals of Mathematical Statistics, pages 400–407, 1951.
[12] David Ruppert. Efficient Estimations from a Slowly Convergent Robbins-Monro Process. Technical report, Cornell University Operations Research and Industrial Engineering, 1988.
Description: | Master's thesis, National Chengchi University, Department of Statistics, 107354012
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0107354012 |
Data Type: | thesis |
DOI: | 10.6814/NCCU202000780 |
Appears in Collections: | [Department of Statistics] Theses
Files in This Item:
File | Description | Size | Format
401201.pdf | | 441 KB | Adobe PDF
All items in 政大典藏 are protected by copyright, with all rights reserved.