Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/36929
Title: | A comparison between two regularization methods for discriminant analysis and hypothesis testing (兩種正則化方法用於假設檢定與判別分析時之比較)
Authors: | Li, Deng-Yao (李登曜)
Contributors: | Huang, Tzee-Ming (黃子銘); Li, Deng-Yao (李登曜)
Keywords: | Ridge regression; Regularization; Cross-validation; Permutation test; Likelihood ratio test; Discriminant analysis
Date: | 2008 |
Issue Date: | 2009-09-18 20:10:59 (UTC+8) |
Abstract: | In statistics, high dimensionality causes many analytical problems. For example, in hypothesis testing for multivariate regression, when the number of observations is smaller than the dimension of the data, the sample covariance matrix is not invertible and the test cannot be carried out. This thesis is motivated by this high-dimensionality problem, encountered when testing the equality of the mean vectors of two multivariate normal populations, and by the related problem that arises in classification; it seeks a way to resolve both. The goal is to determine which of two regularization methods performs better for testing and for classification. The regularization methods of Warton and Friedman are applied to the testing and classification problems respectively, and the two are compared in terms of power and misclassification rate. The results show that neither regularization method is absolutely better; which one prevails depends on the assumptions about the populations.

High dimensionality causes many problems in statistical analysis. For instance, consider testing hypotheses about multivariate regression models. If the dimension of the multivariate response is larger than the number of observations, the sample covariance matrix is not invertible. Since the inverse of the sample covariance matrix is needed to compute the usual likelihood ratio test statistic (under normality), this singularity makes the test difficult to implement. The singularity of the sample covariance matrix is also a problem in classification when linear discriminant analysis (LDA) or quadratic discriminant analysis (QDA) is used.

Different regularization methods have been proposed to deal with the singularity of the sample covariance matrix for different purposes. Warton (2008) proposed a regularization procedure for testing, and Friedman (1989) proposed a regularization procedure for classification. Is it true that Warton's regularization works better for testing and Friedman's regularization works better for classification? To answer this question, simulation studies are conducted and the results are presented in this thesis. It is found that neither regularization method is superior to the other. |
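The following Python sketch is not part of the thesis; it is a minimal illustration, under simplifying assumptions, of the problem and the two kinds of remedies described in the abstract: when the dimension p exceeds the sample size n, the sample covariance matrix is rank deficient, and ridge-type shrinkage in the spirit of Warton (2008) (shrinking the sample correlation matrix toward the identity) or Friedman (1989) (blending a class covariance with the pooled covariance and a scaled identity) restores invertibility. The function names, the tuning parameters gamma and lam, and the simplified formulas are assumptions for illustration, not the thesis's implementation.

# Illustrative sketch (assumed, not from the thesis): singular sample
# covariance when p > n, and two ridge-type regularizations.
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 50                        # fewer observations than dimensions
X = rng.normal(size=(n, p))

S = np.cov(X, rowvar=False)          # p x p sample covariance matrix
print(np.linalg.matrix_rank(S))      # at most n - 1 = 19 < p, so S is singular

def warton_ridge(S, gamma):
    # Warton-style ridge: shrink the sample correlation matrix toward the
    # identity, then map back to the covariance scale.
    d = np.sqrt(np.diag(S))
    R = S / np.outer(d, d)                               # sample correlation matrix
    R_reg = gamma * R + (1.0 - gamma) * np.eye(S.shape[0])  # invertible for gamma < 1
    return R_reg * np.outer(d, d)

def friedman_rda(S_k, S_pooled, lam, gamma):
    # Friedman-style (RDA) shrinkage: blend a class covariance with the
    # pooled covariance, then shrink toward a scaled identity.
    q = S_k.shape[0]
    S_mix = (1.0 - lam) * S_k + lam * S_pooled
    return (1.0 - gamma) * S_mix + gamma * (np.trace(S_mix) / q) * np.eye(q)

S_reg = warton_ridge(S, gamma=0.8)
print(np.linalg.matrix_rank(S_reg))  # full rank: the inverse now exists

In the thesis the tuning parameters are chosen by cross-validation and the tests are calibrated by permutation, as the keywords indicate; the fixed value gamma=0.8 above is only a placeholder.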
Reference: | [1] M.J. Daniels and R.E. Kass. Shrinkage estimators for covariance matrices. Biometrics, 57(4):1173-1184, 2001.
[2] J.H. Friedman. Regularized discriminant analysis. Journal of the American Statistical Association, 84(405):165-175, 1989.
[3] J.P. Hoffbeck and D.A. Landgrebe. Covariance matrix estimation and classification with limited training data. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(7):763-767, 1996.
[4] W.J. Krzanowski, P. Jonathan, W.V. McCarthy, and M.R. Thomas. Discriminant analysis with singular covariance matrices: methods and applications to spectroscopic data. Applied Statistics, 44(1):101-115, 1995.
[5] D.M. Titterington. Common structure of smoothing techniques in statistics. International Statistical Review, 53(2):141-170, 1985.
[6] D.I. Warton. Penalized normal likelihood and ridge regularization of correlation and covariance matrices. Journal of the American Statistical Association, 103(481):340-349, 2008. |
Description: | Master's thesis, Graduate Institute of Statistics, National Chengchi University; student ID 96354019; academic year 97 (ROC calendar)
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0096354019 |
Data Type: | thesis |
Appears in Collections: | [Department of Statistics] Theses
All items in 政大典藏 are protected by copyright, with all rights reserved.