Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/110781
Title: | LASSO與其衍生方法之特性比較 (Property Comparison of LASSO and Its Derivative Methods) |
Authors: | Huang, Jau-Shiun (黃昭勳) |
Contributors: | Tsai, Chen-An (蔡政安); Hsueh, Huey-Miin (薛慧敏); Huang, Jau-Shiun (黃昭勳) |
Keywords: | Elastic Net; LASSO; Penalty function; Regression; Variable selection |
Date: | 2017 |
Issue Date: | 2017-07-11 11:25:28 (UTC+8) |
Abstract: | In this study, we compare several methods for estimating the coefficients of linear models: LASSO, Elastic Net, LAD-LASSO, EBLASSO, and EBENet. Unlike ordinary least squares (OLS), these methods perform coefficient estimation and variable selection simultaneously; that is, they eliminate unimportant predictors so that only the important ones remain in the model. In the age of big data, data sets continue to grow, and those with hundreds or even thousands of predictors are now common; for such data, variable selection becomes all the more essential. The primary goal of this thesis is to evaluate the properties, strengths, and weaknesses of these estimation methods through two simulation studies and two real-data applications. The simulation results show that each method has its own characteristics, and no single method performs best on all kinds of data. |
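The variable-selection behavior described in the abstract can be illustrated with a minimal sketch (not taken from the thesis; the data, tuning parameters, and scikit-learn implementations are invented here for illustration): an L1 penalty shrinks the coefficients of unimportant predictors to exactly zero, whereas OLS keeps every predictor in the model.

```python
# Minimal illustration (not from the thesis) of variable selection by
# penalized regression: LASSO and Elastic Net zero out unimportant
# coefficients, while OLS retains all predictors.
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
# Only the first three predictors truly matter; the rest are noise.
beta = np.array([3.0, -2.0, 1.5] + [0.0] * (p - 3))
y = X @ beta + rng.normal(scale=0.5, size=n)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)          # pure L1 penalty
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)  # mixed L1/L2 penalty

print("OLS nonzero coefficients:        ", int(np.sum(ols.coef_ != 0)))
print("LASSO nonzero coefficients:      ", int(np.sum(lasso.coef_ != 0)))
print("Elastic Net nonzero coefficients:", int(np.sum(enet.coef_ != 0)))
```

With this setup, OLS estimates all ten coefficients as nonzero, while the penalized fits retain the three true signals and drop most or all of the noise predictors; the penalty strengths (`alpha`, `l1_ratio`) are arbitrary choices for the sketch, not values recommended by the thesis.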
Reference: |
Huang, S.-B. The meaning of elevated prostate-specific antigen (PSA) [in Chinese]. Retrieved May 17, 2017, from http://www.kmuh.org.tw/www/kmcj/data/10306/11.htm
Tsai, C.-A. (2009). Microarray Data Analysis [in Chinese]. Biostatistics Center, China Medical University.
Cai, X., Huang, A. and Xu, S. (2011). Fast empirical Bayesian LASSO for multiple quantitative trait locus mapping. BMC Bioinformatics, 12, 211.
Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression. Ann. Statist., 32, 407-499.
Gao, X.L. and Huang, J. (2010). Asymptotic analysis of high-dimensional LAD regression with Lasso. Statistica Sinica, 20, 1485-1506.
Gill, P., Murray, W. and Wright, M. (1981). Practical Optimization. New York: Academic Press.
Hoerl, A. and Kennard, R. (1988). Ridge regression. Encyclopedia of Statistical Sciences, 8, 129-136.
Huang, A., Xu, S. and Cai, X. (2015). Empirical Bayesian elastic net for multiple quantitative trait locus mapping. Heredity, 114, 107-115.
Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. R. Statist. Soc. B, 58, 267-288.
Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. J. R. Statist. Soc. B, 67, 301-320. |
Description: | Master's thesis, Department of Statistics, National Chengchi University. Student ID: 104354012 |
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0104354012 |
Data Type: | thesis |
Appears in Collections: | [Department of Statistics (統計學系)] Theses
Files in This Item:
File | Size | Format
401201.pdf | 1445Kb | Adobe PDF
All items in the NCCU repository (政大典藏) are protected by copyright, with all rights reserved.