    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/95119


    Title: 遞迴支持向量迴歸資料縮減法
    Recursive SVR data reduction
    Authors: 江政舉
    Contributors: 薛慧敏
    江政舉
    Keywords: Support Vector Machine
    Support Vector Regression
    Data reduction
    Date: 2009
    Issue Date: 2016-05-09 15:11:26 (UTC+8)
    Abstract: In recent years, the Support Vector Machine (SVM) and Support Vector Regression (SVR) have been widely applied to classification and prediction problems. In practice, however, data sets are often so large that computation times and costs become excessive. To address this problem, Zhang et al. (2006) and Chen, Wang and Cao (2008) developed two types of data reduction methods. The former, the Recursive Support Vector Machine (RSVM), reduces the number of variables: it identifies the important variables through cross-validation and a so-called contribution factor, and then performs classification using only those variables. The latter, Direct Sparse Kernel Regression (DSKR), achieves data reduction in support vector regression by using only a subset of the support vectors for prediction. This study extends the recursive support vector machine approach to support vector regression; the resulting method is called Recursive Support Vector Regression (RSVR). Through cross-validation and a contribution factor for each variable defined from the decision function, the important variables are selected, and only these variables are retained for subsequent analysis and prediction. The method is applied to two real chemical data sets, Triazines and Pyrim. The data are substantially reduced: only one-sixth to one-fifth of the variables are retained. The prediction performance after reduction is close to that of SVR on the full original data, but worse than that of DSKR.

    Keywords: Support Vector Machine, Support Vector Regression, data reduction
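    The recursive variable-selection idea summarized in the abstract can be pictured with a short sketch. The Python code below is only an illustration under stated assumptions, not the thesis's RSVR procedure: the "contribution factor" here is the absolute coefficient of a linear-kernel SVR (the thesis derives it from the SVR decision function), and the function name recursive_svr_reduction, the drop fraction, and the scikit-learn setup are illustrative choices.

# Minimal sketch of RSVM/RSVR-style recursive variable elimination for SVR.
# Assumption: variables are ranked by |w_j| from a linear-kernel SVR, which
# stands in for the decision-function-based contribution factor in the thesis.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score


def recursive_svr_reduction(X, y, drop_frac=0.2, min_vars=2, cv=5):
    """Return the indices of the retained variables (illustrative only)."""
    active = np.arange(X.shape[1])                 # variables still in play
    best_subset, best_score = active.copy(), -np.inf

    while active.size > min_vars:
        model = SVR(kernel="linear", C=1.0, epsilon=0.1)
        # cross-validated error of the current variable subset
        score = cross_val_score(model, X[:, active], y, cv=cv,
                                scoring="neg_mean_squared_error").mean()
        if score > best_score:
            best_score, best_subset = score, active.copy()

        # rank variables by |w_j| of a linear SVR fit on the current subset
        w = np.abs(model.fit(X[:, active], y).coef_).ravel()
        n_drop = max(1, int(drop_frac * active.size))
        active = active[np.argsort(w)[n_drop:]]    # drop the weakest variables

    return best_subset


# Toy usage: only the first two of 30 variables actually drive the response.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))
    y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=200)
    print("retained variables:", sorted(recursive_svr_reduction(X, y)))

    Keeping the subset with the best cross-validated error mirrors the RSVM-style recursion described above; the exact ranking rule, kernel, and stopping criterion in the thesis may differ.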
    Reference: I. Chinese references

    1. 呂奇傑, 李天行, 高人龍, 黃敏菁 (2009) "Application of Support Vector Machines and Support Vector Regression to Financial Time Series Forecasting" (in Chinese).

    II. English references

    1. Chen, X. F., Wang, S. T. and Cao, S. Q. (2008) "DSKR: Building Sparse Kernel Regression Directly", Journal of Applied Sciences, 8, 3407-3414.
    2. Vapnik, V. (1995) The Nature of Statistical Learning Theory, Springer, NY.
    3. Vapnik, V. (1998) Statistical Learning Theory, John Wiley and Sons, NY.
    4. Ross, D. K., UCI Machine Learning Repository, http://www.ics.uci.edu/~mlearn/MLRepository.html. Irvine, CA: University of California, School of Information and Computer Science.
    5. Smola, A. J. and Schölkopf, B. (2004) "A Tutorial on Support Vector Regression", Statistics and Computing, 14, 199-222.
    6. Smola, A. J., Schölkopf, B. and Williamson, R. C. (2000) "New Support Vector Algorithms", Neural Computation, 12, 1207-1245.
    7. Wikipedia, Mercer's theorem, http://en.wikipedia.org/wiki/Mercer's_theorem (as of June 21, 2009).
    8. Wu, M., Schölkopf, B. and Bakır, G. (2006) "A Direct Method for Building Sparse Kernel Learning Algorithms", Journal of Machine Learning Research, 7, 603-624.
    9. Zhang, X. G., Lu, X., Shi, Q., Xu, X. Q., Leung, H. C., Harris, L. N., Iglehart, J. D., Miron, A., Liu, J. S. and Wong, W. H. (2006) "Recursive SVM Feature Selection and Sample Classification for Mass-Spectrometry and Microarray Data", BMC Bioinformatics, 7:197.
    Description: Master's thesis
    National Chengchi University
    Department of Statistics
    96354014
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0096354014
    Data Type: thesis
    Appears in Collections: [Department of Statistics] Theses




