Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/111304
Title: | Comparison of the machine learning classification method DCG with other methods (a red wine example) / A supervised learning study of comparison between DCG tree and other machine learning methods in a wine quality dataset
Authors: | 楊俊隆 Yang, Jiun Lung |
Contributors: | 周珮婷 Chou, Pei Ting; 楊俊隆 Yang, Jiun Lung
Keywords: | Supervised learning; Unsupervised learning; Weighted data cloud geometry tree (WDCG)
Date: | 2017 |
Issue Date: | 2017-07-24 11:58:59 (UTC+8) |
Abstract: | With the arrival of the big data era, machine learning has become a popular topic. Its methods fall mainly into supervised and unsupervised learning, that is, classification and clustering. This study weights a distance matrix with the fitted results of logistic regression and takes the data cloud geometry (DCG) tree clustering method as its core: on a wine dataset containing a categorical response, it clusters first and then classifies, to determine whether better predictions can be obtained. It also compares the predictive performance of various supervised machine learning methods with that of unsupervised learning followed by a classifier. The thesis first introduces common classification and clustering algorithms and analyzes their advantages, disadvantages, and underlying assumptions; it then introduces the DCG-tree algorithm and details its execution steps; finally, it introduces the weighted DCG-tree (WDCG) algorithm, which applies the idea of weighting to the DCG tree, and uses the wine data to compare the prediction accuracy of the various classification and clustering methods.

Machine learning has become a popular topic since the coming of the big data era. Machine learning algorithms are often categorized as supervised or unsupervised, namely classification or clustering methods. In this study, we first introduced the advantages, disadvantages, and limitations of traditional classification and clustering algorithms. Next, we introduced the DCG-tree and WDCG algorithms and extended the idea of WDCG to cases with label size = 3. The distance matrix was modified by the fitted results of logistic regression. Lastly, using a real wine dataset, we compared the performance of WDCG with that of traditional classification methodologies. The study showed that using an unsupervised learning algorithm with logistic regression as a classifier performs better than using only traditional classification methods.
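The pipeline described in the abstract (weight a distance matrix by logistic-regression fitted values, cluster, then classify) can be illustrated roughly as follows. This is only a minimal sketch, not the author's WDCG algorithm: the file name, the three-level quality binning, the multiplicative weighting rule, and the use of average-linkage clustering in place of the DCG tree are all assumptions.

```python
# Minimal sketch (not the thesis's exact WDCG implementation): weight a pairwise
# distance matrix by logistic-regression fitted probabilities, cluster on the
# weighted distances, then classify within clusters.
import numpy as np
import pandas as pd
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.linear_model import LogisticRegression

wine = pd.read_csv("winequality-red.csv", sep=";")   # UCI red-wine data (path assumed)
X = wine.drop(columns="quality").to_numpy()
y = np.digitize(wine["quality"], bins=[5, 7])        # 3 labels: low (<5), medium (5-6), high (>=7)

# Fitted class probabilities from a multinomial logistic regression.
p_hat = LogisticRegression(max_iter=5000).fit(X, y).predict_proba(X)

# Weight Euclidean feature distances so pairs with similar fitted probabilities stay close.
d_x = squareform(pdist(X))        # raw feature distances
d_p = squareform(pdist(p_hat))    # distances between fitted probabilities
d_w = d_x * (1.0 + d_p)           # one plausible weighting scheme (assumed)

# Cluster first (average linkage stands in for the DCG tree), then classify within clusters.
labels = fcluster(linkage(squareform(d_w), method="average"), t=3, criterion="maxclust")
for c in np.unique(labels):
    idx = labels == c
    if len(np.unique(y[idx])) > 1:  # need at least two classes present to fit a classifier
        acc = LogisticRegression(max_iter=5000).fit(X[idx], y[idx]).score(X[idx], y[idx])
        print(f"cluster {c}: n = {idx.sum()}, in-sample accuracy = {acc:.3f}")
```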
Description: | Master's thesis, Department of Statistics, National Chengchi University, 102354015
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0102354015 |
Data Type: | thesis |
Appears in Collections: | [Department of Statistics] Theses
Files in This Item:
File | Size | Format
401501.pdf | 857Kb | Adobe PDF
All items in 政大典藏 are protected by copyright, with all rights reserved.