    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/31101


    Title: 決策樹形式知識整合之研究 The Research on Decision-Tree-Based Knowledge Integration
    Authors: 馬芳資
    Ma, Fang-tz
    Contributors: 林我聰
    Lin, Woo-Tsong
    馬芳資
    Ma, Fang-tz
    Keywords: Knowledge Integration
    Decision Tree
    Decision Tree Merging
    Decision Tree Pruning
    Date: 2004
    Issue Date: 2009-09-14 09:15:18 (UTC+8)
    Abstract: With the arrival of the knowledge economy, mastering knowledge helps an organization improve its competitiveness, so the creation, storage, application, and integration of knowledge have become heavily discussed topics. This research addresses the topic of knowledge integration. Among knowledge representations, decision-tree-based knowledge has a tree structure that can be presented graphically; it is simple and easy to understand. This research therefore studies knowledge integration for knowledge represented as decision trees.
    This research first proposes MODT (Merging Optional Decision Tree), a merging method that adds an option link to the original decision-tree structure to combine two subtrees sharing the same ancestor. Trees are merged pairwise: the nodes of the two decision trees are compared top-down, and the grafting technique is used to combine the knowledge of the two trees. The Strong Pattern Rule concept is then applied to improve the merged tree's predictive power.
    Second, when MODT merges two decision trees with different root nodes, a cyclic link is formed that destroys the tree structure, and the added option links increase storage requirements and are hard to maintain. This research therefore proposes the DTBMPA (Decision-Tree-Based Merging-Pruning Approach) to remedy these problems, adding a pruning procedure that simplifies the merged tree. The approach consists of three main procedures: decision-tree merging, merged-tree pruning, and decision-tree validation. Two primitive trees are first combined into a merged tree by the merging procedure; a pruned tree is then produced by the pruning procedure; finally, the validation procedure evaluates the pruned tree's accuracy. The proposed DTBMPA enlarges a tree's knowledge through merging and then obtains a more compact merged tree through pruning.
    The methods were validated with real credit data of credit-card customers. In the MODT experiments, the merged tree's accuracy was greater than or equal to that of both primitive trees in 79.5% of cases, and a statistical test showed the merged tree to be significantly more accurate than the primitive trees. In the DTBMPA experiments, the merged tree outperformed a single primitive tree in 90% of cases, and the pruned tree's accuracy was greater than or equal to the merged tree's in 80% of cases; statistical tests showed that both the merged tree and the pruned tree were significantly more accurate than a single tree. The pruned tree also had, on average, about 15% fewer nodes than the merged tree. In summary, both the proposed MODT and DTBMPA methods make the merged tree more accurate than a single tree, and DTBMPA yields a more compact merged tree.
    In addition, as an application of decision-tree-based knowledge integration, this research proposes a decision-tree-based knowledge discovery and prediction system architecture. Its main purpose is to provide a Web-based knowledge discovery and prediction system that supports knowledge-management functions such as knowledge learning, storage, integration, distribution, and application, so that important knowledge hidden inside an enterprise can be discovered and used for classification and prediction. The architecture contains three main subsystems: a knowledge-learning subsystem, a decision-tree-merging subsystem, and an on-line prediction subsystem; the decision-tree-merging subsystem applies the decision-tree-based knowledge-integration methods proposed in this research.
    Future research may address the following topics:
    1. Within the decision-tree-based knowledge-integration architecture, design the decision-tree knowledge-cleaning (preprocessing) unit so that the merged tree combines decision-tree knowledge of adequate quality.
    2. When combining multiple predictions, incorporate fuzzy-logic theory to handle borderline outcome values and improve the merged tree's predictive accuracy.
    3. For the decision tree itself, further study the merging of trees that select multiple attributes for splitting into subgroups.
    4. Study merging procedures for categorical attributes with different numbers of branches or different possible values, and for numeric attributes with different cut points.
    5. For pruning the merged tree, consider using a separate pruning example set, and compare the pruning effects and accuracy of different pruning methods.
    6. Study the restructuring of a decision tree after repeated merging and pruning, so that adjusting the tree structure improves run-time efficiency, allows the merged tree to adapt its knowledge to a changing environment, and permits observation of how the merged tree's structure evolves.
    7. For practical application, cooperate with a firm to build the decision-tree-based knowledge discovery and prediction system, design it to fit the firm's industry characteristics and business needs, and deploy it in the firm's internal operations, so as to accumulate the firm's knowledge and support managerial decision making.
    In the knowledge economy era, mastering knowledge can improve an organization's competitiveness. Knowledge creation, retention, application, and integration have therefore become prominent topics of discussion.
    Our research focuses on knowledge integration and related subjects. Decision trees are among the most common methods of knowledge representation: they show knowledge in a tree-shaped graph and are simple and easily understood. We therefore focus on decision-tree-based knowledge in connection with knowledge integration.
    First, this research proposes a method called MODT (Merging Optional Decision Tree), which merges two knowledge trees at a time and adds an option link to combine nodes that have the same ancestor. In MODT, the corresponding nodes of the two trees are compared top-down. When the nodes are the same, the number of samples is recounted and the degree of purity recalculated. When the nodes differ, the node of the second tree and its descendants are added to the first tree by the grafting technique. This yields a completely merged decision tree. The Strong Pattern Rule is then used to strengthen the forecast accuracy of the merged decision tree.
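    The top-down merge described above can be sketched as follows. This is only an illustrative sketch, not the thesis's actual implementation: the `Node` representation, the `merge` function, and the way option links are stored are all assumptions.

    ```python
    # Illustrative sketch of MODT-style pairwise merging (not the thesis's code).
    # A node tests one attribute; identical nodes have their sample counts pooled,
    # and a subtree testing a different attribute is kept reachable via an option link.

    class Node:
        def __init__(self, attr, count, children=None, options=None):
            self.attr = attr                 # attribute tested here (or class label at a leaf)
            self.count = count               # number of training samples reaching the node
            self.children = children or {}   # branch value -> child Node
            self.options = options or []     # option links to grafted subtrees

    def merge(a, b):
        """Merge tree b into tree a, top-down (MODT-style sketch)."""
        if a.attr == b.attr:
            # Same test: pool sample counts and recurse on matching branches.
            a.count += b.count
            for value, b_child in b.children.items():
                if value in a.children:
                    merge(a.children[value], b_child)
                else:
                    a.children[value] = b_child   # graft the missing branch
        else:
            # Different test: attach b's subtree through an option link.
            a.options.append(b)
        return a

    # Two toy trees that agree on the root test ("income") but differ below it.
    t1 = Node("income", 100, {"high": Node("good", 60), "low": Node("bad", 40)})
    t2 = Node("income", 50, {"high": Node("age", 30, {"young": Node("bad", 10),
                                                      "old": Node("good", 20)}),
                             "low": Node("bad", 20)})
    merged = merge(t1, t2)
    ```

    After merging, the root pools both trees' samples, the matching "low" leaves are combined, and the conflicting "high" subtree of the second tree survives as an option link.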
    Secondly, when the MODT method merges two trees that have different roots, the merged tree contains a cyclic link at the root and is no longer a tree structure. We therefore propose another approach, DTBMPA (Decision-Tree-Based Merging-Pruning Approach), to solve this problem. The approach has three steps. In the merging step, two primitive decision trees are combined into a merged tree to enlarge the knowledge of the primitive trees. In the pruning step, the merged tree is pruned into a pruned tree to cut off its biased branches. In the validating step, the performance of the pruned tree is validated.
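    The pruning step can be illustrated with generic reduced-error pruning against a held-out pruning set; the thesis's exact pruning criterion may differ, so the classes and functions below are assumptions for illustration only.

    ```python
    # Generic reduced-error pruning sketch (an assumption; the thesis's own
    # pruning criterion may differ). A subtree is collapsed to a leaf with its
    # majority class whenever doing so does not hurt accuracy on a pruning set.

    from collections import Counter

    class Node:
        def __init__(self, attr=None, children=None, label=None):
            self.attr = attr                 # attribute tested (None at a leaf)
            self.children = children or {}   # branch value -> child Node
            self.label = label               # class label (leaves only)

    def classify(node, sample):
        while node.attr is not None:
            node = node.children[sample[node.attr]]
        return node.label

    def accuracy(tree, data):
        return sum(classify(tree, x) == y for x, y in data) / len(data)

    def majority_label(data):
        return Counter(y for _, y in data).most_common(1)[0][0]

    def prune(node, data):
        """Bottom-up reduced-error pruning against a pruning set."""
        if node.attr is None or not data:
            return node
        for value, child in node.children.items():
            subset = [(x, y) for x, y in data if x[node.attr] == value]
            node.children[value] = prune(child, subset)
        # Try replacing this subtree with a majority-class leaf.
        leaf = Node(label=majority_label(data))
        if accuracy(leaf, data) >= accuracy(node, data):
            return leaf
        return node

    # Toy tree whose "size" test adds nothing on the pruning set below.
    tree = Node("color", {"red": Node(label="yes"),
                          "blue": Node("size", {"big": Node(label="no"),
                                                "small": Node(label="yes")})})
    prune_set = [({"color": "blue", "size": "big"}, "no"),
                 ({"color": "blue", "size": "small"}, "no"),
                 ({"color": "red", "size": "big"}, "yes")]
    pruned = prune(tree, prune_set)
    ```

    On this pruning set, every "blue" example is "no", so the useless "size" subtree collapses to a single "no" leaf while the informative "color" split is kept.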
    We took real credit-card user data as our sample data. In the MODT experiments, the merged tree was equal to or more accurate than both primitive trees in 79.5% of cases; this supports our proposition that the merged-decision-tree method achieves a better outcome for knowledge integration and accumulation. In the DTBMPA experiments, the merged tree's accuracy was greater than or equal to that of the primitive trees in 90% of cases, and the pruned tree's accuracy was greater than or equal to the merged tree's in 80% of cases. The pruned tree also had on average 15% fewer nodes than the merged tree. Thus both the MODT and DTBMPA methods improve the accuracy of the merged tree, and the DTBMPA method produces a more compact merged tree.
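    One simple way to test whether such per-run accuracy wins are statistically significant is a paired sign test over the experimental runs. The thesis does not name the exact statistical test it used, so the test below, and the per-run accuracies, are purely illustrative.

    ```python
    # Illustrative paired sign test (the thesis does not specify its exact test).
    # Given per-run accuracies of the merged tree and a primitive tree, count
    # wins and compute the one-sided binomial p-value for "merged is better".

    from math import comb

    def sign_test(merged_acc, primitive_acc):
        wins = sum(m > p for m, p in zip(merged_acc, primitive_acc))
        ties = sum(m == p for m, p in zip(merged_acc, primitive_acc))
        n = len(merged_acc) - ties          # ties are dropped, as is conventional
        # P(X >= wins) under the null hypothesis X ~ Binomial(n, 0.5)
        p_value = sum(comb(n, k) for k in range(wins, n + 1)) / 2 ** n
        return wins, n, p_value

    # Hypothetical per-run accuracies for 10 experimental runs.
    merged    = [0.82, 0.85, 0.80, 0.83, 0.81, 0.84, 0.79, 0.86, 0.83, 0.80]
    primitive = [0.78, 0.80, 0.80, 0.79, 0.77, 0.81, 0.80, 0.82, 0.79, 0.76]
    wins, n, p = sign_test(merged, primitive)
    ```

    With 8 wins out of 9 non-tied runs, the one-sided p-value is about 0.02, small enough to call the difference significant at the usual 5% level.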
    Finally, with respect to applications of Decision-Tree-Based Knowledge Integration, this research proposes an on-line decision-tree-based knowledge discovery and prediction system architecture. It can help businesses discover, store, and integrate their knowledge and apply it to decision making. It contains three components: a knowledge-learning system, a decision-tree-merging system, and an on-line prediction system; the DTBMPA method is used to design the decision-tree-merging system. Future directions of research are as follows.
    1. Designing the decision-tree preprocessing process in our Decision-Tree-Based Knowledge Integration Architecture.
    2. Using fuzzy theory to improve the accuracy of the merged tree when combining multiple predictions.
    3. Studying the merging of more complicated decision trees, such as model trees, linear decision trees, oblique decision trees, regression trees, and fuzzy trees.
    4. Studying the process of merging two trees that have different possible values for non-numeric attributes or different cut points for numeric attributes.
    5. Comparing the performance of other pruning methods with ours.
    6. Studying the restructuring of merged trees after merging many new trees, their adaptation to a changing environment, and the evolution of merged trees produced at different time stamps.
    7. Implementing the on-line decision-tree-based knowledge discovery system in a real business environment.
    Description: National Chengchi University
    Graduate Institute of Information Management
    87356503
    93
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0873565031
    Data Type: thesis
    Appears in Collections: [Department of Management Information Systems] Theses
