Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/146786
Title: | 稀疏降維與規則學習應用於PCB電路板設計品質評估 Sparse Dimension Reduction and Rule Induction for Evaluating PCB Design Quality |
Authors: | 陳昱璇 Chen, Yu-Hsuan |
Contributors: | 莊皓鈞 周彥君 陳昱璇 Chen, Yu-Hsuan |
Keywords: | 印刷電路板; 稀疏主成分分析; 集成樹學習; Printed circuit boards; Sparse PCA; Ensemble tree learning |
Date: | 2023 |
Issue Date: | 2023-08-16 13:31:10 (UTC+8) |
Abstract: | In the printed circuit board (PCB) supply chain, the repeated testing and debugging needed to assure quality during PCB development consume substantial manpower and time. This study obtained two-stage PCB test data from an industry-leading semiconductor firm (Company A): phase-one hardware circuit design test results and phase-two signal integrity test results. Because the two test outcomes are correlated to a certain extent, this study uses phase-one performance to predict phase-two results, constructing a process that detects errors in advance and, with interpretable models, helps engineers locate PCB design flaws more quickly, thereby saving manpower and time.
Because the variables in the phase-one test are highly collinear, this study compares four dimension reduction methods (PCA, Sparse PCA, Robust PCA, and Robust Sparse PCA) to reduce the correlation among variables. Sparse PCA and Robust Sparse PCA yield principal components with better interpretability, and the results show that Robust Sparse PCA outperforms the other methods: compared with classical PCA, it reduces the prediction error by 38%, and by up to 60% in the best case, while the extracted components remain easy for engineers to interpret. The extracted components are then used as inputs to ensemble tree learning models; this study implements RuleFit and XGBoost. The rule ensemble (RuleFit) identifies representative rules in the data, and XGBoost is used for prediction together with Shapley values that measure each variable's contribution to the predicted value. When engineers find that a predicted result does not meet the specified standard, the rules and Shapley values help them identify the cause of the problem, effectively reducing the time and labor spent on the overall testing and debugging process. |
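The two-stage idea in the abstract (sparse dimension reduction, then an interpretable prediction step) can be sketched as follows. This is a minimal illustration on synthetic data, not the thesis's actual pipeline: the real study applies RuleFit and XGBoost to two-phase PCB test data, whereas here a plain linear model stands in for the predictor, and Shapley values are computed exactly by brute-force coalition enumeration. All data, names, and parameters are illustrative.

```python
# Sketch: Sparse PCA on collinear measurements, a stand-in predictor,
# and exact Shapley attribution of one prediction. Synthetic data only.
from itertools import combinations
from math import factorial

import numpy as np
from sklearn.decomposition import PCA, SparsePCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 3))            # 3 hidden factors
X = latent @ rng.normal(size=(3, 12)) + 0.1 * rng.normal(size=(200, 12))
X = (X - X.mean(0)) / X.std(0)                # 12 standardized, collinear vars
y = latent @ np.array([2.0, -1.0, 0.5]) + 0.05 * rng.normal(size=200)

# Stage 1: Sparse PCA drives many loadings to exactly zero, so each
# component involves only a few raw measurements (easier to interpret).
pca = PCA(n_components=3).fit(X)
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)
dense_zeros = int(np.sum(np.isclose(pca.components_, 0.0)))
sparse_zeros = int(np.sum(np.isclose(spca.components_, 0.0)))

# Stage 2: predict the phase-two score from the sparse components.
Z = spca.transform(X)
model = LinearRegression().fit(Z, y)

# Exact Shapley values for one prediction: features outside a
# coalition S are held at a baseline (here the component means).
def shapley(f, x, baseline):
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                with_i = [x[j] if j in set(S) | {i} else baseline[j] for j in range(n)]
                without = [x[j] if j in S else baseline[j] for j in range(n)]
                phi[i] += w * (f(with_i) - f(without))
    return phi

f = lambda z: float(model.predict(np.asarray(z).reshape(1, -1))[0])
base = Z.mean(0)
phi = shapley(f, Z[0], base)
# Efficiency property: the contributions sum to f(x) - f(baseline).
gap = f(list(Z[0])) - f(list(base))
```

Brute-force enumeration is exponential in the number of features, so it only works for a handful of components; for tree ensembles such as XGBoost, the SHAP library computes these attributions efficiently.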
Reference: | Efron, B., Hastie, T., Johnstone, I., & Tibshirani, R. (2004). Least Angle Regression. The Annals of Statistics, 32(2), 407–451.
Brunton, S., & Kutz, J. (2019). Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control. Cambridge: Cambridge University Press.
Fokkema, M. (2020). Fitting Prediction Rule Ensembles with R Package pre. Journal of Statistical Software, 92(12), 1–30.
Fokkema, M., & Strobl, C. (2020). Fitting prediction rule ensembles to psychological research data: An introduction and tutorial. Psychological Methods, 25, 636-652.
Friedman, J. H., & Popescu, B. E. (2008). Predictive Learning via Rule Ensembles. The Annals of Applied Statistics, 2(3), 916-954.
Guerra-Urzola, R., Van Deun, K., Vera, J. C., & Sijtsma, K. (2021). A Guide for Sparse PCA: Model Comparison and Applications. Psychometrika, 86(4), 893-919.
Friedman, J., Hastie, T., Höfling, H., & Tibshirani, R. (2007). Pathwise coordinate optimization. The Annals of Applied Statistics, 1(2), 302–332.
Senoner, J., Netland, T., & Feuerriegel, S. (2022). Using explainable artificial intelligence to improve process quality: Evidence from semiconductor manufacturing. Management Science, 68(8), 5704-5723.
Shlens, J. (2014). A tutorial on principal component analysis. arXiv preprint arXiv:1404.1100.
Chen, T., & Guestrin, C. (2016). XGBoost: A scalable tree boosting system. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 785–794).
Zou, H., & Hastie, T. (2005). Regularization and Variable Selection via the Elastic Net. Journal of the Royal Statistical Society. Series B (Statistical Methodology), 67(2), 301-320.
Zou, H., Hastie, T., & Tibshirani, R. (2006). Sparse Principal Component Analysis. Journal of Computational and Graphical Statistics, 15(2), 265-286. |
Description: | Master's thesis, Department of Management Information Systems, National Chengchi University, 109356039 |
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0109356039 |
Data Type: | thesis |
Appears in Collections: | [Department of Management Information Systems] Theses
Files in This Item:
File | Description | Size | Format
603901.pdf | | 2045 KB | Adobe PDF
All items in 政大典藏 are protected by copyright, with all rights reserved.