    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/157808


    Title: 應用主成分分析於 LSTM 孿生神經網路權重壓縮之效能評估
    Evaluating the Performance of PCA-Based Weight Compression for LSTM Siamese Neural Networks
    Authors: 樂沂晨
    Yueh, Yi-Chen
    Contributors: 周珮婷
    Chou, Elizabeth P.
    樂沂晨
    Yueh, Yi-Chen
    Keywords: Binary Classification
    Text Analysis
    Model Compression
    Siamese Neural Network
    Long Short-Term Memory (LSTM)
    Principal Component Analysis (PCA)
    Date: 2025
    Issue Date: 2025-07-01 15:03:17 (UTC+8)
    Abstract: A Siamese neural network is a supervised architecture that computes the similarity of an input pair through twin subnetworks with shared weights, and is widely used in tasks such as semantic similarity judgment, face recognition, and medical image matching. Although deep neural networks achieve strong performance, practical deployment often entails large parameter counts and heavy computational cost; on resource-constrained devices in particular, an unoptimized model easily hits computational bottlenecks. Network performance also depends strongly on the number of neurons in the hidden layers: too many neurons can cause overfitting and waste computational resources, while too few can limit the model's capacity to learn features. This study therefore proposes a method based on Principal Component Analysis (PCA) that assesses the redundancy of neuron outputs to guide the choice of an appropriate neuron count. Experimental results show that the method effectively simplifies the model structure while maintaining performance, providing useful guidance for neuron configuration. We construct an LSTM-based Siamese network that performs binary classification of semantic similarity for text analysis. To balance performance, reduce redundant information, and improve runtime efficiency, PCA is applied to compress each LSTM gate; we examine how different degrees of dimensionality reduction affect accuracy and semantic retention, and evaluate whether the computational burden can be reduced while semantic matching performance is maintained, thereby improving the model's practical feasibility and inference efficiency.
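    The PCA-based gate compression described in the abstract can be sketched as a low-rank approximation of a weight matrix. The sketch below is illustrative only: it uses a randomly generated stand-in matrix `W` (300×128, synthetic rank 20) in place of real trained LSTM gate weights, and a 95% explained-variance threshold chosen for demonstration; the thesis's actual procedure, data, and thresholds may differ.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for one trained LSTM gate weight matrix
    # (rows = input features, columns = hidden units); in practice this
    # would come from the trained Siamese model.
    rank, noise = 20, 0.05
    W = rng.normal(size=(300, rank)) @ rng.normal(size=(rank, 128))
    W += noise * rng.normal(size=W.shape)

    # PCA via SVD on the centered matrix.
    mean = W.mean(axis=0)
    U, s, Vt = np.linalg.svd(W - mean, full_matrices=False)

    # Keep the smallest number of components explaining ~95% of the variance.
    explained = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(explained, 0.95) + 1)

    # Low-rank factorization: W ~ mean + A @ B, storing two thin matrices
    # instead of the full weight matrix.
    A = U[:, :k] * s[:k]   # shape (300, k)
    B = Vt[:k, :]          # shape (k, 128)
    W_approx = mean + A @ B

    original_params = W.size
    compressed_params = A.size + B.size + mean.size
    rel_err = np.linalg.norm(W - W_approx) / np.linalg.norm(W)
    print(k, original_params, compressed_params)
    ```

    The retained component count `k` directly trades reconstruction error against parameter count, which mirrors the study's question of how far each gate can be reduced before semantic matching accuracy degrades.
    
    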
    Reference: Abadi, M., Barham, P., Chen, J., Chen, Z., Davis, A., Dean, J., Devin, M., Ghemawat, S., Irving, G., Isard, M., Kudlur, M., Levenberg, J., Monga, R., Moore, S., Murray, D. G., Steiner, B., Tucker, P., Vasudevan, V., Warden, P., … Zheng, X. (2016). TensorFlow: A system for large-scale machine learning. https://arxiv.org/abs/1605.08695
    Alyozbaky, R., & Alanezi, M. (2023). Detection and analyzing phishing emails using NLP techniques, 1–6. https://doi.org/10.1109/HORA58378.2023.10156738
    Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473. https://arxiv.org/abs/1409.0473
    Bengio, Y., Frasconi, P., & Simard, P. (1993). The problem of learning long-term dependencies in recurrent networks. Proceedings of the IEEE International Conference on Neural Networks, 1183–1188.
    Bromley, J., Guyon, I., LeCun, Y., Säckinger, E., & Shah, R. (1993). Signature verification using a "siamese" time delay neural network. In J. Cowan, G. Tesauro, & J. Alspector (Eds.), Advances in neural information processing systems (Vol. 6). Morgan-Kaufmann. https://proceedings.neurips.cc/paper_files/paper/1993/file/288cc0ff022877bd3df94bc9360b9c5d-Paper.pdf
    Choi, H.-S. (2024). Simple siamese model with long short-term memory for user authentication with field-programmable gate arrays. Electronics, 13(13). https://doi.org/10.3390/electronics13132584
    Costanti, F., Cappelli, I., Fort, A., Ceroni, E. G., & Bianchini, M. (2024). LSTM-based Siamese networks for fault detection in meteorological time series data. 2024 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE), 906–911. https://api.semanticscholar.org/CorpusID:275019516
    Doetsch, P., Kozielski, M., & Ney, H. (2014). Fast and robust training of recurrent neural networks for offline handwriting recognition. 2014 14th International Conference on Frontiers in Handwriting Recognition (ICFHR), 279–284. https://doi.org/10.1109/ICFHR.2014.51
    Du, W., Fang, M., & Shen, M. (2017). Siamese convolutional neural networks for authorship verification. https://api.semanticscholar.org/CorpusID:42153984
    Garg, I., Panda, P., & Roy, K. (2020). A low effort approach to structured CNN design using PCA. IEEE Access, 8, 1347–1360. https://doi.org/10.1109/ACCESS.2019.2961960
    Hinton, G., Deng, L., Yu, D., Dahl, G. E., Mohamed, A.-r., Jaitly, N., Senior, A., Vanhoucke, V., Nguyen, P., Sainath, T. N., et al. (2012). Deep neural networks for acoustic modeling in speech recognition: The shared views of four research groups. IEEE Signal Processing Magazine, 29(6), 82–97.
    Hinton, G., Vinyals, O., & Dean, J. (2015). Distilling the knowledge in a neural network. https://arxiv.org/abs/1503.02531
    Hochreiter, S., & Schmidhuber, J. (1995). Long short-term memory (Tech. Rep. No. FKI-207-95). Fakultät für Informatik, Technische Universität München, Munich, Germany.
    Hossain, E., Sharif, O., Hoque, M., & Sarker, I. (2020). SentiLSTM: A deep learning approach for sentiment analysis of restaurant reviews. Proceedings of the 20th International Conference on Hybrid Intelligent Systems (HIS).
    Koch, G., Zemel, R., & Salakhutdinov, R. (2015). Siamese neural networks for one-shot image recognition. ICML Deep Learning Workshop.
    Mueller, J., & Thyagarajan, A. (2016). Siamese recurrent architectures for learning sentence similarity. Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence, 2786–2792.
    Pascanu, R., Mikolov, T., & Bengio, Y. (2013). On the difficulty of training recurrent neural networks. https://arxiv.org/abs/1211.5063
    Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Müller, A., Nothman, J., Louppe, G., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., & Duchesnay, É. (2018). Scikit-learn: Machine learning in Python. https://arxiv.org/abs/1201.0490
    Pennington, J., Socher, R., & Manning, C. (2014). GloVe: Global vectors for word representation. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 1532–1543. https://doi.org/10.3115/v1/D14-1162
    Pontes, E. L., Huet, S., Linhares, A. C., & Torres-Moreno, J.-M. (2018). Predicting the semantic textual similarity with Siamese CNN and LSTM. https://arxiv.org/abs/1810.10641
    Qi, H., Cao, J., Chen, S., & Zhou, J. (2023). Compressing recurrent neural network models through principal component analysis. Statistics and Its Interface, 16, 397–407. https://doi.org/10.4310/22-SII727
    Shih, C.-H., Yan, B.-C., Liu, S.-H., & Chen, B. (2017). Investigating Siamese LSTM networks for text categorization. 2017 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), 641–646. https://doi.org/10.1109/APSIPA.2017.8282104
    Srivastava, N. (2013). Improving neural networks with dropout [Master’s thesis]. University of Toronto. https://www.cs.toronto.edu/~hinton/absps/Srivastava-thesis.pdf
    Description: Master's thesis
    National Chengchi University
    Department of Statistics
    112354003
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0112354003
    Data Type: thesis
    Appears in Collections: [Department of Statistics] Theses

    Files in This Item:

    File: 400301.pdf (2691 KB, Adobe PDF)


