Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/152819
Title: Time series generative modeling applied to stock trend prediction
Authors: Lin, Yi-Yu
Contributors: Tsai, Yen-Lung; Lin, Yi-Yu
Keywords: Deep Learning; Data Augmentation; TSGM; Time Series; CNN; LSTM; Siamese Networks; Contrastive Learning; Stock Trend Prediction
Date: 2024
Issue Date: 2024-08-05 14:11:59 (UTC+8)
Abstract: Data augmentation is a key technique in deep learning. Deep learning relies on large and diverse datasets to train models, and applying small artificial perturbations to the original data to enlarge a dataset is a common approach. Data augmentation also applies to time series, although augmentation techniques designed specifically for time series remain uncommon. In this thesis, we treat stock information as time series data, apply time series generative modeling (TSGM) for data augmentation, and compare it with several common time series augmentation methods. For prediction we use a long short-term memory network (LSTM), a convolutional neural network (CNN), and Siamese contrastive learning. Comparing their results, we find that contrastive learning is relatively effective for stock trend prediction, and that every model's prediction performance improves after data augmentation.
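Note: to make the pipeline described in the abstract concrete, below is a minimal illustrative sketch, not code from the thesis itself. It augments windows of a synthetic price series by jittering (one of the classical time-series augmentation baselines that TSGM is typically compared against) and trains a small LSTM to classify the next-step trend. The window length, noise level, network size, and all names here are assumptions made for illustration.

```python
# Hedged sketch of an augment-then-predict pipeline (not the thesis's code).
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)

# Synthetic stand-in for a daily closing-price series (illustrative only).
prices = np.cumsum(rng.normal(0.0, 1.0, 1000)) + 100.0

def make_windows(series, length=30):
    """Slice a 1-D series into (normalized window, next-step up/down label) pairs."""
    X, y = [], []
    for i in range(len(series) - length):
        w = series[i:i + length]
        X.append((w - w.mean()) / (w.std() + 1e-8))                   # per-window z-score
        y.append(float(series[i + length] > series[i + length - 1]))  # 1 = up, 0 = down
    return np.asarray(X)[..., None], np.asarray(y)                    # (N, 30, 1), (N,)

def jitter(X, sigma=0.03):
    """Classical augmentation baseline: add small Gaussian noise to each window."""
    return X + rng.normal(0.0, sigma, X.shape)

X, y = make_windows(prices)
X_aug = np.concatenate([X, jitter(X)])   # doubled training set
y_aug = np.concatenate([y, y])           # jitter does not change the trend label

model = tf.keras.Sequential([
    tf.keras.Input(shape=(30, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_aug, y_aug, epochs=5, batch_size=64, verbose=0)
print("train accuracy:", model.evaluate(X, y, verbose=0)[1])
```

In the generative setting the abstract describes, the jitter step would instead draw synthetic windows sampled from a trained TSGM model, and the contrastive variant would train a Siamese network on augmented pairs rather than a direct classifier; either substitution leaves the rest of this sketch unchanged.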
Description: Master's thesis, Department of Applied Mathematics, National Chengchi University (111751004)
Source URI: http://thesis.lib.nccu.edu.tw/record/#G0111751004
Data Type: thesis
Appears in Collections: [Department of Applied Mathematics] Theses
Files in This Item: 100401.pdf (2317 KB, Adobe PDF)
All items in NCCUR (政大典藏) are protected by copyright, with all rights reserved.