Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/125530
Title: | 應用RNN於股價漲跌預測之研究 Applying Recurrent Neural Networks to Stock Price Prediction |
Authors: | 李俊逸 Lee, Chun-Yi |
Contributors: | 梁定澎 Liang, Ting-Peng; 周彥君 Chou, Yen-Chun; 李俊逸 Lee, Chun-Yi |
Keywords: | Stock Price Prediction; Recurrent Neural Network; Deep Learning; Artificial Intelligence |
Date: | 2019 |
Issue Date: | 2019-09-05 15:44:55 (UTC+8) |
Abstract: | Stocks are one of the most important investment vehicles for individuals today: by buying shares, an investor becomes a shareholder who participates directly in a company's growth and profits from it. Stock prices are shaped by both short-term and long-term business and trading activity, and these patterns are difficult to anticipate because prices also respond to many uncertain political and economic factors, such as corporate performance, government policy, and cross-border breaking news; the behavior of the U.S. market after Trump took office is one example. Moreover, stock price time series are nonlinear and non-stationary, which makes forecasting future prices highly challenging. At the same time, computing power has continued to grow along Moore's Law, and faster hardware has fueled a resurgence of artificial intelligence led by its deep learning branch, with many researchers now applying deep learning to difficult problems.

To address this challenge, this study takes a value-investing perspective and applies one deep learning technique, the recurrent neural network (RNN), to capture past trading patterns in the market and make long-horizon predictions of stock price movements. Because many factors influence stock prices, including indicators already widely used by fund managers, investment professionals, and retail investors, the study uses an RNN (specifically an LSTM model) as its architecture and, screening Taiwan's 50 largest listed companies by trading volume (liquidity) and market capitalization, selects Taiwan Semiconductor Manufacturing Company (2330.TW), Hon Hai / Foxconn (2317.TW), MediaTek (2454.TW), Largan Precision (3008.TW), and the Taiwan 50 ETF (0050.TW) as prediction targets.

Leveraging the RNN's ability to learn features from data, the study compares the effects of different features and settings: (1) the number of neurons, (2) the number of hidden layers, (3) the length of the look-back period used to forecast the next month, (4) different normalization methods, (5) different indicators (financial statement, technical analysis, and fundamental analysis indicators, stock market trading data, and macroeconomic data), together with one-hot encoding of the calendar month to capture seasonal effects, and (6) the profitability of applying the best-performing model in the stock market. |
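As an illustration of the modeling pipeline the abstract describes, the sketch below shows min-max normalization of daily features, one-hot encoding of the calendar month, sliding look-back windows, and a stacked LSTM classifier for up/down prediction. This is a minimal sketch assuming pandas, scikit-learn, and Keras (TensorFlow); the column names, window length, horizon, and hyperparameters are illustrative assumptions, not the settings used in the thesis.

import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

def build_dataset(df, feature_cols, window=60, horizon=20):
    # df: daily rows with a DatetimeIndex and a "Close" column (hypothetical layout).
    # (4) normalization: scale every feature into [0, 1] with min-max scaling
    scaler = MinMaxScaler()
    features = scaler.fit_transform(df[feature_cols])
    # seasonal signal: one-hot encode the calendar month (up to 12 extra columns)
    months = pd.get_dummies(df.index.month).to_numpy()
    features = np.hstack([features, months.astype(float)])
    # label: 1 if the close is higher `horizon` trading days later, else 0
    labels = (df["Close"].shift(-horizon) > df["Close"]).astype(int).to_numpy()
    X, y = [], []
    for t in range(window, len(df) - horizon):
        X.append(features[t - window:t])  # the past `window` days form one input sequence
        y.append(labels[t])
    return np.array(X), np.array(y)

def build_model(window, n_features, units=64, hidden_layers=2):
    # (1) `units` and (2) `hidden_layers` are the kinds of settings the study compares
    model = Sequential()
    model.add(LSTM(units, return_sequences=(hidden_layers > 1),
                   input_shape=(window, n_features)))
    model.add(Dropout(0.2))
    for i in range(1, hidden_layers):
        model.add(LSTM(units, return_sequences=(i < hidden_layers - 1)))
        model.add(Dropout(0.2))
    model.add(Dense(1, activation="sigmoid"))  # probability that the price moves up
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Usage (hypothetical data frame `prices` with OHLCV and indicator columns):
# X, y = build_dataset(prices, feature_cols=["Open", "High", "Low", "Close", "Volume"])
# model = build_model(window=X.shape[1], n_features=X.shape[2])
# model.fit(X, y, epochs=50, batch_size=32, validation_split=0.2)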
Description: | Master's thesis, Department of Management Information Systems, National Chengchi University, 106356024 |
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0106356024 |
Data Type: | thesis |
DOI: | 10.6814/NCCU201900772 |
Appears in Collections: | [Department of Management Information Systems] Theses
Files in This Item:
File | Size | Format
602401.pdf | 2019 KB | Adobe PDF
All items in 政大典藏 are protected by copyright, with all rights reserved.