Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/141180
Title: Applying Deep Learning to Predict the Trend of Stock in Taiwan (應用深度學習於股票走勢分析-以台灣市場為例)
Authors: Zhou, Jia-Min (周家民)
Contributors: Tsai, Yen-Lung (蔡炎龍); Zhou, Jia-Min (周家民)
Keywords: Deep Learning; Neural Network (NN); Convolutional Neural Network (CNN); Long Short-Term Memory (LSTM); Stock Trend Forecast; Market Simulation
Date: 2022
Issue Date: 2022-08-01 18:12:42 (UTC+8)
Abstract: In this paper, we combine existing NN, CNN, and LSTM models into a more complex merged model and introduce a new preprocessing method for technical indicators, transforming them into new indicators through pre-set thresholds or other conditions. In addition, newer techniques such as LeakyReLU and Nadam are used to make the model easier to train. Given the same inputs, the merged model substantially outperforms the other models and far exceeds the simplest prediction baseline. Adding the preprocessed indicators further improves the accuracy of the original merged model and the LSTM model by 4.13% and 8.54%, respectively. Beyond pure prediction, we also propose a simple trading strategy that applies the model's predictions with a pre-set threshold to achieve better results. After deducting brokerage fees and the transaction tax, the maximum return is about 7%.
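The abstract describes the indicator preprocessing only at a high level: raw technical indicators are turned into new indicators through pre-set thresholds or other conditions. A minimal Python sketch of what such a transformation could look like is given below; the choice of RSI, the 30/70 thresholds, and the sign convention are illustrative assumptions, not details taken from the thesis.

```python
import pandas as pd

def discretize_rsi(rsi: pd.Series, low: float = 30.0, high: float = 70.0) -> pd.Series:
    """Map a raw RSI series onto {-1, 0, 1} using pre-set thresholds.

    -1 marks overbought readings (RSI >= high), 1 marks oversold readings
    (RSI <= low), and 0 is neutral. Thresholds and sign convention are
    assumptions for illustration only.
    """
    signal = pd.Series(0, index=rsi.index, dtype=int)
    signal[rsi >= high] = -1  # overbought -> expect a pullback
    signal[rsi <= low] = 1    # oversold  -> expect a rebound
    return signal
```

The merged model combines NN, CNN, and LSTM components and is trained with LeakyReLU activations and the Nadam optimizer. The Keras sketch below shows one plausible three-branch layout under those constraints; the window length, layer sizes, branch wiring, and the binary up/down output are assumptions, not the architecture reported in the thesis.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_merged_model(window: int = 20, n_features: int = 10) -> Model:
    # One sequence input (a window of prices/indicators) feeds all three branches.
    seq_in = layers.Input(shape=(window, n_features), name="sequence")

    # CNN branch: 1-D convolution over the time window.
    c = layers.Conv1D(32, kernel_size=3, padding="same")(seq_in)
    c = layers.LeakyReLU(0.1)(c)
    c = layers.GlobalMaxPooling1D()(c)

    # LSTM branch: recurrent summary of the same window.
    r = layers.LSTM(32)(seq_in)

    # Plain NN branch: dense layer on the flattened window.
    d = layers.Flatten()(seq_in)
    d = layers.Dense(64)(d)
    d = layers.LeakyReLU(0.1)(d)

    # Merge the branches and predict the trend (up vs. down).
    m = layers.Concatenate()([c, r, d])
    m = layers.Dense(32)(m)
    m = layers.LeakyReLU(0.1)(m)
    out = layers.Dense(1, activation="sigmoid", name="trend")(m)

    model = Model(seq_in, out)
    model.compile(optimizer=tf.keras.optimizers.Nadam(),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

For the trading simulation, the abstract only states that predictions are acted on when they clear a pre-set threshold and that the quoted 7% return is net of fees and transaction tax. The sketch below assumes the standard Taiwan retail costs (0.1425% brokerage fee per side, 0.3% securities transaction tax on sales) and a long-only, hold-for-one-day rule; none of these specifics are confirmed by the abstract.

```python
import numpy as np

FEE = 0.001425  # brokerage fee per side (assumed standard Taiwan retail rate)
TAX = 0.003     # securities transaction tax, charged on the sell side (assumed)

def simulate(prices: np.ndarray, prob_up: np.ndarray, threshold: float = 0.6) -> float:
    """Cumulative net return of a long-only threshold strategy.

    Buy at today's close when the predicted probability of an up move exceeds
    `threshold`, sell at the next close; the entry/exit rules are illustrative.
    """
    capital = 1.0
    for t in range(len(prices) - 1):
        if prob_up[t] > threshold:
            buy = prices[t] * (1 + FEE)             # pay the fee on entry
            sell = prices[t + 1] * (1 - FEE - TAX)  # pay fee and tax on exit
            capital *= sell / buy
    return capital - 1.0
```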
Description: Master's thesis, National Chengchi University, Department of Applied Mathematics, 108751018
Source URI: http://thesis.lib.nccu.edu.tw/record/#G0108751018
Data Type: thesis
DOI: 10.6814/NCCU202200774
Appears in Collections: [Department of Applied Mathematics] Theses
Files in This Item:
101801.pdf (1042 KB, Adobe PDF)
All items in 政大典藏 are protected by copyright, with all rights reserved.