Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/112266
Title: | 基於眼動軌跡之閱讀模式分析 (Classification of reading patterns based on gaze information) |
Authors: | 張晉文 Chang, Chin Wen |
Contributors: | 廖文宏 Liao, Wen Hung; 張晉文 Chang, Chin Wen |
Keywords: | Eye movement data; Reading pattern; Eye tracker; Cross-validation |
Date: | 2017 |
Issue Date: | 2017-08-28 12:05:34 (UTC+8) |
Abstract: | Reading is one of the principal ways to acquire knowledge, and reading efficiency varies with the reading pattern employed. The objective of this research is to classify reading patterns from fixation data using machine learning techniques. In our experiment, a low-cost eye tracker records eye movements during the reading process, and a dispersion-based algorithm identifies fixations in the recorded data. Features pertaining to fixations, including duration, path length, landing position, and direction, are extracted for classification.
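The dispersion-based fixation identification mentioned in the abstract is commonly realized as a dispersion-threshold (I-DT) detector. The sketch below illustrates that general technique; the thresholds, the `(t_ms, x, y)` sample format, and the function names are illustrative assumptions, not the thesis's actual settings.

```python
def dispersion(window):
    """Spread of a window of gaze samples: (max x - min x) + (max y - min y)."""
    xs = [p[1] for p in window]
    ys = [p[2] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(points, max_dispersion=30.0, min_duration=100.0):
    """points: list of (t_ms, x, y) gaze samples sorted by time.
    Returns fixations as (start_ms, end_ms, centroid_x, centroid_y)."""
    fixations = []
    i, n = 0, len(points)
    while i < n:
        # Grow an initial window spanning at least min_duration.
        j = i
        while j < n and points[j][0] - points[i][0] < min_duration:
            j += 1
        if j >= n:
            break
        if dispersion(points[i:j + 1]) <= max_dispersion:
            # Extend the window while the samples stay tightly clustered.
            while j + 1 < n and dispersion(points[i:j + 2]) <= max_dispersion:
                j += 1
            window = points[i:j + 1]
            xs = [p[1] for p in window]
            ys = [p[2] for p in window]
            fixations.append((window[0][0], window[-1][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            # Window too dispersed: slide past the first sample.
            i += 1
    return fixations
```

Samples that stay within the dispersion threshold long enough are merged into one fixation at their centroid; everything else is treated as saccadic movement.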
Five categories of reading pattern are defined and investigated in this study: speed reading, slow reading, in-depth reading, skim-and-skip, and keyword spotting. Thirty subjects were recruited; each was instructed to read different articles in a specific style designated by the experimenter, so that the collected eye movement data could be labeled. Feature selection is performed by analyzing cross-validation results on the training data from all subjects. Using the eye movements of six randomly selected subjects as test data in each fold, the average classification accuracies in five-fold cross-validation are 78.24%, 74.19%, 93.75%, 87.96%, and 96.20%. |
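The abstract names four fixation feature families: duration, path length, landing position, and direction. A minimal sketch of extracting them from a detected fixation sequence is shown below; the `(start_ms, end_ms, x, y)` tuple format, the function name, and the dictionary keys are assumptions for illustration, not the thesis's actual feature encoding.

```python
import math

def fixation_features(fixations):
    """fixations: list of (start_ms, end_ms, x, y) in temporal order.
    Returns one feature dict per fixation: duration and landing position
    for every fixation, plus inter-fixation path length and direction
    (degrees, atan2 convention) from the second fixation onward."""
    feats = []
    for k, (t0, t1, x, y) in enumerate(fixations):
        f = {"duration": t1 - t0, "landing": (x, y)}
        if k > 0:
            _, _, px, py = fixations[k - 1]
            dx, dy = x - px, y - py
            f["path_length"] = math.hypot(dx, dy)       # saccade amplitude
            f["direction"] = math.degrees(math.atan2(dy, dx))
        feats.append(f)
    return feats
```

Per-fixation dicts like these would then be aggregated into fixed-length vectors before being fed to a classifier and scored with cross-validation, as the abstract describes.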
Description: | Master's thesis, National Chengchi University, In-service Master's Program in Computer Science, 102971021 |
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0102971021 |
Data Type: | thesis |
Appears in Collections: | [In-service Master's Program in Computer Science] Theses
Files in This Item:
File: 102101.pdf | Size: 5261 KB | Format: Adobe PDF
All items in 政大典藏 (the NCCU Institutional Repository) are protected by copyright, with all rights reserved.