政大機構典藏 National Chengchi University Institutional Repository (NCCUR): Item 140.119/112266
RC Version 6.0 © Powered By DSPACE, MIT. Enhanced by NTU Library IR team.


    Please use this persistent URL to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/112266


    Title: 基於眼動軌跡之閱讀模式分析
    Classification of reading patterns based on gaze information
    Authors: Chang, Chin Wen (張晉文)
    Contributors: Liao, Wen Hung (廖文宏)
    Chang, Chin Wen (張晉文)
    Keywords: Eye movement
    Reading pattern
    Eye tracker
    Cross-validation
    Date: 2017
    Upload time: 2017-08-28 12:05:34 (UTC+8)
    Abstract: Reading is one of the primary ways to acquire knowledge, and reading efficiency differs when different reading patterns are involved. The objective of this research is to classify reading patterns from fixation data using machine learning techniques. In our experiment, a low-cost eye tracker is employed to record eye movements during the reading process. A dispersion-based algorithm is implemented to identify fixations in the recorded data, and features pertaining to fixation, including duration, path length, landing position, and fixation direction, are extracted for classification purposes.

    Five categories of reading pattern are defined and investigated in this study, namely speed reading, slow reading, in-depth reading, skim-and-skip, and keyword spotting. Thirty subjects were recruited to participate in the experiment. The participants were instructed to read different articles in specific styles designated by the experimenter, so that labels could be assigned to the collected data. Feature selection is achieved by analyzing cross-validation results on the training data pooled from all subjects. Using the eye movements of six randomly selected subjects as test data in each of five rounds of validation, the average classification accuracies are 78.24%, 74.19%, 93.75%, 87.96%, and 96.20%, respectively.
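    The first stage of the pipeline described above can be sketched as follows. This is a minimal illustration of a dispersion-based (I-DT style) fixation detector, not the thesis's actual implementation; the threshold values, the gaze-sample format, and the function names are assumptions for the sake of the example.

```python
def dispersion(window):
    """Spread of a gaze window: (max x - min x) + (max y - min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))


def detect_fixations(samples, max_dispersion=1.0, min_samples=3):
    """Group raw gaze samples [(x, y), ...] into fixations.

    Returns a list of (start, end, cx, cy): inclusive sample indices
    and the centroid of each detected fixation.
    """
    fixations = []
    i, n = 0, len(samples)
    while i + min_samples <= n:
        j = i + min_samples
        if dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the points stay tightly clustered.
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            fixations.append((i, j - 1, sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j  # continue after this fixation
        else:
            i += 1  # saccade sample; slide the window forward
    return fixations
```

    From consecutive fixations one can then derive the features named in the abstract: duration (window length times the sampling interval), path length (distance between successive centroids), landing position, and fixation direction.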
    參考文獻: [1] 王凱平. "移動式眼動儀之實作與視線軌跡分析." 政治大學資訊科學學系學位論文 (2008): 1-97.
    [2] 劉洪瑞, 邱文信, and 劉貞勇. 眼動儀在運動研究之應用.屏東教大體育15 (2012)
    [3] Rayner, Keith, et al. "Eye movements as reflections of comprehension processes in reading." Scientific studies of reading 10.3 (2006): 241-255
    [4] 黃孟隆, 唐大崙, 李執中, 林故廷. 眼動儀於瞳孔測謊之初探.犯罪偵查與鑑識科學研討會報告論文,2004.
    [5] 施懿芳. "行動廣告版面設計對眼球運動與美感情緒影響之研究."交通大學傳播研究所學位論文 (2013): 1-122.
    [6] Morimoto, Carlos Hitoshi, et al. "Pupil detection and tracking using multiple light sources." Image and vision computing 18.4 (2000): 331-335.
    [7] Tracking, Tobii Eye. "An Introduction to eye tracking and Tobii eye-trackers, White Paper." (2010).
    [8] Martínez, José A., et al. "Multimodal system based on electrooculography and voice recognition to control a robot arm." International Journal of Advanced Robotic Systems 10.7 (2013): 283.
    [9] Kim, Myoung Ro, and Gilwon Yoon. "Control signal from EOG analysis and its application." World Academy of Science, Engineering and Technology, International Journal of Electrical, Electronic Science and Engineering 7.10 (2013): 830-834.
    [10] Scleral Search Coils from:
    http://www.audiologyonline.com/articles/ics-impulse-revolutionizing-vestibular-assessment-12003
    [11] Popelka, Stanislav, et al. "Advanced Map Optimalization Based on Eye-Tracking." (2012).
    [12] Hansen, Dan Witzner, and Qiang Ji. "In the eye of the beholder: A survey of models for eyes and gaze." IEEE transactions on pattern analysis and machine intelligence 32.3 (2010): 478-500.
    [13] Methods of measuring eye movements, Electrooculography from:
    https://www.liverpool.ac.uk/~pcknox/teaching/Eymovs/emeth.htm
    [14] Eye monitors, Infra-Red Oculography from:
    http://www.cabiatl.com/mricro/eyemon/index.html
    [15] Rayner, Keith. "Eye movements in reading and information processing: 20 years of research." Psychological bulletin 124.3 (1998): 372.
    [16] Fischler, Martin A., and Robert C. Bolles. "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography." Communications of the ACM 24.6 (1981): 381-395.
    [17] Tobii/developer zone, “Sentry Versus the EyeX” from:
    http://developer.tobii.com/community/forums/topic/sentry-versus-the-eyex/
    [18] Tobii/developer zone, “Fixing Sampling/refresh Rate” from:
    http://developer.tobii.com/community/forums/topic/fixing-samplingrefresh-rate
    [19] The Eye Tribe, “Developers Guide” from:
    https://s3.eu-central-1.amazonaws.com/theeyetribe.com/theeyetribe.com/dev/dev/index.html
    [20] Tobii Tech, “Developer`s Guide:, Tobii EyeX SDK for .NET. p5, September. 2015
    [21] Microsoft, “Visual Studio Community” from:
    https://www.visualstudio.com/zh-hant/vs/community/?rr=https%3A%2F%2Fwww.google.com.tw%2F
    [22] The Eye Tribe, “Getting Started” from:
    https://s3.eu-central-1.amazonaws.com/theeyetribe.com/theeyetribe.com/dev/start/index.html#setup
    [23] Martinez-Conde, Susana, Stephen L. Macknik, and David H. Hubel. "The role of fixational eye movements in visual perception." Nature Reviews Neuroscience 5.3 (2004): 229-240.
    [24] 韓承靜, and 蔡介立. "眼球軌跡記錄—科學學習研究的明日之星."科學教育310008): 2-11.
    [25] Aga Bojko, EYE TRACKING THE USER EXPERIENCE A Practical Guide To Research, Rosenfeld Media, 2013.
    [26] Smallpdf from: https://smallpdf.com/zh-TW/pdf-to-jpg
    [27] Chang, Chih-Chung, and Chih-Jen Lin. "LIBSVM: a library for support vector machines." ACM Transactions on Intelligent Systems and Technology (TIST) 2.3 (2011): 27. Software available at https://www.csie.ntu.edu.tw/~cjlin/libsvm/
    Description: Master's thesis
    National Chengchi University
    In-service Master's Program, Department of Computer Science
    102971021
    Source: http://thesis.lib.nccu.edu.tw/record/#G0102971021
    Data type: thesis
    Appears in collections: [In-service Master's Program, Department of Computer Science] Theses

    Files in this item:

    File          Size      Format       Views
    102101.pdf    5261Kb    Adobe PDF    21090    View/Open


    All items in NCCUR are protected by the original copyright.



    Copyright Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access for academic research and public education on a non-commercial basis. Please use it in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

    2. NCCU Institutional Repository is maintained to protect the interests of copyright owners. If you believe that any material on the website infringes copyright, please contact our staff (nccur@nccu.edu.tw); the work will be removed from the repository while we investigate your claim.