National Chengchi University Institutional Repository (NCCUR): Item 140.119/37119
Please use this permanent URL to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/37119


Title: 以視線軌跡為基礎的人機介面
Gaze-based human-computer interaction
Author: 余立強 (Yu, Li Chiang)
Contributors: 廖文宏 (Liao, Wen Hung)
余立強 (Yu, Li Chiang)
Keywords: 眼球追蹤 (Eye Tracking)
人機介面 (HCI)
瞳孔偵測 (Pupil Detection)
Date: 2008
Upload time: 2009-09-19 12:11:37 (UTC+8)
Abstract: Eye trackers are currently used mainly to analyze users' viewing behavior in order to improve interface design, or to help users who are physically disabled but can still move their eyes communicate with the outside world. As the related technologies develop, eye trackers may become an input-device option for users, much like the mouse and keyboard. The purpose of this thesis is to design and implement low-cost wearable and remote eye trackers and to apply them to gaze-based human-computer interfaces, in the hope of enriching the ways people interact with computers. Because eye trackers are affected by noise, corneal reflection points, and similar factors, this study exploits the property that the proportion of dark pixels is higher in the region around the pupil to increase localization accuracy and thereby improve the tracker's precision. In addition, head movement causes errors when the eye tracker computes the projected gaze position, and this study also proposes a solution to that problem. Using the eye trackers built in this work, the thesis implements several gaze-based human-computer interfaces, including a gaze-controlled web browser, enhancement of the photo regions the eyes fixate on, a tic-tac-toe game, and interactive media, and uses the eye tracker to record users' viewing behavior on a mobile phone interface.
An eye tracker, a device for measuring eye position and movements, has traditionally been used for research on the human visual system, psychology, and interface design. It has also served as an input device for people with disabilities. With recent progress in hardware and imaging technology, it has the potential to complement or even replace popular devices such as the mouse or keyboard as a means for average users to communicate with the computer. The objective of this research is to design and implement low-cost head-mounted and remote eye trackers and subsequently develop applications that take advantage of gaze-based interaction. Specifically, we improve the precision of the tracking result by designing a new pupil detection algorithm as well as compensating for head movement. We then present several gaze-based user interfaces, including an eye-controlled web browser, an attention-based photo browser, an interactive game (tic-tac-toe), and media design. We also investigate the feasibility of utilizing the eye trackers to analyze and evaluate the design of mobile user interfaces.
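
The detection idea stated in the abstract, namely that the proportion of dark pixels peaks in and around the pupil and therefore gives a localization cue that is robust to noise and corneal reflection points, can be illustrated with a short sketch. The Python/OpenCV code below is a minimal illustration of that general idea, not the algorithm actually used in the thesis; the intensity threshold, window size, and two-stage search are assumptions chosen for the example.

    import cv2
    import numpy as np

    def estimate_pupil_center(eye_gray, dark_thresh=40, win=15):
        # Illustrative sketch only; parameters are assumptions, not the thesis's values.
        # Stage 1: coarse localization. Threshold dark pixels and take the
        # largest dark connected component as the pupil candidate.
        _, dark = cv2.threshold(eye_gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
        dark = cv2.morphologyEx(dark, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(dark)
        if n < 2:
            return None                          # no dark region found
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
        cx, cy = (int(round(v)) for v in centroids[largest])

        # Stage 2: refinement. Search a neighbourhood of the coarse centre and
        # keep the position whose surrounding window has the highest ratio of
        # dark pixels; this ratio peaks at the pupil and is less disturbed by
        # corneal glints (bright spots) and isolated noise.
        h, w = dark.shape
        best, best_ratio = (cx, cy), -1.0
        for dy in range(-win, win + 1, 2):
            for dx in range(-win, win + 1, 2):
                x, y = cx + dx, cy + dy
                x0, x1 = max(0, x - win), min(w, x + win + 1)
                y0, y1 = max(0, y - win), min(h, y + win + 1)
                patch = dark[y0:y1, x0:x1]
                if patch.size == 0:
                    continue
                ratio = np.count_nonzero(patch) / patch.size
                if ratio > best_ratio:
                    best, best_ratio = (x, y), ratio
        return best

The abstract also notes that head movement introduces error when the eye tracker computes the projected gaze position. A common baseline in gaze estimation, and not necessarily the compensation method proposed in the thesis, is to map pupil coordinates to screen coordinates with a second-order polynomial fitted by least squares from a handful of calibration fixations; compensating for head movement then amounts to updating or re-fitting this mapping. A minimal sketch of such a mapping:

    def fit_gaze_mapping(pupil_pts, screen_pts):
        # Least-squares fit of a second-order polynomial mapping from pupil
        # coordinates to screen coordinates, using N >= 6 calibration samples.
        P = np.asarray(pupil_pts, dtype=float)
        S = np.asarray(screen_pts, dtype=float)
        x, y = P[:, 0], P[:, 1]
        A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
        coeff, *_ = np.linalg.lstsq(A, S, rcond=None)    # coeff has shape (6, 2)

        def to_screen(px, py):
            v = np.array([1.0, px, py, px * py, px ** 2, py ** 2])
            return tuple(v @ coeff)                      # (screen_x, screen_y)

        return to_screen

In a typical calibration, fit_gaze_mapping would be fed the pupil/screen point pairs collected while the user fixates a 3x3 grid of targets, and estimate_pupil_center would be run on each cropped infrared eye frame before the mapping is applied.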
References: [1] Emiliano Castellina, Faisal Razzak, Fulvio Corno, "Environmental Control Application Compliant with COGAIN Guidelines," The 5th Conference on Communication by Gaze Interaction (COGAIN 2009), 2009.
[2] Jacob O. Wobbrock, James Rubinstein, Michael Sawyer, Andrew T. Duchowski, "Not Typing but Writing: Eye-based Text Entry Using Letter-like Gestures," The 3rd Conference on Communication by Gaze Interaction (COGAIN 2007), 2007.
[3] Howell Istance, Aulikki Hyrskykari, Stephen Vickers, Nazmie Ali, "User Performance of Gaze-Based Interaction with On-line Virtual Communities," The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[4] Stephen Vickers, Howell Istance, Aulikki Hyrskykari, "Selecting Commands in 3D Game Environments by Gaze Gestures," The 5th Conference on Communication by Gaze Interaction (COGAIN 2009), 2009.
[5] 王凱平, 移動式眼動儀之實作與視線軌跡分析 (Implementation of a Portable Eye Tracker and Gaze Trajectory Analysis), Master's thesis, Department of Computer Science, National Chengchi University, 2008.
[6] Detlev Droege, Carola Schmidt, Dietrich Paulus, "A Comparison of Pupil Centre Estimation," The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[7] Flavio Luiz Coutinho, Carlos Hitoshi Morimoto, "Free Head Motion Eye Gaze Tracking Using a Single Camera and Multiple Light Sources," XIX Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI '06), pp. 171-178, 2006.
[8] Craig Hennessey, Borna Noureddin, Peter Lawrence, "A Single Camera Eye-Gaze Tracking System with Free Head Motion," Proceedings of the 2006 Symposium on Eye Tracking Research & Applications (ETRA '06), pp. 87-94, 2006.
[9] Dongheng Li, David Winfield, Derrick J. Parkhurst, "Starburst: A Hybrid Algorithm for Video-Based Eye Tracking Combining Feature-Based and Model-Based Approaches," 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05) Workshops, p. 79, 2005.
[10] Gintautas Daunys, Nerijus Ramanauskas, "The Accuracy of Eye Tracking Using Image Processing," Proceedings of the Third Nordic Conference on Human-Computer Interaction, pp. 377-380, 2004.
[11] Dongshi Xia, Zongcai Ruan, "IR Image Based Eye Gaze Estimation," Eighth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD 2007), vol. 1, pp. 220-224, 2007.
[12] Sepehr Attarchi, Karim Faez, Amin Asghari, "A Fast and Accurate Iris Recognition Method Using the Complex Inversion Map and 2DPCA," Seventh IEEE/ACIS International Conference on Computer and Information Science (ICIS 2008), pp. 179-184, 2008.
[13] Craig Hennessey, "Eye-Gaze Tracking with Free Head Motion," Master of Applied Science thesis, University of British Columbia, 2005.
[14] Somnath Dey, Debasis Samanta, "An Efficient Approach for Pupil Detection in Iris Images," 15th International Conference on Advanced Computing and Communications (ADCOM 2007), pp. 382-389, 2007.
[15] Christophe Cudel, Sacha Bernet, Michel Basset, "Fast and Easy Calibration for a Head-Mounted Eye Tracker," The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[16] Edward Cutrell, Zhiwei Guan, "An Eye Tracking Study of the Effect of Target Rank on Web Search," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2007.
[17] Petr Novák, Tomáš Krajník, Libor Přeučil, Marcela Fejtová, Olga Štěpánková, "AI Support for a Gaze Controlled Wheelchair," The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[18] Matt Feusner, Brian Lukoff, "Testing for Statistically Significant Differences between Groups of Scan Patterns," Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA '08), pp. 43-46, 2008.
[19] Nguyen Van Huan, Hakil Kim, "A Novel Circle Detection Method for Iris Segmentation," 2008 Congress on Image and Signal Processing (CISP 2008), vol. 3, pp. 620-624, 2008.
[20] Craig Hennessey, Borna Noureddin, Peter Lawrence, "A Single Camera Eye-Gaze Tracking System with Free Head Motion," Proceedings of the 2006 Symposium on Eye Tracking Research & Applications (ETRA '06), pp. 87-94, 2006.
[21] Zhiwei Zhu, Qiang Ji, "Eye Gaze Tracking Under Natural Head Movements," Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), 2005.
[22] Teodora Vatahska, Maren Bennewitz, Sven Behnke, "Feature-Based Head Pose Estimation from Images," Proceedings of the IEEE-RAS 7th International Conference on Humanoid Robots (Humanoids), 2007.
[23] Henrik H. T. Skovsgaard, John Paulin Hansen, Julio C. Mateo, "How Can Tiny Buttons Be Hit Using Gaze Only?" The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[24] Laura Cowen, "An Eye Movement Analysis of Web-Page Usability," unpublished Master's thesis, Lancaster University, UK, 2001.
[25] Robert W. Reeder, Peter Pirolli, Stuart K. Card, "WebEyeMapper and WebLogger: Tools for Analyzing Eye Tracking Data Collected in Web-Use Studies," CHI '01 Extended Abstracts on Human Factors in Computing Systems, pp. 19-20, 2001.
Description: Master's degree
National Chengchi University
Department of Computer Science (資訊科學學系)
96753025
97
Source: http://thesis.lib.nccu.edu.tw/record/#G0096753025
Data type: thesis
Appears in Collections: [Department of Computer Science] Theses

Files in This Item:

File          Size     Format
302501.pdf    276Kb    Adobe PDF
302502.pdf    117Kb    Adobe PDF
302503.pdf    103Kb    Adobe PDF
302504.pdf    185Kb    Adobe PDF
302505.pdf    156Kb    Adobe PDF
302506.pdf    327Kb    Adobe PDF
302507.pdf    691Kb    Adobe PDF
302508.pdf    408Kb    Adobe PDF
302509.pdf    1116Kb   Adobe PDF
302510.pdf    139Kb    Adobe PDF
302511.pdf    224Kb    Adobe PDF


All items in the NCCU Institutional Repository are protected by copyright.

