Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/37119
Title: | Gaze-Based Human-Computer Interaction (以視線軌跡為基礎的人機介面) |
Authors: | Yu, Li Chiang (余立強) |
Contributors: | Liao, Wen Hung (廖文宏); Yu, Li Chiang (余立強) |
Keywords: | Eye Tracking; Human-Computer Interface (HCI); Pupil Detection |
Date: | 2008 |
Issue Date: | 2009-09-19 12:11:37 (UTC+8) |
Abstract: | An eye tracker is a device for measuring eye position and movement. It has traditionally been used in research on the human visual system, psychology, and interface design, and has also served as an input device for people with disabilities whose eyes can still move. With recent progress in hardware and imaging technology, it has the potential to complement, or even replace, popular devices such as the mouse and keyboard as a way for average users to communicate with the computer. The objective of this research is to design and implement low-cost head-mounted and remote eye trackers, and subsequently to develop applications that take advantage of gaze-based interaction. Specifically, we improve tracking precision in two ways: a new pupil detection algorithm that exploits the high proportion of dark pixels surrounding the pupil, making localization more robust to noise and corneal reflections, and a compensation scheme for head movement, which otherwise introduces errors in the computed gaze position. Using these trackers, we present several gaze-based user interfaces, including an eye-controlled web browser, an attention-based photo browser that enhances the regions a user fixates on, an interactive game (tic-tac-toe), and interactive media. We also investigate the feasibility of using the eye trackers to analyze and evaluate mobile user interface design by recording how users view a mobile phone interface. |
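The dark-pixel-ratio idea mentioned in the abstract can be illustrated with a minimal sketch: score each candidate pupil center by the fraction of dark pixels in a small surrounding window, so that isolated bright spots (such as corneal reflections inside the pupil) do not pull the estimate away. The function name, threshold, and window radius below are illustrative assumptions, not the thesis's actual implementation.

```python
def pupil_center_by_dark_ratio(gray, threshold=60, radius=3):
    """Return the (row, col) whose surrounding (2*radius+1)^2 window
    contains the highest fraction of dark pixels (intensity < threshold),
    together with that fraction.

    `gray` is a 2-D list of 8-bit grayscale intensities.  Scoring a whole
    window instead of a single darkest pixel makes the estimate tolerant
    of small bright spots inside the pupil region.
    """
    h, w = len(gray), len(gray[0])
    side = 2 * radius + 1
    best_ratio, best_rc = -1.0, None
    for r in range(radius, h - radius):
        for c in range(radius, w - radius):
            # Count dark pixels in the window centered at (r, c).
            dark = sum(
                1
                for dr in range(-radius, radius + 1)
                for dc in range(-radius, radius + 1)
                if gray[r + dr][c + dc] < threshold
            )
            ratio = dark / (side * side)
            if ratio > best_ratio:
                best_ratio, best_rc = ratio, (r, c)
    return best_rc, best_ratio
```

On a real eye image one would run such a criterion on a thresholded region of interest and refine the result (e.g. with ellipse fitting); the brute-force scan here only demonstrates the ratio idea.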
Reference: |
[1] Emiliano Castellina, Faisal Razzak, Fulvio Corno, "Environmental Control Application Compliant with Cogain Guidelines," The 5th Conference on Communication by Gaze Interaction (COGAIN 2009), 2009.
[2] Jacob O. Wobbrock, James Rubinstein, Michael Sawyer, Andrew T. Duchowski, "Not Typing but Writing: Eye-based Text Entry Using Letter-like Gestures," The 3rd Conference on Communication by Gaze Interaction (COGAIN 2007), 2007.
[3] Howell Istance, Aulikki Hyrskykari, Stephen Vickers, Nazmie Ali, "User Performance of Gaze-Based Interaction with On-line Virtual Communities," The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[4] Stephen Vickers, Howell Istance, Aulikki Hyrskykari, "Selecting Commands in 3D Game Environments by Gaze Gestures," The 5th Conference on Communication by Gaze Interaction (COGAIN 2009), 2009.
[5] 王凱平, "Implementation of a Mobile Eye Tracker and Gaze Trajectory Analysis" (in Chinese), Master's thesis, Department of Computer Science, National Chengchi University, 2008.
[6] Detlev Droege, Carola Schmidt, Dietrich Paulus, "A Comparison of Pupil Centre Estimation," The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[7] Flavio Luiz Coutinho, Carlos Hitoshi Morimoto, "Free Head Motion Eye Gaze Tracking Using a Single Camera and Multiple Light Sources," XIX Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI '06), pp. 171-178, 2006.
[8] Craig Hennessey, Borna Noureddin, Peter Lawrence, "A Single Camera Eye-Gaze Tracking System with Free Head Motion," Proceedings of the 2006 Symposium on Eye Tracking Research & Applications (ETRA '06), pp. 87-94, 2006.
[9] Dongheng Li, David Winfield, Derrick J. Parkhurst, "Starburst: A Hybrid Algorithm for Video-Based Eye Tracking Combining Feature-Based and Model-Based Approaches," 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05) Workshops, p. 79, 2005.
[10] Gintautas Daunys, Nerijus Ramanauskas, "The Accuracy of Eye Tracking Using Image Processing," Proceedings of the Third Nordic Conference on Human-Computer Interaction, pp. 377-380, 2004.
[11] Dongshi Xia, Zongcai Ruan, "IR Image Based Eye Gaze Estimation," Eighth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD 2007), vol. 1, pp. 220-224, 2007.
[12] Sepehr Attarchi, Karim Faez, Amin Asghari, "A Fast and Accurate Iris Recognition Method Using the Complex Inversion Map and 2DPCA," Seventh IEEE/ACIS International Conference on Computer and Information Science (ICIS 2008), pp. 179-184, 2008.
[13] Craig Hennessey, "Eye-Gaze Tracking With Free Head Motion," Master of Applied Science thesis, University of British Columbia, 2005.
[14] Somnath Dey, Debasis Samanta, "An Efficient Approach for Pupil Detection in Iris Images," 15th International Conference on Advanced Computing and Communications (ADCOM 2007), pp. 382-389, 2007.
[15] Christophe Cudel, Sacha Bernet, Michel Basset, "Fast and Easy Calibration for a Head-Mounted Eye Tracker," The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[16] Edward Cutrell, Zhiwei Guan, "An Eye Tracking Study of the Effect of Target Rank on Web Search," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2007.
[17] Petr Novák, Tomáš Krajník, Libor Přeučil, Marcela Fejtová, Olga Štěpánková, "AI Support for a Gaze Controlled Wheelchair," The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[18] Matt Feusner, Brian Lukoff, "Testing for Statistically Significant Differences between Groups of Scan Patterns," Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA '08), pp. 43-46, 2008.
[19] Nguyen Van Huan, Hakil Kim, "A Novel Circle Detection Method for Iris Segmentation," 2008 Congress on Image and Signal Processing (CISP 2008), vol. 3, pp. 620-624, 2008.
[20] Craig Hennessey, Borna Noureddin, Peter Lawrence, "A Single Camera Eye-Gaze Tracking System with Free Head Motion," Proceedings of the 2006 Symposium on Eye Tracking Research & Applications (ETRA '06), pp. 87-94, 2006.
[21] Zhiwei Zhu, Qiang Ji, "Eye Gaze Tracking Under Natural Head Movements," Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), 2005.
[22] Teodora Vatahska, Maren Bennewitz, Sven Behnke, "Feature-Based Head Pose Estimation from Images," Proceedings of the IEEE-RAS 7th International Conference on Humanoid Robots (Humanoids), 2007.
[23] Henrik H.T. Skovsgaard, John Paulin Hansen, Julio C. Mateo, "How Can Tiny Buttons Be Hit Using Gaze Only?" The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[24] Laura Cowen, "An Eye Movement Analysis of Web-Page Usability," unpublished Master's thesis, Lancaster University, UK, 2001.
[25] Robert W. Reeder, Peter Pirolli, Stuart K. Card, "WebEyeMapper and WebLogger: Tools for Analyzing Eye Tracking Data Collected in Web-Use Studies," CHI '01 Extended Abstracts on Human Factors in Computing Systems, pp. 19-20, 2001. |
Description: | Master's thesis, Department of Computer Science, National Chengchi University, 96753025, 97 |
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0096753025 |
Data Type: | thesis |
Appears in Collections: | [Department of Computer Science] Theses
All items in 政大典藏 are protected by copyright, with all rights reserved.