National Chengchi University Institutional Repository (NCCUR): Item 140.119/37108
    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/37108


    Title: 利用機器學習技術找出眼動軌跡與情緒之間的關聯性 (Using machine learning techniques to find the relationship between eye movements and emotions)
    Authors: 潘威翰
    Contributors: 陳良弼
    蔡介立

    潘威翰
    Keywords: Eye-movement trajectory
    Emotion recognition
    Behavior
    Hidden Markov model
    Emotion classification
    Saccade displacement velocity
    Date: 2008
    Issue Date: 2009-09-19 12:10:17 (UTC+8)
    Abstract: Most current approaches to detecting a person's emotions study either human behavior, such as facial expressions, or physiological measurements such as heart rate, body temperature, and respiration rate. These studies, however, only examine how outward behavior or physiological signals change under different emotions. Because the eyes carry both behavioral and physiological information, this study investigates what distinctive responses the eyes show under different emotional states.
    We first designed an experimental procedure in which participants were stimulated with pictures of different emotional content, their eye-movement responses were recorded, and they reported their own emotional state. The recorded eye-movement responses were converted into sequence data, and a Hidden Markov Model (HMM) was built from the sequences of each emotion. With these emotion models, we aim to detect, from eye-movement behavior alone, which emotional state a stimulated viewer is in.
    We found that, when viewing pictures, people produce meaningful eye-movement responses that depend on how much they like or dislike the picture content. Using these responses, we built an emotion recognition system whose recognition rate reaches 60% when distinguishing three emotions.
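    As an illustration of the classification scheme sketched in the abstract (not the thesis's own code), the following Python example treats eye-movement responses as integer symbol sequences, trains one discrete-emission HMM per emotion, and labels a new sequence with the emotion whose model assigns it the highest log-likelihood. The hmmlearn dependency (CategoricalHMM, version 0.3 or later), the three-state models, the eight-symbol alphabet, the emotion labels, and the toy data are all illustrative assumptions; the thesis's actual features and parameters may differ.

```python
# Minimal sketch of per-emotion HMM training and maximum-likelihood classification.
# Assumptions: hmmlearn >= 0.3 (CategoricalHMM), 3 hidden states, 8 observation
# symbols, and randomly generated toy sequences in place of real eye-movement data.
import numpy as np
from hmmlearn import hmm


def train_emotion_models(sequences_by_emotion, n_states=3):
    """Fit one discrete-emission HMM per emotion.

    sequences_by_emotion maps an emotion label to a list of integer-coded
    eye-movement sequences (hypothetical encoding of fixation/saccade features).
    """
    models = {}
    for emotion, seqs in sequences_by_emotion.items():
        X = np.concatenate(seqs).reshape(-1, 1)   # hmmlearn expects stacked observations
        lengths = [len(s) for s in seqs]          # boundaries of the individual sequences
        m = hmm.CategoricalHMM(n_components=n_states, n_iter=100, random_state=0)
        m.fit(X, lengths)
        models[emotion] = m
    return models


def classify(models, sequence):
    """Return the emotion whose HMM gives the new sequence the highest log-likelihood."""
    obs = np.asarray(sequence).reshape(-1, 1)
    return max(models, key=lambda e: models[e].score(obs))


if __name__ == "__main__":
    # Toy data standing in for discretized eye-movement responses (symbols 0-7).
    rng = np.random.default_rng(0)
    data = {e: [rng.integers(0, 8, size=30) for _ in range(12)]
            for e in ("positive", "neutral", "negative")}
    models = train_emotion_models(data)
    print(classify(models, rng.integers(0, 8, size=30)))
```

    In this setup the per-emotion models simply compete on likelihood, which mirrors the abstract's idea of inferring the viewer's emotional state from eye-movement behavior alone.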
    Reference:
    1. Cohn, J.F., "Foundations of Human Computing: Facial Expression and Emotion," Proceedings of the 8th International Conference on Multimodal Interfaces, 1-16 (2006).
    2. Dietterich, T.G., "Machine Learning for Sequential Data: A Review," in Structural, Syntactic, and Statistical Pattern Recognition, Lecture Notes in Computer Science, Vol. 2396, New York: Springer-Verlag, 15-30 (2002).
    3. Ekman, P. and Friesen, W.V., "Facial Action Coding System," Consulting Psychologists Press (1978).
    4. Feldman, B.L., "Discrete Emotions or Dimensions? The Role of Valence Focus and Arousal Focus," Cognition and Emotion, Vol. 12(4), 579-599 (1998).
    5. Feldman, B.L. and Russell, J.A., "The Structure of Current Affect: Controversies and Emerging Consensus," Psychological Science, Vol. 8(1), 10-14 (1999).
    6. Jones, C.M. and Troen, T., "Biometric Valence and Arousal Recognition," Proceedings of the Conference of the Computer-Human Interaction Special Interest Group (CHISIG) of Australia on Computer-Human Interaction: Design: Activities, Artifacts and Environments, 191-194 (2007).
    7. Kim, K.H., Bang, S.W. and Kim, S.R., "Emotion Recognition System Using Short-Term Monitoring of Physiological Signals," Medical and Biological Engineering and Computing, Vol. 42(3), Berlin/Heidelberg: Springer, 419-427 (2004).
    8. Lang, P.J., Bradley, M.M. and Cuthbert, B.N., "International Affective Picture System (IAPS): Technical Manual and Affective Ratings," Center for Research in Psychophysiology, University of Florida (1999).
    9. Nasoz, F., Lisetti, C.L., Alvarez, K. and Finkelstein, N., "Emotion Recognition from Physiological Signals for User Modelling of Affect," Proceedings of the 3rd Workshop on Affective and Attitude User Modelling, Pittsburgh, PA (2003).
    10. Nummenmaa, L., Hyona, J. and Calvo, M.G., "Eye Movement Assessment of Selective Attentional Capture by Emotional Pictures," Emotion, Vol. 6(2), Washington, D.C.: American Psychological Association, 257-268 (2006).
    11. Nummenmaa, L., Hyona, J. and Calvo, M.G. (in press), "Do Emotional Scenes Catch the Eye?" in K. Rayner, J. Shen, X. Bai, and G. Yan (Eds.), Cognitive and Cultural Influences on Eye Movements, Tianjin: People's Press.
    12. Partala, T., Jokiniemi, M. and Surakka, V., "Pupillary Responses to Emotionally Provocative Stimuli," Proceedings of the Symposium on Eye Tracking Research & Applications, 123-129 (2000).
    13. Rabiner, L.R., "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," Proceedings of the IEEE, Vol. 77(2), 257-286 (1989).
    14. Vyzas, E., Picard, R.W. and Healey, J., "Toward Machine Emotional Intelligence: Analysis of Affective Physiological State," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23(10) (2001).
    15. Zagalo, N., Barker, A. and Branco, V., "Story Reaction Structures to Emotion Detection," Proceedings of the 1st ACM Workshop on Story Representation, Mechanism and Context, 33-38 (2004).
    Description: Master's thesis
    National Chengchi University
    Department of Computer Science
    95753020
    97
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0095753020
    Data Type: thesis
    Appears in Collections: [Department of Computer Science] Theses

    Files in This Item:

    File          Description    Size      Format
    302001.pdf                   96Kb      Adobe PDF    2993     View/Open
    302002.pdf                   311Kb     Adobe PDF    2998     View/Open
    302003.pdf                   299Kb     Adobe PDF    2909     View/Open
    302004.pdf                   332Kb     Adobe PDF    21250    View/Open
    302005.pdf                   302Kb     Adobe PDF    21020    View/Open
    302006.pdf                   548Kb     Adobe PDF    25632    View/Open
    302007.pdf                   1073Kb    Adobe PDF    22530    View/Open
    302008.pdf                   414Kb     Adobe PDF    21179    View/Open
    302009.pdf                   719Kb     Adobe PDF    23328    View/Open
    302010.pdf                   314Kb     Adobe PDF    21070    View/Open
    302011.pdf                   300Kb     Adobe PDF    2922     View/Open

