    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/32703


    Title: 利用生理感測資料之線上情緒辨識系統
    On-line Emotion Recognition System by Physiological Signals
    Authors: 陳建家
    Chen, Jian Jia
    Contributors: 蔡子傑
    Tsai, Tzu Chieh
    陳建家
    Chen, Jian Jia
    Keywords: 情緒
    生理感測器
    演算法
    即時辨識
    智慧型生活環境
    emotion
    physiological sensors
    algorithm
    on-line recognition
    smart environment
    Date: 2008
    Issue Date: 2009-09-17 14:05:43 (UTC+8)
    Abstract: 貼心的智慧型生活環境,必須能在不同的情緒狀態提供適當服務,因此我們希望能開發出一個情緒辨識系統,透過對於形於外的生理感測資料的變化來觀察形於內的情緒狀態。
    首先我們採用國際情緒圖庫系統(IAPS: International Affective Picture System) 及維度式分析方法,透過心理實驗的操弄,收集了20位的受測者生理數值與主觀評定情緒的強度與正負向。我們提出了一個情緒辨識學習演算法,經由交叉驗證訓練出每個情緒的特徵,並藉由即時測試資料來修正情緒特徵的個人化,經由學習趨勢的評估,準確率有明顯提升。其次,我們更進一步引用了維度式與類別式情緒的轉換概念來驗證受測者主觀評定的結果。相較於相關研究實驗結果,我們在維度式上的強度與正負向辨識率有較高的表現,在類別式上的驗證我們也達到明顯區分效果。
    更重要的是,我們所實作出的系統,是搭載了無線生理感測器,使用時更具行動性,而且可即時反映情緒,提供線上智慧型服務。
    A smart living environment should be able to provide thoughtful services tailored to its users' emotional states. The goal of our research is to develop an emotion recognition system that can infer internal emotional states from variations in externally measurable physiological data.
    First, we adopted the dimensional analysis approach and used IAPS (International Affective Picture System) stimuli to design controlled psychological experiments. We collected physiological data and subjective arousal and valence ratings from 20 subjects. We then proposed an emotion recognition learning algorithm: it extracts a pattern for each emotion through cross-validation training, and further personalizes those patterns adaptively by feeding in real-time test data. Evaluating each subject's learning trend shows that the recognition rate improves steadily. Furthermore, we applied a dimensional-to-discrete emotion transformation to validate the subjective ratings. Compared with the results reported in related work, our system achieves higher arousal and valence recognition rates in the dimensional analysis, and a clear separation between categories in the discrete analysis.
    Most importantly, the system is implemented on wireless physiological sensors, so it remains mobile in use, can reflect emotional states in real time, and can thus support on-line smart services.
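    The dimensional-to-discrete transformation mentioned above can be illustrated with a minimal sketch. This assumes 9-point SAM-style ratings (1–9), as commonly used with IAPS, and a simple quadrant partition of the valence-arousal plane around the scale midpoint; the labels, threshold, and function name below are illustrative assumptions, not the thesis's actual mapping.

    ```python
    # Hypothetical quadrant mapping from dimensional ratings to discrete
    # emotion labels. Assumes valence/arousal on a 1-9 scale with 5.0 as
    # the neutral midpoint; these choices are illustrative only.

    def to_discrete(valence: float, arousal: float, mid: float = 5.0) -> str:
        """Map a (valence, arousal) rating to a quadrant-style discrete label."""
        if valence >= mid and arousal >= mid:
            return "joy"          # pleasant, activated
        if valence < mid and arousal >= mid:
            return "anger/fear"   # unpleasant, activated
        if valence < mid and arousal < mid:
            return "sadness"      # unpleasant, deactivated
        return "contentment"      # pleasant, deactivated
    ```

    For example, a subjective rating of valence 7.2 and arousal 6.8 would fall in the pleasant-activated quadrant and map to "joy" under this sketch.
    
    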
    Reference: [1] J. LeDoux, The Emotional Brain. New York: Simon & Schuster, 1996.
    [2] P. Salovey and J. D. Mayer, "Emotional intelligence," Imagination, Cognition and Personality, vol. 9, no. 3, pp. 185-211, 1990.
    [3] D. Goleman, Emotional Intelligence. New York: Bantam Books, 1995.
    [4] K. R. Scherer, "Speech and emotional states," in J. K. Darby (Ed.), Speech Evaluation in Psychiatry, pp. 189-220. Grune and Stratton, 1981.
    [5] Y. Yacoob and L. S. Davis, "Recognizing human facial expressions from long image sequences using optical flow," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 6, pp. 636-642, June 1996.
    [6] I. A. Essa and A. Pentland, "Facial expression recognition using a dynamic model and motion energy," in Proceedings of the International Conference on Computer Vision, pp. 360-367, Cambridge, MA, 1995. IEEE Computer Society.
    [7] R. W. Picard, Affective Computing. Cambridge, MA: The MIT Press, 1997.
    [8] C. M. Jones and T. Troen, "Biometric valence and arousal recognition," in Proceedings of the 2007 Conference of the Computer-Human Interaction Special Interest Group (CHISIG) of Australia, ACM International Conference Proceeding Series, vol. 251, 2007.
    [9] C. Cortes and V. Vapnik, "Support-vector networks," Machine Learning, vol. 20, pp. 273-297, 1995.
    [10] H. C. Peng, F. Long, and C. Ding, "Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 8, pp. 1226-1238, 2005.
    [11] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis. Wiley-Interscience, 1973.
    [12] K. H. Kim, S. W. Bang, and S. R. Kim, "Emotion recognition system using short-term monitoring of physiological signals," Medical and Biological Engineering and Computing, vol. 42, 2004.
    [13] P. J. Lang, M. M. Bradley, and B. N. Cuthbert, International Affective Picture System (IAPS): Technical Manual and Affective Ratings. Center for Research in Psychophysiology, University of Florida, 1999.
    [14] R. Kohavi, "A study of cross-validation and bootstrap for accuracy estimation and model selection," in Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence, vol. 2, pp. 1137-1143. Morgan Kaufmann, San Mateo, 1995.
    [15] J. Chang, Y. Luo, and K. Su, "GPSM: a generalized probabilistic semantic model for ambiguity resolution," in Proceedings of the 30th Annual Meeting of the Association for Computational Linguistics (Newark, Delaware, June 28 - July 2, 1992), pp. 177-184. Association for Computational Linguistics, Morristown, NJ, 1992.
    [16] P. A. Devijver and J. Kittler, Pattern Recognition: A Statistical Approach. London: Prentice-Hall, 1982.
    [17] 徐世平, "Application of Physiological Signal Monitoring in Smart Living Space," Master's thesis, June 2008.
    Description: 碩士 (Master's thesis)
    國立政治大學 (National Chengchi University)
    資訊科學學系 (Department of Computer Science)
    95753038
    97
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0095753038
    Data Type: thesis
    Appears in Collections:[資訊科學系] 學位論文

    Files in This Item:

    File        Size    Format     Views
    303801.pdf  96Kb    Adobe PDF  2931
    303802.pdf  62Kb    Adobe PDF  2908
    303803.pdf  116Kb   Adobe PDF  2995
    303804.pdf  122Kb   Adobe PDF  2926
    303805.pdf  314Kb   Adobe PDF  2979
    303806.pdf  522Kb   Adobe PDF  21060
    303807.pdf  767Kb   Adobe PDF  21020
    303808.pdf  275Kb   Adobe PDF  2947
    303809.pdf  368Kb   Adobe PDF  2913
    303810.pdf  62Kb    Adobe PDF  2792
    303811.pdf  84Kb    Adobe PDF  2853

