政大機構典藏 - National Chengchi University Institutional Repository (NCCUR): Item 140.119/37102


Please use this permanent URL to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/37102


Title: 以參數化程序產生具情緒表達能力之3D肢體動畫
Designing Parameterized Procedures for Real-Time 3D Figure Animation with Affective Expression
Author: Lin, Yueh Hung (林岳黌)
Contributors: Li, Tsai Yen (李蔡彥)
Lin, Yueh Hung (林岳黌)
Keywords: procedural animation
humanoid body animation
style animation
emotional animation
affective expressiveness
Date: 2008
Uploaded: 2009-09-19 12:09:34 (UTC+8)
Abstract: Humans and human-like creatures are the main subjects of computer animation. In addition to facial expressions, body gestures and motions are indispensable components of realistic character animation. The goal of this research is to create emotional character animation with computer procedures. This goal comprises two subgoals: first, to design parameterized animation procedures for human body motions, improving the generality and reusability of keyframe-based character animation and reducing its production cost; second, to incorporate style into the procedural animation and to validate the relation between motion and emotion with psychology experiments. We first applied different styles to the walking motion and conducted experiments to verify that participants agree with the way we manipulate the style parameters. We then conducted experiments on the walking motion to find a mapping from emotion parameters to style parameters, and applied this mapping to emotion-specific motions to show that it enhances the emotions perceived by viewers.
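The abstract describes two technical ingredients: a walk cycle produced by a parameterized procedure that emits keyframes, and a mapping from emotion labels to style parameters. The sketch below illustrates the general idea in Python; it is a minimal illustration only, and the class, the joint names, and every numeric value are invented for this sketch rather than taken from the thesis's actual procedures or experimentally derived parameter values.

    import math
    from dataclasses import dataclass

    @dataclass
    class StyleParams:
        """Hypothetical style parameters for a walk cycle."""
        speed: float = 1.0      # walk cycles per second
        stride: float = 0.5     # hip swing amplitude (radians)
        arm_swing: float = 0.4  # shoulder swing amplitude (radians)
        posture: float = 0.0    # torso lean; positive slumps forward (radians)

    # Illustrative emotion-to-style mapping; the thesis derives such a
    # mapping from psychology experiments, whereas these values are invented.
    EMOTION_TO_STYLE = {
        "happy":   StyleParams(speed=1.3, stride=0.6, arm_swing=0.6, posture=-0.05),
        "sad":     StyleParams(speed=0.7, stride=0.3, arm_swing=0.2, posture=0.15),
        "neutral": StyleParams(),
    }

    def walk_keyframes(style: StyleParams, n_frames: int = 30):
        """Generate one walk cycle as a list of keyframes.

        Each keyframe maps a joint name to a rotation angle in radians.
        A real procedure would drive a full articulated skeleton; only a
        few joints are animated here to show the parameterization."""
        frames = []
        for i in range(n_frames):
            phase = 2 * math.pi * i / n_frames
            frames.append({
                "time":        i / (style.speed * n_frames),
                "l_hip":       style.stride * math.sin(phase),
                "r_hip":      -style.stride * math.sin(phase),
                "l_shoulder": -style.arm_swing * math.sin(phase),
                "r_shoulder":  style.arm_swing * math.sin(phase),
                "torso":       style.posture,
            })
        return frames

    # Usage: generate a walk cycle styled by the "sad" emotion parameters.
    sad_walk = walk_keyframes(EMOTION_TO_STYLE["sad"])

Keyframes produced this way are interpolated at playback time; because only StyleParams varies between styles, one procedure is reused across all styles and emotions, which is the reusability and cost argument the abstract makes.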
Description: 碩士 (Master's thesis)
National Chengchi University
Department of Computer Science (資訊科學學系)
93753039
97
Source: http://thesis.lib.nccu.edu.tw/record/#G0093753039
Data type: thesis
Appears in collections: [Department of Computer Science] Theses

Files in this item:

File           Size      Format      Views
75303901.pdf   95 KB     Adobe PDF   2875
75303902.pdf   117 KB    Adobe PDF   2798
75303903.pdf   109 KB    Adobe PDF   2834
75303904.pdf   289 KB    Adobe PDF   2770
75303905.pdf   440 KB    Adobe PDF   21128
75303906.pdf   342 KB    Adobe PDF   21006
75303907.pdf   1716 KB   Adobe PDF   21401
75303908.pdf   1994 KB   Adobe PDF   21188
75303909.pdf   2622 KB   Adobe PDF   21008
75303910.pdf   1457 KB   Adobe PDF   21009
75303911.pdf   268 KB    Adobe PDF   2808
75303912.pdf   312 KB    Adobe PDF   2922
75303913.pdf   400 KB    Adobe PDF   2998


All items in NCCUR are protected by copyright, with all rights reserved.



Copyright Announcement
1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access for academic research and public education on a non-commercial basis. Please use the content in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.
2. This website endeavors to avoid infringing the rights of copyright owners. If you believe that any material in the repository infringes copyright, please notify our staff (nccur@nccu.edu.tw); the work will be removed from the repository immediately while the claim is investigated.