Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/32693
Title: | Automatic Generation of Human Animation for Expressing Music Features (能表達音樂特徵的人體動畫自動產生機制)
Authors: | Loi, Ka Chon (雷嘉駿)
Contributors: | Li, Tsai Yen (李蔡彥); Loi, Ka Chon (雷嘉駿)
Keywords: | human animation; virtual environment; music features
Date: | 2007 |
Issue Date: | 2009-09-17 14:04:22 (UTC+8) |
Abstract: | In recent years, advances in computing power have led to the wide application of 3D virtual environments. This thesis proposes to combine character animation with music so that a virtual character can interpret music in a 3D virtual environment. The proposed system is an intelligent avatar motion generator that gives the character the ability to express music features, so that its motions differ when it "hears" different music. Because human auditory memory is transient, the system automatically extracts music features, segments the input music into several segments, and plans the avatar's motion for each segment independently. In the literature, much music-related animation research composes new animations by modifying or recombining motions from an existing motion database. In this work, we analyze the relationship between music and motion, and use procedural animation to automatically generate varied and appropriate interpretive motions. Experiments show that the system works generally with LOA1 humanoid models and MIDI music as inputs; moreover, by adjusting system parameters, it can generate animations in different styles to suit different user preferences and musical genres.
Description: | Master's thesis. National Chengchi University, Department of Computer Science. Student ID: 95753006; academic year: 96 (2007)
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0095753006 |
Data Type: | thesis |
Appears in Collections: | [Department of Computer Science] Theses
All items in 政大典藏 are protected by copyright, with all rights reserved.