Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/32682
Title: | Finding Music for Leading Personal Emotions to Positive Valence by Analyzing Music Features (分析音樂特徵尋找將情緒引導至正向之音樂)
Authors: | 史訓綱 Shih, Hsum Kang |
Contributors: | 陳良弼 史訓綱 Shih, Hsum Kang |
Keywords: | Music; Emotion
Date: | 2006 |
Issue Date: | 2009-09-17 14:03:07 (UTC+8) |
Abstract: | Many studies have indicated that music plays a guiding role in human emotions. The theory of music therapy holds that the psychological impact of music lies mainly in guiding emotional change. However, most past studies and experiments on music emotion and music features focused on automatically identifying the emotion of the music content itself, and on the relationships between those emotions and the features of the music. The goal of this thesis is instead to find, by analyzing music features, music that leads a person's emotions toward the positive side, and to study the characteristics of such music and the relationships among the pieces. To search for music that guides emotion, we first define a set of music features and use them to simplify and represent each piece, so that pieces which induce similar emotional changes also come out as similar under a feature-based similarity computation. We first conduct an experiment to determine how each piece of music affects listeners' emotional changes. Analyzing the pieces that guide emotions in different directions, we find that when the music features are sequential (order-preserving), the search for music that induces similar emotional changes becomes more accurate. Therefore, taking the songs that the experiment has shown to induce a particular emotional change as a baseline, we can predict what emotional change other, unseen music is likely to induce. This finding can be applied in music therapy to provide music therapists an alternative way of choosing music.
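The abstract's key finding — that order-preserving (sequential) feature representations retrieve emotion-guiding music more accurately than unordered ones — can be illustrated with a small sketch. The feature alphabet (chord symbols), the unordered measure (cosine over feature counts), and the sequential measure (normalized edit distance) are all assumptions for illustration; the thesis abstract does not specify which features or similarity functions were used.

```python
# Hypothetical sketch: an unordered (bag-of-features) similarity versus an
# order-sensitive (edit-distance) similarity over music feature sequences.
from collections import Counter
import math

def cosine_similarity(seq_a, seq_b):
    """Unordered similarity: cosine over feature-symbol counts."""
    ca, cb = Counter(seq_a), Counter(seq_b)
    dot = sum(ca[k] * cb[k] for k in ca)
    norm = (math.sqrt(sum(v * v for v in ca.values()))
            * math.sqrt(sum(v * v for v in cb.values())))
    return dot / norm if norm else 0.0

def edit_similarity(seq_a, seq_b):
    """Order-sensitive similarity: 1 minus normalized Levenshtein distance."""
    m, n = len(seq_a), len(seq_b)
    if max(m, n) == 0:
        return 1.0
    dp = list(range(n + 1))          # distance row for the empty prefix
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i       # prev holds the diagonal cell
        for j in range(1, n + 1):
            cur = dp[j]
            cost = 0 if seq_a[i - 1] == seq_b[j - 1] else 1
            dp[j] = min(dp[j] + 1,       # deletion
                        dp[j - 1] + 1,   # insertion
                        prev + cost)     # match / substitution
            prev = cur
    return 1.0 - dp[n] / max(m, n)

# Two songs built from the same chords in a different order: the unordered
# view considers them identical, the sequential view does not.
song_a = ["C", "Am", "F", "G", "C", "Am", "F", "G"]
song_b = ["G", "F", "Am", "C", "G", "F", "Am", "C"]
print(cosine_similarity(song_a, song_b))  # 1.0 (same symbol counts)
print(edit_similarity(song_a, song_b))    # < 1.0 (order differs)
```

The gap between the two scores is one way to see why order-sensitive features can separate songs that an unordered representation would conflate.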
Description: | Master's thesis, National Chengchi University, Department of Computer Science, 94753019, 95
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0094753019 |
Data Type: | thesis |
Appears in Collections: | [Department of Computer Science] Theses
All items in 政大典藏 are protected by copyright, with all rights reserved.