    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/32682


    Title: 分析音樂特徵尋找將情緒引導至正向之音樂
    Finding Music for Leading Personal Emotions to Positive Valence by Analyzing Music Features
    Authors: 史訓綱
    Shih, Hsum Kang
    Contributors: 陳良弼
    史訓綱
    Shih, Hsum Kang
    Keywords: music (音樂)
    emotion (情緒)
    Date: 2006
    Issue Date: 2009-09-17 14:03:07 (UTC+8)
    Abstract: Many past studies have indicated that music can guide human emotions. Music therapy theory holds that music's main psychological effect lies in guiding emotional change. However, earlier experiments and studies on musical emotion and music features mostly aimed to automatically identify the emotion of the music itself and to relate that emotion to the music's features. The goal of this thesis, by contrast, is to find, by analyzing music features, music that guides a person's emotion toward positive valence, and to study the characteristics of such music and the relationships among the pieces.
    To search for emotion-guiding music, we first define music features and use them to simplify and represent each piece, so that pieces inducing the same emotional change also come out as highly similar under this representation when similarity is computed. We first run experiments to determine how each piece affects listeners' emotions. Analyzing the pieces that guide emotion in different directions, we find that when the music features carry order information, the search for music with a similar emotion-guiding effect becomes more accurate.
    Therefore, taking the songs experimentally known to induce a certain emotional change as a baseline, we can predict what emotional change other, unseen music is likely to induce. This could be applied in music therapy, giving therapists an alternative way to select music.
    Many studies have indicated that music plays a guiding role in human emotions. Music therapy theory holds that the psychological impact of music lies mainly in guiding emotional changes. However, most past studies and experiments on the characteristics and features of music focus on identifying the emotion of the music content itself. In this thesis, our goal is to find music that leads a person's emotion toward the positive side by analyzing music features; we also aim to find the characteristics of such music and the relationships among the pieces.
    Before analyzing the music, we define music features and use them to express the content and characteristics of each piece. Under this representation, we assume that songs that guide emotion toward similar changes should be similar to each other. After analyzing the music, we found that if the music features are sequential, the search result is more accurate when looking for songs that guide human emotions toward similar changes. This finding can be applied in music therapy, providing music therapists with an alternative way of choosing music.
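    The abstract's central finding, that order-sensitive (sequential) feature representations match emotion-guiding music more accurately than unordered ones, can be illustrated with a toy sketch. This is not the thesis's actual feature set or algorithm; the per-segment labels ("slow-minor", "fast-major") are hypothetical stand-ins for whatever coarse features a system might extract:

    ```python
    # Toy comparison of order-sensitive vs. order-insensitive song similarity.
    # Each song is reduced to a sequence of coarse feature symbols per segment.
    from collections import Counter

    def edit_distance(a, b):
        """Levenshtein distance between two feature sequences (order-sensitive)."""
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            dp[i][0] = i
        for j in range(n + 1):
            dp[0][j] = j
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                dp[i][j] = min(dp[i - 1][j] + 1,      # delete
                               dp[i][j - 1] + 1,      # insert
                               dp[i - 1][j - 1] + cost)  # substitute
        return dp[m][n]

    def bag_distance(a, b):
        """Order-insensitive distance: size of the multiset symmetric difference."""
        ca, cb = Counter(a), Counter(b)
        return sum(((ca - cb) + (cb - ca)).values())

    # Two songs built from the same features but with opposite emotional arcs,
    # plus a third song whose features genuinely differ.
    song_a = ["slow-minor", "slow-minor", "fast-major", "fast-major"]  # rising arc
    song_b = ["fast-major", "fast-major", "slow-minor", "slow-minor"]  # falling arc
    song_c = ["slow-minor", "slow-minor", "fast-major", "slow-minor"]

    print(bag_distance(song_a, song_b))   # 0 -- identical multisets, arcs indistinguishable
    print(edit_distance(song_a, song_b))  # 4 -- the sequence view separates the arcs
    print(edit_distance(song_a, song_c))  # 1 -- one segment differs
    ```

    The bag-of-features view judges the rising and falling arcs identical, while the sequential view separates them, which is one plausible reading of why order-carrying features improved the search accuracy reported above.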
    Reference: [1] 李侃儒, “個人化情緒/情境音樂檢索系統,” 第五屆數位典藏技術研討會, 2006.
    [2] 陳美如主譯, 篠田知璋、加藤美知子主編, “標準音樂治療入門,” 台北市: 五南圖書公司, 2005.
    [3] 謝文傑, “音樂治療對心理與身心健康的影響,” http://www.psychpark.org/psy/music.asp, 2002.
    [4] 謝俊逢, “音樂療法-理論與方法,” 台北市: 大陸書局, 2003.
    [5] H.C. Chen and A.L.P. Chen, “A music recommendation system based on music data grouping and user interests,” ACM Conference on Information and Knowledge Management, 2001.
    [6] T.H. Cormen, C.E. Leiserson, R.L. Rivest, and C. Stein, “Introduction to Algorithms,” 2nd ed., pp. 350-356, MIT Press, 2003.
    [7] A. Gabrielsson and P.N. Juslin, “Emotional Expression in Music,” in Handbook of Affective Sciences, Oxford University Press, 2003.
    [8] A. Gabrielsson and E. Lindstrom, “The Influence of Musical Structure on Emotional Expression,” in Music and Emotion: Theory and Research, 2001.
    [9] J.S.R. Jang and H.R. Lee, “Hierarchical Filtering Method for Content-based Music Retrieval via Acoustic Input,” ACM Conference on Multimedia, 2001.
    [10] P.N. Juslin, “Cue utilization in communication of emotion in music performance: Relating performance to perception,” Journal of Experimental Psychology, 26, pp. 1797-1813, 2000.
    [11] P.N. Juslin and P. Laukka, “Communication of Emotions in Vocal Expression and Music Performance: Different Channels, Same Code?,” Psychological Bulletin, 129 (5), pp. 770-814, 2003.
    [12] D. Liu, L. Lu, and H.J. Zhang, “Automatic Mood Detection from Acoustic Music Data,” International Symposium on Music Information Retrieval, 2003.
    [13] S.R. Livingstone and A.R. Brown, “Dynamic Response: Real-Time Adaptation for Music Emotion,” Proceedings of the Second Australasian Conference on Interactive Entertainment, 2005.
    [14] N.C. Maddage, C.S. Xu, M.S. Kankanhalli, and X. Shao, “Content-based Music Structure Analysis with Applications to Music Semantics Understanding,” ACM Conference on Multimedia, 2004.
    [15] J.D. Morris and M.A. Boone, “The Effects of Music on Emotional Response, Brand Attitude, and Purchase Intent in an Emotional Advertising Condition,” Attitude Self-Assessment Manikin, adsam.com, 1998.
    [16] G. Nagler, “Guess chords from MIDI binaries,” http://www.gnmidi.com/utils/midchord.zip, 1998/1999.
    [17] National Institute of Advanced Industrial Science and Technology (AIST), “RWC Music Database,” http://staff.aist.go.jp/m.goto/RWC-MDB/, 2002.
    [18] N. Oliver and F. Flores-Mangas, “MPTrain: a mobile, music and physiology-based personal trainer,” ACM Conference on Human-Computer Interaction with Mobile Devices and Services, 2006.
    [19] B. Pardo and W. Birmingham, “Chordal analysis of tonal music,” Technical Report CSE-TR-439-01, Electrical Engineering and Computer Science Department, University of Michigan, 2001.
    [20] J. Pickens and T. Crawford, “Harmonic Models for Polyphonic Music Retrieval,” ACM Conference on Information and Knowledge Management, 2002.
    [21] J. Pickens and C. Iliopoulos, “Markov Random Fields and Maximum Entropy Modeling for Music,” International Symposium on Music Information Retrieval, 2005.
    [22] J. Russell, “A circumplex model of affect,” Journal of Personality and Social Psychology, 39, pp. 1161-1178, 1980.
    [23] E. Schubert, “Measurement and Time Series Analysis of Emotion in Music,” University of New South Wales, 1999.
    [24] S. Sadie and A. Latham, “The Cambridge Music Guide,” Cambridge University Press, 1990.
    [25] M. Steinbach, G. Karypis, and V. Kumar, “A Comparison of Document Clustering Techniques,” Knowledge Discovery in Data Workshop on Text Mining, 2000.
    [26] I.H. Witten and E. Frank, “Data Mining: Practical Machine Learning Tools and Techniques,” 2nd ed., Morgan Kaufmann, 2005.
    [27] Y.W. Zhu and M.S. Kankanhalli, “Music Scale Modeling for Melody Matching,” ACM Conference on Multimedia, 2003.
    Description: Master's thesis
    National Chengchi University (國立政治大學)
    Department of Computer Science (資訊科學學系)
    94753019
    95 (ROC academic year)
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0094753019
    Data Type: thesis
    Appears in Collections: [Department of Computer Science (資訊科學系)] Theses

    Files in This Item:

    File        Size    Format
    301901.pdf  43Kb    Adobe PDF
    301902.pdf  65Kb    Adobe PDF
    301903.pdf  63Kb    Adobe PDF
    301904.pdf  69Kb    Adobe PDF
    301905.pdf  98Kb    Adobe PDF
    301906.pdf  419Kb   Adobe PDF
    301907.pdf  109Kb   Adobe PDF
    301908.pdf  1033Kb  Adobe PDF
    301909.pdf  85Kb    Adobe PDF
    301910.pdf  57Kb    Adobe PDF


    All items in 政大典藏 are protected by copyright, with all rights reserved.



    著作權政策宣告 Copyright Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access for non-commercial uses such as academic research and public education. Please use the content in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

    2. This website was built with every effort to avoid infringing the rights of copyright owners. If you believe that any digital content on this website infringes copyright, please notify the site maintainers (nccur@nccu.edu.tw); they will promptly remove the work and take other remedial measures while your claim is investigated.
    DSpace Software Copyright © 2002-2004 MIT & Hewlett-Packard. Enhanced by NTU Library IR team.